Oct  8 10:18:18 np0005476733 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  8 10:18:18 np0005476733 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  8 10:18:18 np0005476733 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 10:18:18 np0005476733 kernel: BIOS-provided physical RAM map:
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  8 10:18:18 np0005476733 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Oct  8 10:18:18 np0005476733 kernel: NX (Execute Disable) protection: active
Oct  8 10:18:18 np0005476733 kernel: APIC: Static calls initialized
Oct  8 10:18:18 np0005476733 kernel: SMBIOS 2.8 present.
Oct  8 10:18:18 np0005476733 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  8 10:18:18 np0005476733 kernel: Hypervisor detected: KVM
Oct  8 10:18:18 np0005476733 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  8 10:18:18 np0005476733 kernel: kvm-clock: using sched offset of 6858365903 cycles
Oct  8 10:18:18 np0005476733 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  8 10:18:18 np0005476733 kernel: tsc: Detected 2800.000 MHz processor
Oct  8 10:18:18 np0005476733 kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Oct  8 10:18:18 np0005476733 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  8 10:18:18 np0005476733 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  8 10:18:18 np0005476733 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  8 10:18:18 np0005476733 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  8 10:18:18 np0005476733 kernel: Using GB pages for direct mapping
Oct  8 10:18:18 np0005476733 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  8 10:18:18 np0005476733 kernel: ACPI: Early table checksum verification disabled
Oct  8 10:18:18 np0005476733 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  8 10:18:18 np0005476733 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 10:18:18 np0005476733 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 10:18:18 np0005476733 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 10:18:18 np0005476733 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  8 10:18:18 np0005476733 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 10:18:18 np0005476733 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  8 10:18:18 np0005476733 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  8 10:18:18 np0005476733 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  8 10:18:18 np0005476733 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  8 10:18:18 np0005476733 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  8 10:18:18 np0005476733 kernel: No NUMA configuration found
Oct  8 10:18:18 np0005476733 kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Oct  8 10:18:18 np0005476733 kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Oct  8 10:18:18 np0005476733 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  8 10:18:18 np0005476733 kernel: Zone ranges:
Oct  8 10:18:18 np0005476733 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  8 10:18:18 np0005476733 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  8 10:18:18 np0005476733 kernel:  Normal   [mem 0x0000000100000000-0x000000043fffffff]
Oct  8 10:18:18 np0005476733 kernel:  Device   empty
Oct  8 10:18:18 np0005476733 kernel: Movable zone start for each node
Oct  8 10:18:18 np0005476733 kernel: Early memory node ranges
Oct  8 10:18:18 np0005476733 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  8 10:18:18 np0005476733 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  8 10:18:18 np0005476733 kernel:  node   0: [mem 0x0000000100000000-0x000000043fffffff]
Oct  8 10:18:18 np0005476733 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Oct  8 10:18:18 np0005476733 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  8 10:18:18 np0005476733 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  8 10:18:18 np0005476733 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  8 10:18:18 np0005476733 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  8 10:18:18 np0005476733 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  8 10:18:18 np0005476733 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  8 10:18:18 np0005476733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  8 10:18:18 np0005476733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  8 10:18:18 np0005476733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  8 10:18:18 np0005476733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  8 10:18:18 np0005476733 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  8 10:18:18 np0005476733 kernel: TSC deadline timer available
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Max. logical packages:   8
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Max. logical dies:       8
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Max. dies per package:   1
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Max. threads per core:   1
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Num. cores per package:     1
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Num. threads per package:   1
Oct  8 10:18:18 np0005476733 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  8 10:18:18 np0005476733 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  8 10:18:18 np0005476733 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  8 10:18:18 np0005476733 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  8 10:18:18 np0005476733 kernel: Booting paravirtualized kernel on KVM
Oct  8 10:18:18 np0005476733 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  8 10:18:18 np0005476733 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  8 10:18:18 np0005476733 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  8 10:18:18 np0005476733 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  8 10:18:18 np0005476733 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 10:18:18 np0005476733 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  8 10:18:18 np0005476733 kernel: random: crng init done
Oct  8 10:18:18 np0005476733 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: Fallback order for Node 0: 0 
Oct  8 10:18:18 np0005476733 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Oct  8 10:18:18 np0005476733 kernel: Policy zone: Normal
Oct  8 10:18:18 np0005476733 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  8 10:18:18 np0005476733 kernel: software IO TLB: area num 8.
Oct  8 10:18:18 np0005476733 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  8 10:18:18 np0005476733 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  8 10:18:18 np0005476733 kernel: ftrace: allocated 193 pages with 3 groups
Oct  8 10:18:18 np0005476733 kernel: Dynamic Preempt: voluntary
Oct  8 10:18:18 np0005476733 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  8 10:18:18 np0005476733 kernel: rcu: 	RCU event tracing is enabled.
Oct  8 10:18:18 np0005476733 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  8 10:18:18 np0005476733 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  8 10:18:18 np0005476733 kernel: 	Rude variant of Tasks RCU enabled.
Oct  8 10:18:18 np0005476733 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  8 10:18:18 np0005476733 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  8 10:18:18 np0005476733 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  8 10:18:18 np0005476733 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 10:18:18 np0005476733 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 10:18:18 np0005476733 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 10:18:18 np0005476733 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  8 10:18:18 np0005476733 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  8 10:18:18 np0005476733 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  8 10:18:18 np0005476733 kernel: Console: colour VGA+ 80x25
Oct  8 10:18:18 np0005476733 kernel: printk: console [ttyS0] enabled
Oct  8 10:18:18 np0005476733 kernel: ACPI: Core revision 20230331
Oct  8 10:18:18 np0005476733 kernel: APIC: Switch to symmetric I/O mode setup
Oct  8 10:18:18 np0005476733 kernel: x2apic enabled
Oct  8 10:18:18 np0005476733 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  8 10:18:18 np0005476733 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  8 10:18:18 np0005476733 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  8 10:18:18 np0005476733 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  8 10:18:18 np0005476733 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  8 10:18:18 np0005476733 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  8 10:18:18 np0005476733 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  8 10:18:18 np0005476733 kernel: Spectre V2 : Mitigation: Retpolines
Oct  8 10:18:18 np0005476733 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  8 10:18:18 np0005476733 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  8 10:18:18 np0005476733 kernel: RETBleed: Mitigation: untrained return thunk
Oct  8 10:18:18 np0005476733 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  8 10:18:18 np0005476733 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  8 10:18:18 np0005476733 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  8 10:18:18 np0005476733 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  8 10:18:18 np0005476733 kernel: x86/bugs: return thunk changed
Oct  8 10:18:18 np0005476733 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  8 10:18:18 np0005476733 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  8 10:18:18 np0005476733 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  8 10:18:18 np0005476733 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  8 10:18:18 np0005476733 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  8 10:18:18 np0005476733 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  8 10:18:18 np0005476733 kernel: Freeing SMP alternatives memory: 40K
Oct  8 10:18:18 np0005476733 kernel: pid_max: default: 32768 minimum: 301
Oct  8 10:18:18 np0005476733 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  8 10:18:18 np0005476733 kernel: landlock: Up and running.
Oct  8 10:18:18 np0005476733 kernel: Yama: becoming mindful.
Oct  8 10:18:18 np0005476733 kernel: SELinux:  Initializing.
Oct  8 10:18:18 np0005476733 kernel: LSM support for eBPF active
Oct  8 10:18:18 np0005476733 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  8 10:18:18 np0005476733 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  8 10:18:18 np0005476733 kernel: ... version:                0
Oct  8 10:18:18 np0005476733 kernel: ... bit width:              48
Oct  8 10:18:18 np0005476733 kernel: ... generic registers:      6
Oct  8 10:18:18 np0005476733 kernel: ... value mask:             0000ffffffffffff
Oct  8 10:18:18 np0005476733 kernel: ... max period:             00007fffffffffff
Oct  8 10:18:18 np0005476733 kernel: ... fixed-purpose events:   0
Oct  8 10:18:18 np0005476733 kernel: ... event mask:             000000000000003f
Oct  8 10:18:18 np0005476733 kernel: signal: max sigframe size: 1776
Oct  8 10:18:18 np0005476733 kernel: rcu: Hierarchical SRCU implementation.
Oct  8 10:18:18 np0005476733 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  8 10:18:18 np0005476733 kernel: smp: Bringing up secondary CPUs ...
Oct  8 10:18:18 np0005476733 kernel: smpboot: x86: Booting SMP configuration:
Oct  8 10:18:18 np0005476733 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  8 10:18:18 np0005476733 kernel: smp: Brought up 1 node, 8 CPUs
Oct  8 10:18:18 np0005476733 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  8 10:18:18 np0005476733 kernel: node 0 deferred pages initialised in 46ms
Oct  8 10:18:18 np0005476733 kernel: Memory: 16010288K/16776676K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 759876K reserved, 0K cma-reserved)
Oct  8 10:18:18 np0005476733 kernel: devtmpfs: initialized
Oct  8 10:18:18 np0005476733 kernel: x86/mm: Memory block size: 128MB
Oct  8 10:18:18 np0005476733 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  8 10:18:18 np0005476733 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: pinctrl core: initialized pinctrl subsystem
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  8 10:18:18 np0005476733 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Oct  8 10:18:18 np0005476733 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  8 10:18:18 np0005476733 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  8 10:18:18 np0005476733 kernel: audit: initializing netlink subsys (disabled)
Oct  8 10:18:18 np0005476733 kernel: audit: type=2000 audit(1759933096.168:1): state=initialized audit_enabled=0 res=1
Oct  8 10:18:18 np0005476733 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  8 10:18:18 np0005476733 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  8 10:18:18 np0005476733 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  8 10:18:18 np0005476733 kernel: cpuidle: using governor menu
Oct  8 10:18:18 np0005476733 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  8 10:18:18 np0005476733 kernel: PCI: Using configuration type 1 for base access
Oct  8 10:18:18 np0005476733 kernel: PCI: Using configuration type 1 for extended access
Oct  8 10:18:18 np0005476733 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  8 10:18:18 np0005476733 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  8 10:18:18 np0005476733 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  8 10:18:18 np0005476733 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  8 10:18:18 np0005476733 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  8 10:18:18 np0005476733 kernel: Demotion targets for Node 0: null
Oct  8 10:18:18 np0005476733 kernel: cryptd: max_cpu_qlen set to 1000
Oct  8 10:18:18 np0005476733 kernel: ACPI: Added _OSI(Module Device)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Added _OSI(Processor Device)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  8 10:18:18 np0005476733 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  8 10:18:18 np0005476733 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  8 10:18:18 np0005476733 kernel: ACPI: Interpreter enabled
Oct  8 10:18:18 np0005476733 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  8 10:18:18 np0005476733 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  8 10:18:18 np0005476733 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  8 10:18:18 np0005476733 kernel: PCI: Using E820 reservations for host bridge windows
Oct  8 10:18:18 np0005476733 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  8 10:18:18 np0005476733 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [3] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [4] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [5] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [6] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [7] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [8] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [9] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [10] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [11] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [12] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [13] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [14] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [15] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [16] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [17] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [18] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [19] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [20] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [21] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [22] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [23] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [24] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [25] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [26] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [27] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [28] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [29] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [30] registered
Oct  8 10:18:18 np0005476733 kernel: acpiphp: Slot [31] registered
Oct  8 10:18:18 np0005476733 kernel: PCI host bridge to bus 0000:00
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  8 10:18:18 np0005476733 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  8 10:18:18 np0005476733 kernel: iommu: Default domain type: Translated
Oct  8 10:18:18 np0005476733 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  8 10:18:18 np0005476733 kernel: SCSI subsystem initialized
Oct  8 10:18:18 np0005476733 kernel: ACPI: bus type USB registered
Oct  8 10:18:18 np0005476733 kernel: usbcore: registered new interface driver usbfs
Oct  8 10:18:18 np0005476733 kernel: usbcore: registered new interface driver hub
Oct  8 10:18:18 np0005476733 kernel: usbcore: registered new device driver usb
Oct  8 10:18:18 np0005476733 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  8 10:18:18 np0005476733 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  8 10:18:18 np0005476733 kernel: PTP clock support registered
Oct  8 10:18:18 np0005476733 kernel: EDAC MC: Ver: 3.0.0
Oct  8 10:18:18 np0005476733 kernel: NetLabel: Initializing
Oct  8 10:18:18 np0005476733 kernel: NetLabel:  domain hash size = 128
Oct  8 10:18:18 np0005476733 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  8 10:18:18 np0005476733 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  8 10:18:18 np0005476733 kernel: PCI: Using ACPI for IRQ routing
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  8 10:18:18 np0005476733 kernel: vgaarb: loaded
Oct  8 10:18:18 np0005476733 kernel: clocksource: Switched to clocksource kvm-clock
Oct  8 10:18:18 np0005476733 kernel: VFS: Disk quotas dquot_6.6.0
Oct  8 10:18:18 np0005476733 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  8 10:18:18 np0005476733 kernel: pnp: PnP ACPI init
Oct  8 10:18:18 np0005476733 kernel: pnp: PnP ACPI: found 5 devices
Oct  8 10:18:18 np0005476733 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_INET protocol family
Oct  8 10:18:18 np0005476733 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: TCP: Hash tables configured (established 131072 bind 65536)
Oct  8 10:18:18 np0005476733 kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_XDP protocol family
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  8 10:18:18 np0005476733 kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  8 10:18:18 np0005476733 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  8 10:18:18 np0005476733 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 73375 usecs
Oct  8 10:18:18 np0005476733 kernel: PCI: CLS 0 bytes, default 64
Oct  8 10:18:18 np0005476733 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  8 10:18:18 np0005476733 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  8 10:18:18 np0005476733 kernel: ACPI: bus type thunderbolt registered
Oct  8 10:18:18 np0005476733 kernel: Trying to unpack rootfs image as initramfs...
Oct  8 10:18:18 np0005476733 kernel: Initialise system trusted keyrings
Oct  8 10:18:18 np0005476733 kernel: Key type blacklist registered
Oct  8 10:18:18 np0005476733 kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Oct  8 10:18:18 np0005476733 kernel: zbud: loaded
Oct  8 10:18:18 np0005476733 kernel: integrity: Platform Keyring initialized
Oct  8 10:18:18 np0005476733 kernel: integrity: Machine keyring initialized
Oct  8 10:18:18 np0005476733 kernel: Freeing initrd memory: 86104K
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_ALG protocol family
Oct  8 10:18:18 np0005476733 kernel: xor: automatically using best checksumming function   avx       
Oct  8 10:18:18 np0005476733 kernel: Key type asymmetric registered
Oct  8 10:18:18 np0005476733 kernel: Asymmetric key parser 'x509' registered
Oct  8 10:18:18 np0005476733 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  8 10:18:18 np0005476733 kernel: io scheduler mq-deadline registered
Oct  8 10:18:18 np0005476733 kernel: io scheduler kyber registered
Oct  8 10:18:18 np0005476733 kernel: io scheduler bfq registered
Oct  8 10:18:18 np0005476733 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  8 10:18:18 np0005476733 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  8 10:18:18 np0005476733 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  8 10:18:18 np0005476733 kernel: ACPI: button: Power Button [PWRF]
Oct  8 10:18:18 np0005476733 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  8 10:18:18 np0005476733 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  8 10:18:18 np0005476733 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  8 10:18:18 np0005476733 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  8 10:18:18 np0005476733 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  8 10:18:18 np0005476733 kernel: Non-volatile memory driver v1.3
Oct  8 10:18:18 np0005476733 kernel: rdac: device handler registered
Oct  8 10:18:18 np0005476733 kernel: hp_sw: device handler registered
Oct  8 10:18:18 np0005476733 kernel: emc: device handler registered
Oct  8 10:18:18 np0005476733 kernel: alua: device handler registered
Oct  8 10:18:18 np0005476733 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  8 10:18:18 np0005476733 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  8 10:18:18 np0005476733 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  8 10:18:18 np0005476733 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  8 10:18:18 np0005476733 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  8 10:18:18 np0005476733 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  8 10:18:18 np0005476733 kernel: usb usb1: Product: UHCI Host Controller
Oct  8 10:18:18 np0005476733 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  8 10:18:18 np0005476733 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  8 10:18:18 np0005476733 kernel: hub 1-0:1.0: USB hub found
Oct  8 10:18:18 np0005476733 kernel: hub 1-0:1.0: 2 ports detected
Oct  8 10:18:18 np0005476733 kernel: usbcore: registered new interface driver usbserial_generic
Oct  8 10:18:18 np0005476733 kernel: usbserial: USB Serial support registered for generic
Oct  8 10:18:18 np0005476733 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  8 10:18:18 np0005476733 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  8 10:18:18 np0005476733 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  8 10:18:18 np0005476733 kernel: mousedev: PS/2 mouse device common for all mice
Oct  8 10:18:18 np0005476733 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  8 10:18:18 np0005476733 kernel: rtc_cmos 00:04: registered as rtc0
Oct  8 10:18:18 np0005476733 kernel: rtc_cmos 00:04: setting system clock to 2025-10-08T14:18:17 UTC (1759933097)
Oct  8 10:18:18 np0005476733 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  8 10:18:18 np0005476733 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  8 10:18:18 np0005476733 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  8 10:18:18 np0005476733 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  8 10:18:18 np0005476733 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  8 10:18:18 np0005476733 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  8 10:18:18 np0005476733 kernel: usbcore: registered new interface driver usbhid
Oct  8 10:18:18 np0005476733 kernel: usbhid: USB HID core driver
Oct  8 10:18:18 np0005476733 kernel: drop_monitor: Initializing network drop monitor service
Oct  8 10:18:18 np0005476733 kernel: Initializing XFRM netlink socket
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_INET6 protocol family
Oct  8 10:18:18 np0005476733 kernel: Segment Routing with IPv6
Oct  8 10:18:18 np0005476733 kernel: NET: Registered PF_PACKET protocol family
Oct  8 10:18:18 np0005476733 kernel: mpls_gso: MPLS GSO support
Oct  8 10:18:18 np0005476733 kernel: IPI shorthand broadcast: enabled
Oct  8 10:18:18 np0005476733 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  8 10:18:18 np0005476733 kernel: AES CTR mode by8 optimization enabled
Oct  8 10:18:18 np0005476733 kernel: sched_clock: Marking stable (1279003910, 141848909)->(1537748690, -116895871)
Oct  8 10:18:18 np0005476733 kernel: registered taskstats version 1
Oct  8 10:18:18 np0005476733 kernel: Loading compiled-in X.509 certificates
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  8 10:18:18 np0005476733 kernel: Demotion targets for Node 0: null
Oct  8 10:18:18 np0005476733 kernel: page_owner is disabled
Oct  8 10:18:18 np0005476733 kernel: Key type .fscrypt registered
Oct  8 10:18:18 np0005476733 kernel: Key type fscrypt-provisioning registered
Oct  8 10:18:18 np0005476733 kernel: Key type big_key registered
Oct  8 10:18:18 np0005476733 kernel: Key type encrypted registered
Oct  8 10:18:18 np0005476733 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  8 10:18:18 np0005476733 kernel: Loading compiled-in module X.509 certificates
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 10:18:18 np0005476733 kernel: ima: Allocated hash algorithm: sha256
Oct  8 10:18:18 np0005476733 kernel: ima: No architecture policies found
Oct  8 10:18:18 np0005476733 kernel: evm: Initialising EVM extended attributes:
Oct  8 10:18:18 np0005476733 kernel: evm: security.selinux
Oct  8 10:18:18 np0005476733 kernel: evm: security.SMACK64 (disabled)
Oct  8 10:18:18 np0005476733 kernel: evm: security.SMACK64EXEC (disabled)
Oct  8 10:18:18 np0005476733 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  8 10:18:18 np0005476733 kernel: evm: security.SMACK64MMAP (disabled)
Oct  8 10:18:18 np0005476733 kernel: evm: security.apparmor (disabled)
Oct  8 10:18:18 np0005476733 kernel: evm: security.ima
Oct  8 10:18:18 np0005476733 kernel: evm: security.capability
Oct  8 10:18:18 np0005476733 kernel: evm: HMAC attrs: 0x1
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  8 10:18:18 np0005476733 kernel: Running certificate verification RSA selftest
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  8 10:18:18 np0005476733 kernel: Running certificate verification ECDSA selftest
Oct  8 10:18:18 np0005476733 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: Manufacturer: QEMU
Oct  8 10:18:18 np0005476733 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  8 10:18:18 np0005476733 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  8 10:18:18 np0005476733 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  8 10:18:18 np0005476733 kernel: clk: Disabling unused clocks
Oct  8 10:18:18 np0005476733 kernel: Freeing unused decrypted memory: 2028K
Oct  8 10:18:18 np0005476733 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  8 10:18:18 np0005476733 kernel: Write protecting the kernel read-only data: 30720k
Oct  8 10:18:18 np0005476733 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  8 10:18:18 np0005476733 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  8 10:18:18 np0005476733 kernel: Run /init as init process
Oct  8 10:18:18 np0005476733 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 10:18:18 np0005476733 systemd: Detected virtualization kvm.
Oct  8 10:18:18 np0005476733 systemd: Detected architecture x86-64.
Oct  8 10:18:18 np0005476733 systemd: Running in initrd.
Oct  8 10:18:18 np0005476733 systemd: No hostname configured, using default hostname.
Oct  8 10:18:18 np0005476733 systemd: Hostname set to <localhost>.
Oct  8 10:18:18 np0005476733 systemd: Initializing machine ID from VM UUID.
Oct  8 10:18:18 np0005476733 systemd: Queued start job for default target Initrd Default Target.
Oct  8 10:18:18 np0005476733 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 10:18:18 np0005476733 systemd: Reached target Local Encrypted Volumes.
Oct  8 10:18:18 np0005476733 systemd: Reached target Initrd /usr File System.
Oct  8 10:18:18 np0005476733 systemd: Reached target Local File Systems.
Oct  8 10:18:18 np0005476733 systemd: Reached target Path Units.
Oct  8 10:18:18 np0005476733 systemd: Reached target Slice Units.
Oct  8 10:18:18 np0005476733 systemd: Reached target Swaps.
Oct  8 10:18:18 np0005476733 systemd: Reached target Timer Units.
Oct  8 10:18:18 np0005476733 systemd: Listening on D-Bus System Message Bus Socket.
Oct  8 10:18:18 np0005476733 systemd: Listening on Journal Socket (/dev/log).
Oct  8 10:18:18 np0005476733 systemd: Listening on Journal Socket.
Oct  8 10:18:18 np0005476733 systemd: Listening on udev Control Socket.
Oct  8 10:18:18 np0005476733 systemd: Listening on udev Kernel Socket.
Oct  8 10:18:18 np0005476733 systemd: Reached target Socket Units.
Oct  8 10:18:18 np0005476733 systemd: Starting Create List of Static Device Nodes...
Oct  8 10:18:18 np0005476733 systemd: Starting Journal Service...
Oct  8 10:18:18 np0005476733 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  8 10:18:18 np0005476733 systemd: Starting Apply Kernel Variables...
Oct  8 10:18:18 np0005476733 systemd: Starting Create System Users...
Oct  8 10:18:18 np0005476733 systemd: Starting Setup Virtual Console...
Oct  8 10:18:18 np0005476733 systemd: Finished Create List of Static Device Nodes.
Oct  8 10:18:18 np0005476733 systemd: Finished Apply Kernel Variables.
Oct  8 10:18:18 np0005476733 systemd: Finished Create System Users.
Oct  8 10:18:18 np0005476733 systemd-journald[311]: Journal started
Oct  8 10:18:18 np0005476733 systemd-journald[311]: Runtime Journal (/run/log/journal/e18df0603a534792ac3f8aebcc82fccc) is 8.0M, max 314.6M, 306.6M free.
Oct  8 10:18:18 np0005476733 systemd-sysusers[315]: Creating group 'users' with GID 100.
Oct  8 10:18:18 np0005476733 systemd-sysusers[315]: Creating group 'dbus' with GID 81.
Oct  8 10:18:18 np0005476733 systemd-sysusers[315]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  8 10:18:18 np0005476733 systemd: Started Journal Service.
Oct  8 10:18:18 np0005476733 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  8 10:18:18 np0005476733 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 10:18:18 np0005476733 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 10:18:18 np0005476733 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 10:18:18 np0005476733 systemd[1]: Finished Setup Virtual Console.
Oct  8 10:18:18 np0005476733 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  8 10:18:18 np0005476733 systemd[1]: Starting dracut cmdline hook...
Oct  8 10:18:18 np0005476733 dracut-cmdline[329]: dracut-9 dracut-057-102.git20250818.el9
Oct  8 10:18:18 np0005476733 dracut-cmdline[329]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 10:18:18 np0005476733 systemd[1]: Finished dracut cmdline hook.
Oct  8 10:18:18 np0005476733 systemd[1]: Starting dracut pre-udev hook...
Oct  8 10:18:18 np0005476733 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  8 10:18:18 np0005476733 kernel: device-mapper: uevent: version 1.0.3
Oct  8 10:18:18 np0005476733 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  8 10:18:18 np0005476733 kernel: RPC: Registered named UNIX socket transport module.
Oct  8 10:18:18 np0005476733 kernel: RPC: Registered udp transport module.
Oct  8 10:18:18 np0005476733 kernel: RPC: Registered tcp transport module.
Oct  8 10:18:18 np0005476733 kernel: RPC: Registered tcp-with-tls transport module.
Oct  8 10:18:18 np0005476733 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  8 10:18:18 np0005476733 rpc.statd[446]: Version 2.5.4 starting
Oct  8 10:18:18 np0005476733 rpc.statd[446]: Initializing NSM state
Oct  8 10:18:18 np0005476733 rpc.idmapd[451]: Setting log level to 0
Oct  8 10:18:19 np0005476733 systemd[1]: Finished dracut pre-udev hook.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 10:18:19 np0005476733 systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 10:18:19 np0005476733 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting dracut pre-trigger hook...
Oct  8 10:18:19 np0005476733 systemd[1]: Finished dracut pre-trigger hook.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting Coldplug All udev Devices...
Oct  8 10:18:19 np0005476733 systemd[1]: Created slice Slice /system/modprobe.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 10:18:19 np0005476733 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 10:18:19 np0005476733 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 10:18:19 np0005476733 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 10:18:19 np0005476733 systemd[1]: Mounting Kernel Configuration File System...
Oct  8 10:18:19 np0005476733 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Network.
Oct  8 10:18:19 np0005476733 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 10:18:19 np0005476733 systemd[1]: Starting dracut initqueue hook...
Oct  8 10:18:19 np0005476733 systemd[1]: Mounted Kernel Configuration File System.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target System Initialization.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Basic System.
Oct  8 10:18:19 np0005476733 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  8 10:18:19 np0005476733 kernel: virtio_blk virtio2: [vda] 251658240 512-byte logical blocks (129 GB/120 GiB)
Oct  8 10:18:19 np0005476733 systemd-udevd[468]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 10:18:19 np0005476733 kernel: vda: vda1
Oct  8 10:18:19 np0005476733 kernel: scsi host0: ata_piix
Oct  8 10:18:19 np0005476733 kernel: scsi host1: ata_piix
Oct  8 10:18:19 np0005476733 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  8 10:18:19 np0005476733 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  8 10:18:19 np0005476733 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Initrd Root Device.
Oct  8 10:18:19 np0005476733 kernel: ata1: found unknown device (class 0)
Oct  8 10:18:19 np0005476733 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  8 10:18:19 np0005476733 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  8 10:18:19 np0005476733 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  8 10:18:19 np0005476733 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  8 10:18:19 np0005476733 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  8 10:18:19 np0005476733 systemd[1]: Finished dracut initqueue hook.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  8 10:18:19 np0005476733 systemd[1]: Reached target Remote File Systems.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting dracut pre-mount hook...
Oct  8 10:18:19 np0005476733 systemd[1]: Finished dracut pre-mount hook.
Oct  8 10:18:19 np0005476733 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  8 10:18:19 np0005476733 systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Oct  8 10:18:19 np0005476733 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 10:18:19 np0005476733 systemd[1]: Mounting /sysroot...
Oct  8 10:18:20 np0005476733 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  8 10:18:20 np0005476733 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  8 10:18:20 np0005476733 kernel: XFS (vda1): Ending clean mount
Oct  8 10:18:20 np0005476733 systemd[1]: Mounted /sysroot.
Oct  8 10:18:20 np0005476733 systemd[1]: Reached target Initrd Root File System.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  8 10:18:20 np0005476733 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  8 10:18:20 np0005476733 systemd[1]: Reached target Initrd File Systems.
Oct  8 10:18:20 np0005476733 systemd[1]: Reached target Initrd Default Target.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting dracut mount hook...
Oct  8 10:18:20 np0005476733 systemd[1]: Finished dracut mount hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  8 10:18:20 np0005476733 rpc.idmapd[451]: exiting on signal 15
Oct  8 10:18:20 np0005476733 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Network.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Timer Units.
Oct  8 10:18:20 np0005476733 systemd[1]: dbus.socket: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Initrd Default Target.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Basic System.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Initrd Root Device.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Initrd /usr File System.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Path Units.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Remote File Systems.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Slice Units.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Socket Units.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target System Initialization.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Local File Systems.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Swaps.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut mount hook.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut pre-mount hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut initqueue hook.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Apply Kernel Variables.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Coldplug All udev Devices.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut pre-trigger hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Setup Virtual Console.
Oct  8 10:18:20 np0005476733 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Closed udev Control Socket.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Closed udev Kernel Socket.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut pre-udev hook.
Oct  8 10:18:20 np0005476733 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped dracut cmdline hook.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting Cleanup udev Database...
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  8 10:18:20 np0005476733 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  8 10:18:20 np0005476733 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Stopped Create System Users.
Oct  8 10:18:20 np0005476733 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  8 10:18:20 np0005476733 systemd[1]: Finished Cleanup udev Database.
Oct  8 10:18:20 np0005476733 systemd[1]: Reached target Switch Root.
Oct  8 10:18:20 np0005476733 systemd[1]: Starting Switch Root...
Oct  8 10:18:20 np0005476733 systemd[1]: Switching root.
Oct  8 10:18:20 np0005476733 systemd-journald[311]: Journal stopped
Oct  8 10:18:21 np0005476733 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  8 10:18:21 np0005476733 kernel: audit: type=1404 audit(1759933100.919:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:18:21 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:18:21 np0005476733 kernel: audit: type=1403 audit(1759933101.070:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  8 10:18:21 np0005476733 systemd: Successfully loaded SELinux policy in 155.230ms.
Oct  8 10:18:21 np0005476733 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 36.310ms.
Oct  8 10:18:21 np0005476733 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 10:18:21 np0005476733 systemd: Detected virtualization kvm.
Oct  8 10:18:21 np0005476733 systemd: Detected architecture x86-64.
Oct  8 10:18:21 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:18:21 np0005476733 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd: Stopped Switch Root.
Oct  8 10:18:21 np0005476733 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  8 10:18:21 np0005476733 systemd: Created slice Slice /system/getty.
Oct  8 10:18:21 np0005476733 systemd: Created slice Slice /system/serial-getty.
Oct  8 10:18:21 np0005476733 systemd: Created slice Slice /system/sshd-keygen.
Oct  8 10:18:21 np0005476733 systemd: Created slice User and Session Slice.
Oct  8 10:18:21 np0005476733 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 10:18:21 np0005476733 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  8 10:18:21 np0005476733 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  8 10:18:21 np0005476733 systemd: Reached target Local Encrypted Volumes.
Oct  8 10:18:21 np0005476733 systemd: Stopped target Switch Root.
Oct  8 10:18:21 np0005476733 systemd: Stopped target Initrd File Systems.
Oct  8 10:18:21 np0005476733 systemd: Stopped target Initrd Root File System.
Oct  8 10:18:21 np0005476733 systemd: Reached target Local Integrity Protected Volumes.
Oct  8 10:18:21 np0005476733 systemd: Reached target Path Units.
Oct  8 10:18:21 np0005476733 systemd: Reached target rpc_pipefs.target.
Oct  8 10:18:21 np0005476733 systemd: Reached target Slice Units.
Oct  8 10:18:21 np0005476733 systemd: Reached target Swaps.
Oct  8 10:18:21 np0005476733 systemd: Reached target Local Verity Protected Volumes.
Oct  8 10:18:21 np0005476733 systemd: Listening on RPCbind Server Activation Socket.
Oct  8 10:18:21 np0005476733 systemd: Reached target RPC Port Mapper.
Oct  8 10:18:21 np0005476733 systemd: Listening on Process Core Dump Socket.
Oct  8 10:18:21 np0005476733 systemd: Listening on initctl Compatibility Named Pipe.
Oct  8 10:18:21 np0005476733 systemd: Listening on udev Control Socket.
Oct  8 10:18:21 np0005476733 systemd: Listening on udev Kernel Socket.
Oct  8 10:18:21 np0005476733 systemd: Mounting Huge Pages File System...
Oct  8 10:18:21 np0005476733 systemd: Mounting POSIX Message Queue File System...
Oct  8 10:18:21 np0005476733 systemd: Mounting Kernel Debug File System...
Oct  8 10:18:21 np0005476733 systemd: Mounting Kernel Trace File System...
Oct  8 10:18:21 np0005476733 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 10:18:21 np0005476733 systemd: Starting Create List of Static Device Nodes...
Oct  8 10:18:21 np0005476733 systemd: Starting Load Kernel Module configfs...
Oct  8 10:18:21 np0005476733 systemd: Starting Load Kernel Module drm...
Oct  8 10:18:21 np0005476733 systemd: Starting Load Kernel Module efi_pstore...
Oct  8 10:18:21 np0005476733 systemd: Starting Load Kernel Module fuse...
Oct  8 10:18:21 np0005476733 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  8 10:18:21 np0005476733 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd: Stopped File System Check on Root Device.
Oct  8 10:18:21 np0005476733 systemd: Stopped Journal Service.
Oct  8 10:18:21 np0005476733 systemd: Starting Journal Service...
Oct  8 10:18:21 np0005476733 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  8 10:18:21 np0005476733 systemd: Starting Generate network units from Kernel command line...
Oct  8 10:18:21 np0005476733 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 10:18:21 np0005476733 systemd: Starting Remount Root and Kernel File Systems...
Oct  8 10:18:21 np0005476733 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  8 10:18:21 np0005476733 systemd: Starting Apply Kernel Variables...
Oct  8 10:18:21 np0005476733 kernel: fuse: init (API version 7.37)
Oct  8 10:18:21 np0005476733 systemd: Starting Coldplug All udev Devices...
Oct  8 10:18:21 np0005476733 systemd: Mounted Huge Pages File System.
Oct  8 10:18:21 np0005476733 systemd: Mounted POSIX Message Queue File System.
Oct  8 10:18:21 np0005476733 systemd: Mounted Kernel Debug File System.
Oct  8 10:18:21 np0005476733 systemd: Mounted Kernel Trace File System.
Oct  8 10:18:21 np0005476733 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  8 10:18:21 np0005476733 systemd: Finished Create List of Static Device Nodes.
Oct  8 10:18:21 np0005476733 kernel: ACPI: bus type drm_connector registered
Oct  8 10:18:21 np0005476733 systemd: modprobe@configfs.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd: Finished Load Kernel Module configfs.
Oct  8 10:18:21 np0005476733 systemd-journald[682]: Journal started
Oct  8 10:18:21 np0005476733 systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 314.6M, 306.6M free.
Oct  8 10:18:21 np0005476733 systemd[1]: Queued start job for default target Multi-User System.
Oct  8 10:18:21 np0005476733 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd: Started Journal Service.
Oct  8 10:18:21 np0005476733 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Load Kernel Module drm.
Oct  8 10:18:21 np0005476733 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  8 10:18:21 np0005476733 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Load Kernel Module fuse.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Generate network units from Kernel command line.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  8 10:18:21 np0005476733 systemd[1]: Mounting FUSE Control File System...
Oct  8 10:18:21 np0005476733 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 10:18:21 np0005476733 systemd[1]: Starting Rebuild Hardware Database...
Oct  8 10:18:21 np0005476733 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  8 10:18:21 np0005476733 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  8 10:18:21 np0005476733 systemd[1]: Starting Load/Save OS Random Seed...
Oct  8 10:18:21 np0005476733 systemd[1]: Starting Create System Users...
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Apply Kernel Variables.
Oct  8 10:18:21 np0005476733 systemd[1]: Mounted FUSE Control File System.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 10:18:21 np0005476733 systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 314.6M, 306.6M free.
Oct  8 10:18:21 np0005476733 systemd-journald[682]: Received client request to flush runtime journal.
Oct  8 10:18:21 np0005476733 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Load/Save OS Random Seed.
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Create System Users.
Oct  8 10:18:22 np0005476733 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 10:18:22 np0005476733 systemd[1]: Reached target Preparation for Local File Systems.
Oct  8 10:18:22 np0005476733 systemd[1]: Reached target Local File Systems.
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  8 10:18:22 np0005476733 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  8 10:18:22 np0005476733 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  8 10:18:22 np0005476733 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Automatic Boot Loader Update...
Oct  8 10:18:22 np0005476733 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 10:18:22 np0005476733 bootctl[700]: Couldn't find EFI system partition, skipping.
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Automatic Boot Loader Update.
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Security Auditing Service...
Oct  8 10:18:22 np0005476733 systemd[1]: Starting RPC Bind...
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Rebuild Journal Catalog...
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Rebuild Journal Catalog.
Oct  8 10:18:22 np0005476733 auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  8 10:18:22 np0005476733 auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  8 10:18:22 np0005476733 systemd[1]: Started RPC Bind.
Oct  8 10:18:22 np0005476733 augenrules[711]: /sbin/augenrules: No change
Oct  8 10:18:22 np0005476733 augenrules[726]: No rules
Oct  8 10:18:22 np0005476733 augenrules[726]: enabled 1
Oct  8 10:18:22 np0005476733 augenrules[726]: failure 1
Oct  8 10:18:22 np0005476733 augenrules[726]: pid 706
Oct  8 10:18:22 np0005476733 augenrules[726]: rate_limit 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_limit 8192
Oct  8 10:18:22 np0005476733 augenrules[726]: lost 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time 60000
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time_actual 0
Oct  8 10:18:22 np0005476733 augenrules[726]: enabled 1
Oct  8 10:18:22 np0005476733 augenrules[726]: failure 1
Oct  8 10:18:22 np0005476733 augenrules[726]: pid 706
Oct  8 10:18:22 np0005476733 augenrules[726]: rate_limit 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_limit 8192
Oct  8 10:18:22 np0005476733 augenrules[726]: lost 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog 3
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time 60000
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time_actual 0
Oct  8 10:18:22 np0005476733 augenrules[726]: enabled 1
Oct  8 10:18:22 np0005476733 augenrules[726]: failure 1
Oct  8 10:18:22 np0005476733 augenrules[726]: pid 706
Oct  8 10:18:22 np0005476733 augenrules[726]: rate_limit 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_limit 8192
Oct  8 10:18:22 np0005476733 augenrules[726]: lost 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog 0
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time 60000
Oct  8 10:18:22 np0005476733 augenrules[726]: backlog_wait_time_actual 0
Oct  8 10:18:22 np0005476733 systemd[1]: Started Security Auditing Service.
Oct  8 10:18:22 np0005476733 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  8 10:18:22 np0005476733 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  8 10:18:23 np0005476733 systemd[1]: Finished Rebuild Hardware Database.
Oct  8 10:18:23 np0005476733 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 10:18:23 np0005476733 systemd-udevd[735]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 10:18:23 np0005476733 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 10:18:23 np0005476733 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 10:18:23 np0005476733 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 10:18:23 np0005476733 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 10:18:23 np0005476733 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  8 10:18:24 np0005476733 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  8 10:18:24 np0005476733 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  8 10:18:24 np0005476733 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  8 10:18:24 np0005476733 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  8 10:18:24 np0005476733 systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 10:18:24 np0005476733 kernel: kvm_amd: TSC scaling supported
Oct  8 10:18:24 np0005476733 kernel: kvm_amd: Nested Virtualization enabled
Oct  8 10:18:24 np0005476733 kernel: kvm_amd: Nested Paging enabled
Oct  8 10:18:24 np0005476733 kernel: kvm_amd: LBR virtualization supported
Oct  8 10:18:24 np0005476733 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  8 10:18:24 np0005476733 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  8 10:18:24 np0005476733 kernel: Console: switching to colour dummy device 80x25
Oct  8 10:18:24 np0005476733 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  8 10:18:24 np0005476733 kernel: [drm] features: -context_init
Oct  8 10:18:24 np0005476733 kernel: [drm] number of scanouts: 1
Oct  8 10:18:24 np0005476733 kernel: [drm] number of cap sets: 0
Oct  8 10:18:24 np0005476733 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  8 10:18:24 np0005476733 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  8 10:18:24 np0005476733 kernel: Console: switching to colour frame buffer device 128x48
Oct  8 10:18:24 np0005476733 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  8 10:18:24 np0005476733 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  8 10:18:24 np0005476733 systemd[1]: Starting Update is Completed...
Oct  8 10:18:24 np0005476733 systemd[1]: Finished Update is Completed.
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target System Initialization.
Oct  8 10:18:24 np0005476733 systemd[1]: Started dnf makecache --timer.
Oct  8 10:18:24 np0005476733 systemd[1]: Started Daily rotation of log files.
Oct  8 10:18:24 np0005476733 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target Timer Units.
Oct  8 10:18:24 np0005476733 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  8 10:18:24 np0005476733 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target Socket Units.
Oct  8 10:18:24 np0005476733 systemd[1]: Starting D-Bus System Message Bus...
Oct  8 10:18:24 np0005476733 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 10:18:24 np0005476733 systemd[1]: Started D-Bus System Message Bus.
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target Basic System.
Oct  8 10:18:24 np0005476733 dbus-broker-lau[816]: Ready
Oct  8 10:18:24 np0005476733 systemd[1]: Starting NTP client/server...
Oct  8 10:18:24 np0005476733 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  8 10:18:24 np0005476733 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  8 10:18:24 np0005476733 systemd[1]: Starting IPv4 firewall with iptables...
Oct  8 10:18:24 np0005476733 systemd[1]: Started irqbalance daemon.
Oct  8 10:18:24 np0005476733 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  8 10:18:24 np0005476733 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 10:18:24 np0005476733 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 10:18:24 np0005476733 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target sshd-keygen.target.
Oct  8 10:18:24 np0005476733 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  8 10:18:24 np0005476733 systemd[1]: Reached target User and Group Name Lookups.
Oct  8 10:18:24 np0005476733 systemd[1]: Starting User Login Management...
Oct  8 10:18:24 np0005476733 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  8 10:18:24 np0005476733 chronyd[835]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  8 10:18:24 np0005476733 chronyd[835]: Loaded 0 symmetric keys
Oct  8 10:18:24 np0005476733 chronyd[835]: Using right/UTC timezone to obtain leap second data
Oct  8 10:18:24 np0005476733 chronyd[835]: Loaded seccomp filter (level 2)
Oct  8 10:18:24 np0005476733 systemd[1]: Started NTP client/server.
Oct  8 10:18:24 np0005476733 systemd-logind[827]: New seat seat0.
Oct  8 10:18:24 np0005476733 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  8 10:18:24 np0005476733 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  8 10:18:24 np0005476733 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  8 10:18:24 np0005476733 systemd[1]: Started User Login Management.
Oct  8 10:18:24 np0005476733 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  8 10:18:25 np0005476733 iptables.init[821]: iptables: Applying firewall rules: [  OK  ]
Oct  8 10:18:25 np0005476733 systemd[1]: Finished IPv4 firewall with iptables.
Oct  8 10:18:25 np0005476733 cloud-init[844]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 08 Oct 2025 14:18:25 +0000. Up 9.24 seconds.
Oct  8 10:18:25 np0005476733 systemd[1]: run-cloud\x2dinit-tmp-tmpadw1nfkg.mount: Deactivated successfully.
Oct  8 10:18:25 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 10:18:25 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 10:18:25 np0005476733 systemd-hostnamed[858]: Hostname set to <np0005476733.novalocal> (static)
Oct  8 10:18:26 np0005476733 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  8 10:18:26 np0005476733 systemd[1]: Reached target Preparation for Network.
Oct  8 10:18:26 np0005476733 systemd[1]: Starting Network Manager...
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1470] NetworkManager (version 1.54.1-1.el9) is starting... (boot:333cb165-ebfa-4717-89b0-51e1063c215b)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1476] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1625] manager[0x560616a24080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1673] hostname: hostname: using hostnamed
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1673] hostname: static hostname changed from (none) to "np0005476733.novalocal"
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1677] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1789] manager[0x560616a24080]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1790] manager[0x560616a24080]: rfkill: WWAN hardware radio set enabled
Oct  8 10:18:26 np0005476733 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1859] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1860] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1860] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1861] manager: Networking is enabled by state file
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1862] settings: Loaded settings plugin: keyfile (internal)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1920] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1969] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.1998] dhcp: init: Using DHCP client 'internal'
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2000] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2013] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2026] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2033] device (lo): Activation: starting connection 'lo' (f95091c6-00bf-46c5-85a3-3d1e635fc62f)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2043] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2045] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2081] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2086] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2088] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2091] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2093] device (eth0): carrier: link connected
Oct  8 10:18:26 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2096] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2103] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2112] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2117] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2117] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2120] manager: NetworkManager state is now CONNECTING
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2121] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 systemd[1]: Started Network Manager.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2128] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2131] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:18:26 np0005476733 systemd[1]: Reached target Network.
Oct  8 10:18:26 np0005476733 systemd[1]: Starting Network Manager Wait Online...
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2191] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2198] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2216] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  8 10:18:26 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2368] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2371] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2378] device (lo): Activation: successful, device activated.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2385] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2387] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2391] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2394] device (eth0): Activation: successful, device activated.
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2399] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 10:18:26 np0005476733 NetworkManager[862]: <info>  [1759933106.2402] manager: startup complete
Oct  8 10:18:26 np0005476733 systemd[1]: Finished Network Manager Wait Online.
Oct  8 10:18:26 np0005476733 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  8 10:18:26 np0005476733 systemd[1]: Starting Cloud-init: Network Stage...
Oct  8 10:18:26 np0005476733 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 10:18:26 np0005476733 systemd[1]: Reached target NFS client services.
Oct  8 10:18:26 np0005476733 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 10:18:26 np0005476733 systemd[1]: Reached target Remote File Systems.
Oct  8 10:18:26 np0005476733 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 10:18:26 np0005476733 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 08 Oct 2025 14:18:26 +0000. Up 10.29 seconds.
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.224         | 255.255.255.0 | global | fa:16:3e:f6:70:48 |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fef6:7048/64 |       .       |  link  | fa:16:3e:f6:70:48 |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct  8 10:18:26 np0005476733 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 10:18:28 np0005476733 cloud-init[922]: Generating public/private rsa key pair.
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key fingerprint is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: SHA256:f+MF+WzkDu/5Qlx6nl7di1j//4XTRtD0ndU6auzMrSQ root@np0005476733.novalocal
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key's randomart image is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: +---[RSA 3072]----+
Oct  8 10:18:28 np0005476733 cloud-init[922]: |                +|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |               o*|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |              .o+|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |             .o..|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |        S  .oo.+.|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |         .  +*+++|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |          E==+OoO|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |           =*B===|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |           .o+**O|
Oct  8 10:18:28 np0005476733 cloud-init[922]: +----[SHA256]-----+
Oct  8 10:18:28 np0005476733 cloud-init[922]: Generating public/private ecdsa key pair.
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key fingerprint is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: SHA256:Ww2vPEKKCMRNkNoRkIaGvsFODPdyfcYeV3VthX22yd0 root@np0005476733.novalocal
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key's randomart image is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: +---[ECDSA 256]---+
Oct  8 10:18:28 np0005476733 cloud-init[922]: |+++o         ..o=|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |*+=         . ..*|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |O= + . .  ..  .o*|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |o*o o . = .+   +E|
Oct  8 10:18:28 np0005476733 cloud-init[922]: |+ oo   +So. o    |
Oct  8 10:18:28 np0005476733 cloud-init[922]: | + . . o.+ .     |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |  . . . o +      |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |         . .     |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |                 |
Oct  8 10:18:28 np0005476733 cloud-init[922]: +----[SHA256]-----+
Oct  8 10:18:28 np0005476733 cloud-init[922]: Generating public/private ed25519 key pair.
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  8 10:18:28 np0005476733 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key fingerprint is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: SHA256:vr9yKKKsrD+52aWcFp+dTE7lk5EfwIPUgbzSfrTeBVo root@np0005476733.novalocal
Oct  8 10:18:28 np0005476733 cloud-init[922]: The key's randomart image is:
Oct  8 10:18:28 np0005476733 cloud-init[922]: +--[ED25519 256]--+
Oct  8 10:18:28 np0005476733 cloud-init[922]: |       o.=..     |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |        + =      |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |       . . +     |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |      . o = E    |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |       oS+ B o   |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |    .  .+ B . .  |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |   . o.B.= o .   |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |..o+o++ B.o .    |
Oct  8 10:18:28 np0005476733 cloud-init[922]: |++*==. ..+o.     |
Oct  8 10:18:28 np0005476733 cloud-init[922]: +----[SHA256]-----+
Oct  8 10:18:28 np0005476733 systemd[1]: Finished Cloud-init: Network Stage.
Oct  8 10:18:28 np0005476733 systemd[1]: Reached target Cloud-config availability.
Oct  8 10:18:28 np0005476733 systemd[1]: Reached target Network is Online.
Oct  8 10:18:28 np0005476733 systemd[1]: Starting Cloud-init: Config Stage...
Oct  8 10:18:28 np0005476733 systemd[1]: Starting Notify NFS peers of a restart...
Oct  8 10:18:28 np0005476733 systemd[1]: Starting System Logging Service...
Oct  8 10:18:28 np0005476733 systemd[1]: Starting OpenSSH server daemon...
Oct  8 10:18:28 np0005476733 systemd[1]: Starting Permit User Sessions...
Oct  8 10:18:28 np0005476733 sm-notify[1004]: Version 2.5.4 starting
Oct  8 10:18:28 np0005476733 systemd[1]: Started Notify NFS peers of a restart.
Oct  8 10:18:28 np0005476733 systemd[1]: Started OpenSSH server daemon.
Oct  8 10:18:28 np0005476733 systemd[1]: Finished Permit User Sessions.
Oct  8 10:18:28 np0005476733 systemd[1]: Started Command Scheduler.
Oct  8 10:18:28 np0005476733 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Oct  8 10:18:28 np0005476733 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  8 10:18:28 np0005476733 systemd[1]: Started Getty on tty1.
Oct  8 10:18:28 np0005476733 systemd[1]: Started Serial Getty on ttyS0.
Oct  8 10:18:28 np0005476733 systemd[1]: Reached target Login Prompts.
Oct  8 10:18:28 np0005476733 systemd[1]: Started System Logging Service.
Oct  8 10:18:28 np0005476733 systemd[1]: Reached target Multi-User System.
Oct  8 10:18:28 np0005476733 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  8 10:18:28 np0005476733 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  8 10:18:28 np0005476733 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  8 10:18:28 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 10:18:28 np0005476733 cloud-init[1017]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 08 Oct 2025 14:18:28 +0000. Up 12.59 seconds.
Oct  8 10:18:28 np0005476733 systemd[1]: Finished Cloud-init: Config Stage.
Oct  8 10:18:28 np0005476733 systemd[1]: Starting Cloud-init: Final Stage...
Oct  8 10:18:29 np0005476733 cloud-init[1028]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 08 Oct 2025 14:18:29 +0000. Up 13.00 seconds.
Oct  8 10:18:29 np0005476733 cloud-init[1035]: #############################################################
Oct  8 10:18:29 np0005476733 cloud-init[1036]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  8 10:18:29 np0005476733 cloud-init[1038]: 256 SHA256:Ww2vPEKKCMRNkNoRkIaGvsFODPdyfcYeV3VthX22yd0 root@np0005476733.novalocal (ECDSA)
Oct  8 10:18:29 np0005476733 cloud-init[1040]: 256 SHA256:vr9yKKKsrD+52aWcFp+dTE7lk5EfwIPUgbzSfrTeBVo root@np0005476733.novalocal (ED25519)
Oct  8 10:18:29 np0005476733 cloud-init[1042]: 3072 SHA256:f+MF+WzkDu/5Qlx6nl7di1j//4XTRtD0ndU6auzMrSQ root@np0005476733.novalocal (RSA)
Oct  8 10:18:29 np0005476733 cloud-init[1043]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  8 10:18:29 np0005476733 cloud-init[1044]: #############################################################
Oct  8 10:18:29 np0005476733 cloud-init[1028]: Cloud-init v. 24.4-7.el9 finished at Wed, 08 Oct 2025 14:18:29 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.16 seconds
Oct  8 10:18:29 np0005476733 systemd[1]: Finished Cloud-init: Final Stage.
Oct  8 10:18:29 np0005476733 systemd[1]: Reached target Cloud-init target.
Oct  8 10:18:29 np0005476733 systemd[1]: Startup finished in 1.733s (kernel) + 2.896s (initrd) + 8.613s (userspace) = 13.243s.
Oct  8 10:18:31 np0005476733 chronyd[835]: Selected source 198.181.199.82 (2.centos.pool.ntp.org)
Oct  8 10:18:31 np0005476733 chronyd[835]: System clock TAI offset set to 37 seconds
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 35 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 35 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 33 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 33 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 31 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 28 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 34 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 34 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 32 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 30 affinity is now unmanaged
Oct  8 10:18:35 np0005476733 irqbalance[822]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  8 10:18:35 np0005476733 irqbalance[822]: IRQ 29 affinity is now unmanaged
Oct  8 10:18:36 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:18:44 np0005476733 systemd[1]: Created slice User Slice of UID 1000.
Oct  8 10:18:44 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  8 10:18:44 np0005476733 systemd-logind[827]: New session 1 of user zuul.
Oct  8 10:18:44 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  8 10:18:44 np0005476733 systemd[1]: Starting User Manager for UID 1000...
Oct  8 10:18:44 np0005476733 systemd[1058]: Queued start job for default target Main User Target.
Oct  8 10:18:44 np0005476733 systemd[1058]: Created slice User Application Slice.
Oct  8 10:18:44 np0005476733 systemd[1058]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 10:18:44 np0005476733 systemd[1058]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 10:18:44 np0005476733 systemd[1058]: Reached target Paths.
Oct  8 10:18:44 np0005476733 systemd[1058]: Reached target Timers.
Oct  8 10:18:44 np0005476733 systemd[1058]: Starting D-Bus User Message Bus Socket...
Oct  8 10:18:44 np0005476733 systemd[1058]: Starting Create User's Volatile Files and Directories...
Oct  8 10:18:44 np0005476733 systemd[1058]: Listening on D-Bus User Message Bus Socket.
Oct  8 10:18:44 np0005476733 systemd[1058]: Reached target Sockets.
Oct  8 10:18:44 np0005476733 systemd[1058]: Finished Create User's Volatile Files and Directories.
Oct  8 10:18:44 np0005476733 systemd[1058]: Reached target Basic System.
Oct  8 10:18:44 np0005476733 systemd[1058]: Reached target Main User Target.
Oct  8 10:18:44 np0005476733 systemd[1058]: Startup finished in 142ms.
Oct  8 10:18:44 np0005476733 systemd[1]: Started User Manager for UID 1000.
Oct  8 10:18:44 np0005476733 systemd[1]: Started Session 1 of User zuul.
Oct  8 10:18:45 np0005476733 python3[1141]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:18:48 np0005476733 python3[1169]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:18:54 np0005476733 python3[1227]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:18:55 np0005476733 python3[1267]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  8 10:18:56 np0005476733 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 10:18:57 np0005476733 python3[1295]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+CQgbbTZHZ6mzBohFL1QKU4VJSpBcdT/QLnLVjUKeTYmGqW4Qb1EtkhY8IB2Esk6rKafeGmj6uOdH6zH11aaeH8HmcVTB6CGd/O26v2u0/IpTB6pB1zWMCVR2f54iXa7gyRhpZrXqJOL6DuWagDZZ+Nqeh/fmwLMsPsq9QpvudNCzL7yswJHB5PKZ+NGrX0/zQWZ/tJGLVlmj6G4WigWcOXBGJGr70e8Vv21KkbsJ6jV1C6ScspGmotH8tU44+X0Ryq37uqAYrkxiP7q/Td+II8SAgkvekEvkU8r2iJy/CU4oF6+gshyNDI2TIS6Oz860RFvqPi9fSBogEQD9zxuelJ86drjCYoxYINRwc2rxz0jRZGaJVTSOudIMUUMTjVLT7XYJU6Qa6yRfvnpOmxNrWW5gbKgTVzcb7t6ErVHFFLW5nDBjQiKkqGZXT9F3qhNunBb/KkJdXxQs/m/sH4GglvA2eyIzZkW0OUvhI9Up32fNucTkkg5GnSEgyE1U0ck= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:18:58 np0005476733 python3[1319]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:18:58 np0005476733 python3[1418]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:18:59 np0005476733 python3[1489]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759933138.5469248-230-210934858536236/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=04262f62e9a94253abbd759c6dda6612_id_rsa follow=False checksum=68d8b0f5f8212700cb9214576f122f5a01608c6f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:00 np0005476733 python3[1612]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:00 np0005476733 python3[1683]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759933139.6160371-276-245793341604012/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=04262f62e9a94253abbd759c6dda6612_id_rsa.pub follow=False checksum=27117fd782c88f1b783ca69276bc7c31cb7b8c94 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:01 np0005476733 python3[1731]: ansible-ping Invoked with data=pong
Oct  8 10:19:02 np0005476733 python3[1755]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:19:05 np0005476733 python3[1813]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  8 10:19:06 np0005476733 python3[1845]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:07 np0005476733 python3[1869]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:07 np0005476733 python3[1893]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:08 np0005476733 python3[1917]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:08 np0005476733 python3[1941]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:08 np0005476733 python3[1965]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:10 np0005476733 python3[1991]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:11 np0005476733 python3[2069]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:12 np0005476733 python3[2142]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759933151.2222717-27-249852763907454/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:13 np0005476733 python3[2190]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:13 np0005476733 python3[2214]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:13 np0005476733 python3[2238]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:14 np0005476733 python3[2262]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:14 np0005476733 python3[2286]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:14 np0005476733 python3[2310]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:14 np0005476733 python3[2334]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:15 np0005476733 python3[2358]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:15 np0005476733 python3[2382]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:15 np0005476733 python3[2406]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:16 np0005476733 python3[2430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:16 np0005476733 python3[2454]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:16 np0005476733 python3[2478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:16 np0005476733 python3[2502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:17 np0005476733 python3[2526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:17 np0005476733 python3[2550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:17 np0005476733 python3[2574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:18 np0005476733 python3[2598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:18 np0005476733 python3[2622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:18 np0005476733 python3[2646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:18 np0005476733 python3[2670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:19 np0005476733 python3[2694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:19 np0005476733 python3[2718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:19 np0005476733 python3[2742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:20 np0005476733 python3[2766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:20 np0005476733 python3[2790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:19:22 np0005476733 python3[2816]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  8 10:19:22 np0005476733 systemd[1]: Starting Time & Date Service...
Oct  8 10:19:22 np0005476733 systemd[1]: Started Time & Date Service.
Oct  8 10:19:22 np0005476733 systemd-timedated[2818]: Changed time zone to 'UTC' (UTC).
Oct  8 10:19:22 np0005476733 python3[2847]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:23 np0005476733 python3[2923]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:23 np0005476733 python3[2994]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759933163.1048176-204-214095020364472/source _original_basename=tmpvkiaakxg follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:24 np0005476733 python3[3094]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:24 np0005476733 python3[3165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759933164.0690255-243-30207299427607/source _original_basename=tmp7otbyh63 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:25 np0005476733 python3[3267]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:25 np0005476733 python3[3340]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759933165.3188732-308-116845308315741/source _original_basename=tmpy111__0z follow=False checksum=332c94ac911d053598365a4ff7b72c4143f36dd6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:26 np0005476733 python3[3388]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:19:26 np0005476733 python3[3414]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:19:27 np0005476733 python3[3494]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:19:27 np0005476733 python3[3567]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759933167.203094-364-142610098498754/source _original_basename=tmpzvv4hl2a follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:28 np0005476733 python3[3618]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-e42e-3584-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:19:29 np0005476733 python3[3646]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-e42e-3584-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  8 10:19:30 np0005476733 python3[3674]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:49 np0005476733 python3[3700]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:19:52 np0005476733 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 4 [mem 0x440000000-0x440003fff 64bit pref]: assigned
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  8 10:20:48 np0005476733 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  8 10:20:48 np0005476733 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7176] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 10:20:48 np0005476733 systemd-udevd[3703]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7315] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7336] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7339] device (eth1): carrier: link connected
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7340] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7345] policy: auto-activating connection 'Wired connection 1' (5ec6a8e8-7a4f-32f6-a4d8-d91859011c0c)
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7347] device (eth1): Activation: starting connection 'Wired connection 1' (5ec6a8e8-7a4f-32f6-a4d8-d91859011c0c)
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7348] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7350] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7353] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:20:48 np0005476733 NetworkManager[862]: <info>  [1759933248.7358] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:20:48 np0005476733 systemd[1058]: Starting Mark boot as successful...
Oct  8 10:20:48 np0005476733 systemd[1058]: Finished Mark boot as successful.
Oct  8 10:20:49 np0005476733 systemd-logind[827]: Session 1 logged out. Waiting for processes to exit.
Oct  8 10:20:49 np0005476733 systemd-logind[827]: New session 3 of user zuul.
Oct  8 10:20:49 np0005476733 systemd[1]: Started Session 3 of User zuul.
Oct  8 10:20:49 np0005476733 python3[3735]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-7b46-c241-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:20:57 np0005476733 python3[3816]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:20:57 np0005476733 python3[3889]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759933256.626193-154-140045155283497/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=7eb06a3aa70886879428718bae94faa29bbc81fd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:20:57 np0005476733 python3[3939]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:20:58 np0005476733 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  8 10:20:58 np0005476733 systemd[1]: Stopped Network Manager Wait Online.
Oct  8 10:20:58 np0005476733 systemd[1]: Stopping Network Manager Wait Online...
Oct  8 10:20:58 np0005476733 systemd[1]: Stopping Network Manager...
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0390] caught SIGTERM, shutting down normally.
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0406] dhcp4 (eth0): canceled DHCP transaction
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0406] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0407] dhcp4 (eth0): state changed no lease
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0412] manager: NetworkManager state is now CONNECTING
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0471] dhcp4 (eth1): canceled DHCP transaction
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0471] dhcp4 (eth1): state changed no lease
Oct  8 10:20:58 np0005476733 NetworkManager[862]: <info>  [1759933258.0522] exiting (success)
Oct  8 10:20:58 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:20:58 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:20:58 np0005476733 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  8 10:20:58 np0005476733 systemd[1]: Stopped Network Manager.
Oct  8 10:20:58 np0005476733 systemd[1]: Starting Network Manager...
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.1235] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:333cb165-ebfa-4717-89b0-51e1063c215b)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.1239] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.1317] manager[0x561aaee4a070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 10:20:58 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 10:20:58 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2137] hostname: hostname: using hostnamed
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2138] hostname: static hostname changed from (none) to "np0005476733.novalocal"
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2144] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2151] manager[0x561aaee4a070]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2151] manager[0x561aaee4a070]: rfkill: WWAN hardware radio set enabled
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2191] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2191] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2192] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2192] manager: Networking is enabled by state file
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2195] settings: Loaded settings plugin: keyfile (internal)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2201] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2231] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2244] dhcp: init: Using DHCP client 'internal'
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2247] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2253] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2257] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2266] device (lo): Activation: starting connection 'lo' (f95091c6-00bf-46c5-85a3-3d1e635fc62f)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2273] device (eth0): carrier: link connected
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2278] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2283] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2283] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2291] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2298] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2304] device (eth1): carrier: link connected
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2309] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2313] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (5ec6a8e8-7a4f-32f6-a4d8-d91859011c0c) (indicated)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2314] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2319] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2325] device (eth1): Activation: starting connection 'Wired connection 1' (5ec6a8e8-7a4f-32f6-a4d8-d91859011c0c)
Oct  8 10:20:58 np0005476733 systemd[1]: Started Network Manager.
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2333] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2351] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2357] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2362] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2368] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2377] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2383] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2390] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2397] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2411] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2416] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2429] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2434] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2459] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 systemd[1]: Starting Network Manager Wait Online...
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2466] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2476] device (lo): Activation: successful, device activated.
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2489] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2500] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2600] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2624] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2626] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2632] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2639] device (eth0): Activation: successful, device activated.
Oct  8 10:20:58 np0005476733 NetworkManager[3949]: <info>  [1759933258.2648] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 10:20:58 np0005476733 python3[4023]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-7b46-c241-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:21:08 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:21:28 np0005476733 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.2779] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 10:21:43 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:21:43 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3053] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3055] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3067] device (eth1): Activation: successful, device activated.
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3075] manager: startup complete
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3077] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <warn>  [1759933303.3084] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3093] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 systemd[1]: Finished Network Manager Wait Online.
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3181] dhcp4 (eth1): canceled DHCP transaction
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3182] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3182] dhcp4 (eth1): state changed no lease
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3196] policy: auto-activating connection 'ci-private-network' (741aa56c-7d41-58ab-9957-86492a3b919f)
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3201] device (eth1): Activation: starting connection 'ci-private-network' (741aa56c-7d41-58ab-9957-86492a3b919f)
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3201] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3204] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3210] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3218] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3303] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3307] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:21:43 np0005476733 NetworkManager[3949]: <info>  [1759933303.3317] device (eth1): Activation: successful, device activated.
Oct  8 10:21:53 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:21:58 np0005476733 systemd[1]: session-3.scope: Deactivated successfully.
Oct  8 10:21:58 np0005476733 systemd[1]: session-3.scope: Consumed 1.764s CPU time.
Oct  8 10:21:58 np0005476733 systemd-logind[827]: Session 3 logged out. Waiting for processes to exit.
Oct  8 10:21:58 np0005476733 systemd-logind[827]: Removed session 3.
Oct  8 10:22:01 np0005476733 systemd-logind[827]: New session 4 of user zuul.
Oct  8 10:22:01 np0005476733 systemd[1]: Started Session 4 of User zuul.
Oct  8 10:22:02 np0005476733 python3[4133]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:22:02 np0005476733 python3[4206]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759933321.887347-312-205070136103383/source _original_basename=tmp5xo_arze follow=False checksum=bb4bb8ce79ff890cdceb190ab64a5ed2ce6a7881 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:22:05 np0005476733 systemd[1]: session-4.scope: Deactivated successfully.
Oct  8 10:22:05 np0005476733 systemd-logind[827]: Session 4 logged out. Waiting for processes to exit.
Oct  8 10:22:05 np0005476733 systemd-logind[827]: Removed session 4.
Oct  8 10:24:22 np0005476733 systemd[1058]: Created slice User Background Tasks Slice.
Oct  8 10:24:22 np0005476733 systemd[1058]: Starting Cleanup of User's Temporary Files and Directories...
Oct  8 10:24:22 np0005476733 systemd[1058]: Finished Cleanup of User's Temporary Files and Directories.
Oct  8 10:27:35 np0005476733 systemd-logind[827]: New session 5 of user zuul.
Oct  8 10:27:35 np0005476733 systemd[1]: Started Session 5 of User zuul.
Oct  8 10:27:35 np0005476733 python3[4267]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-fe02-cfa8-000000001cf3-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:35 np0005476733 python3[4295]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:27:36 np0005476733 python3[4322]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:27:36 np0005476733 python3[4348]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:27:36 np0005476733 python3[4374]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:27:37 np0005476733 python3[4400]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:27:37 np0005476733 python3[4400]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  8 10:27:38 np0005476733 python3[4426]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 10:27:38 np0005476733 systemd[1]: Reloading.
Oct  8 10:27:38 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:27:40 np0005476733 python3[4482]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  8 10:27:40 np0005476733 python3[4508]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:40 np0005476733 python3[4536]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:41 np0005476733 python3[4564]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:41 np0005476733 python3[4592]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:41 np0005476733 python3[4619]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-fe02-cfa8-000000001cf9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:27:42 np0005476733 python3[4649]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:27:45 np0005476733 systemd[1]: session-5.scope: Deactivated successfully.
Oct  8 10:27:45 np0005476733 systemd[1]: session-5.scope: Consumed 3.431s CPU time.
Oct  8 10:27:45 np0005476733 systemd-logind[827]: Session 5 logged out. Waiting for processes to exit.
Oct  8 10:27:45 np0005476733 systemd-logind[827]: Removed session 5.
Oct  8 10:27:46 np0005476733 systemd-logind[827]: New session 6 of user zuul.
Oct  8 10:27:46 np0005476733 systemd[1]: Started Session 6 of User zuul.
Oct  8 10:27:47 np0005476733 python3[4683]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  8 10:28:04 np0005476733 kernel: SELinux:  Converting 363 SID table entries...
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:28:04 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  Converting 363 SID table entries...
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:28:14 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  Converting 363 SID table entries...
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:28:23 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:28:25 np0005476733 setsebool[4750]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  8 10:28:25 np0005476733 setsebool[4750]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  8 10:28:36 np0005476733 kernel: SELinux:  Converting 366 SID table entries...
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:28:36 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:28:55 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  8 10:28:55 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:28:55 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:28:55 np0005476733 systemd[1]: Reloading.
Oct  8 10:28:55 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:28:55 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:28:56 np0005476733 systemd[1]: Starting PackageKit Daemon...
Oct  8 10:28:56 np0005476733 systemd[1]: Starting Authorization Manager...
Oct  8 10:28:56 np0005476733 polkitd[6223]: Started polkitd version 0.117
Oct  8 10:28:57 np0005476733 systemd[1]: Started Authorization Manager.
Oct  8 10:28:57 np0005476733 systemd[1]: Started PackageKit Daemon.
Oct  8 10:28:58 np0005476733 python3[7742]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-cb41-ba21-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:28:59 np0005476733 kernel: evm: overlay not supported
Oct  8 10:28:59 np0005476733 systemd[1058]: Starting D-Bus User Message Bus...
Oct  8 10:28:59 np0005476733 dbus-broker-launch[8732]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  8 10:28:59 np0005476733 dbus-broker-launch[8732]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  8 10:28:59 np0005476733 systemd[1058]: Started D-Bus User Message Bus.
Oct  8 10:28:59 np0005476733 dbus-broker-lau[8732]: Ready
Oct  8 10:28:59 np0005476733 systemd[1058]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  8 10:28:59 np0005476733 systemd[1058]: Created slice Slice /user.
Oct  8 10:28:59 np0005476733 systemd[1058]: podman-8609.scope: unit configures an IP firewall, but not running as root.
Oct  8 10:28:59 np0005476733 systemd[1058]: (This warning is only shown for the first unit using IP firewalling.)
Oct  8 10:28:59 np0005476733 systemd[1058]: Started podman-8609.scope.
Oct  8 10:28:59 np0005476733 systemd[1058]: Started podman-pause-7ab22f83.scope.
Oct  8 10:29:00 np0005476733 python3[9319]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.103:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:29:00 np0005476733 systemd-logind[827]: Session 6 logged out. Waiting for processes to exit.
Oct  8 10:29:00 np0005476733 systemd[1]: session-6.scope: Deactivated successfully.
Oct  8 10:29:00 np0005476733 systemd[1]: session-6.scope: Consumed 1min 4.194s CPU time.
Oct  8 10:29:00 np0005476733 systemd-logind[827]: Removed session 6.
Oct  8 10:29:24 np0005476733 systemd-logind[827]: New session 7 of user zuul.
Oct  8 10:29:24 np0005476733 systemd[1]: Started Session 7 of User zuul.
Oct  8 10:29:24 np0005476733 python3[19791]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMYwD2vM7x2PcqGK7LlKJn8Sy9oE6tMK5OiC6DI/A7cm1lCGwYGEsHtMiW9kRUVLSj3MEKxCWvuHlvr0AchcchA= zuul@np0005476731.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:29:25 np0005476733 python3[19982]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMYwD2vM7x2PcqGK7LlKJn8Sy9oE6tMK5OiC6DI/A7cm1lCGwYGEsHtMiW9kRUVLSj3MEKxCWvuHlvr0AchcchA= zuul@np0005476731.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:29:25 np0005476733 python3[20345]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005476733.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  8 10:29:27 np0005476733 python3[20996]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMYwD2vM7x2PcqGK7LlKJn8Sy9oE6tMK5OiC6DI/A7cm1lCGwYGEsHtMiW9kRUVLSj3MEKxCWvuHlvr0AchcchA= zuul@np0005476731.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 10:29:27 np0005476733 python3[21247]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:29:28 np0005476733 python3[21561]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759933767.6707342-152-63552957438898/source _original_basename=tmpf1szdim1 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:29:29 np0005476733 python3[21885]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Oct  8 10:29:29 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 10:29:29 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 10:29:29 np0005476733 systemd-hostnamed[21980]: Changed pretty hostname to 'compute-1'
Oct  8 10:29:29 np0005476733 systemd-hostnamed[21980]: Hostname set to <compute-1> (static)
Oct  8 10:29:29 np0005476733 NetworkManager[3949]: <info>  [1759933769.6329] hostname: static hostname changed from "np0005476733.novalocal" to "compute-1"
Oct  8 10:29:29 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:29:29 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:29:29 np0005476733 systemd[1]: session-7.scope: Deactivated successfully.
Oct  8 10:29:29 np0005476733 systemd[1]: session-7.scope: Consumed 2.308s CPU time.
Oct  8 10:29:29 np0005476733 systemd-logind[827]: Session 7 logged out. Waiting for processes to exit.
Oct  8 10:29:29 np0005476733 systemd-logind[827]: Removed session 7.
Oct  8 10:29:39 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:29:40 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:29:40 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:29:40 np0005476733 systemd[1]: man-db-cache-update.service: Consumed 50.602s CPU time.
Oct  8 10:29:40 np0005476733 systemd[1]: run-r356271a3fe7a456a87d729353d5b6242.service: Deactivated successfully.
Oct  8 10:29:59 np0005476733 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 10:33:22 np0005476733 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  8 10:33:22 np0005476733 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  8 10:33:22 np0005476733 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  8 10:33:22 np0005476733 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  8 10:33:38 np0005476733 systemd-logind[827]: New session 8 of user zuul.
Oct  8 10:33:38 np0005476733 systemd[1]: Started Session 8 of User zuul.
Oct  8 10:33:38 np0005476733 python3[26650]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:33:40 np0005476733 python3[26766]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:41 np0005476733 python3[26839]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=delorean.repo follow=False checksum=f3f029ef513950de857eede9231def34e37a0d9c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:41 np0005476733 python3[26865]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:41 np0005476733 python3[26938]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:42 np0005476733 python3[26964]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:42 np0005476733 python3[27037]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:42 np0005476733 python3[27063]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:43 np0005476733 python3[27136]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:43 np0005476733 python3[27162]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:43 np0005476733 python3[27235]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:44 np0005476733 python3[27261]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:44 np0005476733 python3[27334]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:44 np0005476733 python3[27360]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:45 np0005476733 python3[27433]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=75ca8f9fe9a538824fd094f239c30e8ce8652e8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:45 np0005476733 python3[27459]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 10:33:45 np0005476733 python3[27532]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759934020.423206-30699-211561919943345/source mode=0755 _original_basename=gating.repo follow=False checksum=565a499ab50e2724464cc10a6175658a156f9fa3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:33:58 np0005476733 python3[27580]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:34:02 np0005476733 systemd[1]: packagekit.service: Deactivated successfully.
Oct  8 10:38:58 np0005476733 systemd[1]: session-8.scope: Deactivated successfully.
Oct  8 10:38:58 np0005476733 systemd[1]: session-8.scope: Consumed 5.969s CPU time.
Oct  8 10:38:58 np0005476733 systemd-logind[827]: Session 8 logged out. Waiting for processes to exit.
Oct  8 10:38:58 np0005476733 systemd-logind[827]: Removed session 8.
Oct  8 10:47:37 np0005476733 systemd-logind[827]: New session 9 of user zuul.
Oct  8 10:47:37 np0005476733 systemd[1]: Started Session 9 of User zuul.
Oct  8 10:47:38 np0005476733 python3.9[27745]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:47:41 np0005476733 python3.9[27926]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:47:48 np0005476733 systemd[1]: session-9.scope: Deactivated successfully.
Oct  8 10:47:48 np0005476733 systemd[1]: session-9.scope: Consumed 8.076s CPU time.
Oct  8 10:47:48 np0005476733 systemd-logind[827]: Session 9 logged out. Waiting for processes to exit.
Oct  8 10:47:48 np0005476733 systemd-logind[827]: Removed session 9.
Oct  8 10:47:53 np0005476733 systemd-logind[827]: New session 10 of user zuul.
Oct  8 10:47:53 np0005476733 systemd[1]: Started Session 10 of User zuul.
Oct  8 10:47:55 np0005476733 python3.9[28136]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:47:55 np0005476733 systemd[1]: session-10.scope: Deactivated successfully.
Oct  8 10:47:55 np0005476733 systemd-logind[827]: Session 10 logged out. Waiting for processes to exit.
Oct  8 10:47:55 np0005476733 systemd-logind[827]: Removed session 10.
Oct  8 10:48:10 np0005476733 systemd-logind[827]: New session 11 of user zuul.
Oct  8 10:48:10 np0005476733 systemd[1]: Started Session 11 of User zuul.
Oct  8 10:48:11 np0005476733 python3.9[28317]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  8 10:48:13 np0005476733 python3.9[28491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:48:14 np0005476733 python3.9[28643]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:48:15 np0005476733 python3.9[28796]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:48:16 np0005476733 python3.9[28948]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:48:17 np0005476733 python3.9[29100]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:48:17 np0005476733 python3.9[29223]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759934896.6534529-126-123597409635995/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:48:18 np0005476733 python3.9[29375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:48:19 np0005476733 python3.9[29531]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:48:20 np0005476733 python3.9[29681]: ansible-ansible.builtin.service_facts Invoked
Oct  8 10:48:26 np0005476733 python3.9[29936]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:48:27 np0005476733 python3.9[30086]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:48:29 np0005476733 python3.9[30240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:48:30 np0005476733 python3.9[30398]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:48:30 np0005476733 python3.9[30482]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:49:14 np0005476733 systemd[1]: Reloading.
Oct  8 10:49:14 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:49:14 np0005476733 systemd[1]: Starting dnf makecache...
Oct  8 10:49:14 np0005476733 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  8 10:49:15 np0005476733 dnf[30688]: Repository 'gating-repo' is missing name in configuration, using id.
Oct  8 10:49:15 np0005476733 dnf[30688]: Failed determining last makecache time.
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-barbican-42b4c41831408a8e323 139 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 170 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd[1]: Reloading.
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-cinder-1c00d6490d88e436f26ef 171 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-stevedore-c4acc5639fd2329372142 169 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-cloudkitty-tests-tempest-3961dc 154 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-diskimage-builder-43381184423c185801b5 170 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 156 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-designate-tests-tempest-347fdbc 165 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-glance-1fd12c29b339f30fe823e 160 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 151 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-manila-3c01b7181572c95dac462 153 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-vmware-nsxlib-458234972d1428ac9 178 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-octavia-ba397f07a7331190208c 187 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-watcher-c014f81a8647287f6dcc 194 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-edpm-image-builder-55ba53cf215b14ed95b 186 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 189 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-swift-dc98a8463506ac520c469a 152 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd[1]: Reloading.
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-python-tempestconf-8515371b7cceebd4282 146 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: delorean-openstack-heat-ui-013accbfd179753bc3f0 147 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 dnf[30688]: gating-repo                                     490 kB/s | 3.0 kB     00:00
Oct  8 10:49:15 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:49:15 np0005476733 dnf[30688]: CentOS Stream 9 - BaseOS                         57 kB/s | 6.7 kB     00:00
Oct  8 10:49:15 np0005476733 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  8 10:49:15 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 10:49:15 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 10:49:15 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 10:49:15 np0005476733 dnf[30688]: CentOS Stream 9 - AppStream                      25 kB/s | 6.8 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: CentOS Stream 9 - CRB                            55 kB/s | 6.6 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: CentOS Stream 9 - Extras packages                78 kB/s | 8.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: dlrn-antelope-testing                           159 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: dlrn-antelope-build-deps                        167 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: centos9-rabbitmq                                 50 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: centos9-storage                                  15 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: centos9-opstools                                135 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: NFV SIG OpenvSwitch                              67 kB/s | 3.0 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: repo-setup-centos-appstream                     193 kB/s | 4.4 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: repo-setup-centos-baseos                        164 kB/s | 3.9 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: repo-setup-centos-highavailability              149 kB/s | 3.9 kB     00:00
Oct  8 10:49:16 np0005476733 dnf[30688]: repo-setup-centos-powertools                    152 kB/s | 4.3 kB     00:00
Oct  8 10:49:17 np0005476733 dnf[30688]: Extra Packages for Enterprise Linux 9 - x86_64   85 kB/s |  31 kB     00:00
Oct  8 10:49:17 np0005476733 dnf[30688]: Metadata cache created.
Oct  8 10:49:17 np0005476733 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  8 10:49:17 np0005476733 systemd[1]: Finished dnf makecache.
Oct  8 10:49:17 np0005476733 systemd[1]: dnf-makecache.service: Consumed 1.633s CPU time.
Oct  8 10:50:23 np0005476733 kernel: SELinux:  Converting 2714 SID table entries...
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:50:23 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:50:24 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  8 10:50:24 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:50:24 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:50:24 np0005476733 systemd[1]: Reloading.
Oct  8 10:50:24 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:50:24 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:50:24 np0005476733 systemd[1]: Starting PackageKit Daemon...
Oct  8 10:50:24 np0005476733 systemd[1]: Started PackageKit Daemon.
Oct  8 10:50:25 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:50:25 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:50:25 np0005476733 systemd[1]: man-db-cache-update.service: Consumed 1.061s CPU time.
Oct  8 10:50:25 np0005476733 systemd[1]: run-r29e80afcafb44568a9deda04596487b1.service: Deactivated successfully.
Oct  8 10:50:43 np0005476733 python3.9[32035]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:50:45 np0005476733 python3.9[32316]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  8 10:50:46 np0005476733 python3.9[32468]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  8 10:50:48 np0005476733 python3.9[32621]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:50:49 np0005476733 python3.9[32773]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  8 10:50:50 np0005476733 python3.9[32925]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:50:51 np0005476733 python3.9[33077]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:50:51 np0005476733 python3.9[33200]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935050.8544528-434-32566032938197/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:50:53 np0005476733 python3.9[33352]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  8 10:50:54 np0005476733 python3.9[33505]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 10:50:54 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 10:50:55 np0005476733 python3.9[33664]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 10:50:56 np0005476733 python3.9[33824]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  8 10:50:56 np0005476733 python3.9[33977]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 10:50:57 np0005476733 python3.9[34135]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  8 10:50:58 np0005476733 python3.9[34287]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:51:09 np0005476733 python3.9[34442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:10 np0005476733 python3.9[34594]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:51:10 np0005476733 python3.9[34717]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935069.7078476-624-207915884660748/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:11 np0005476733 python3.9[34869]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:51:11 np0005476733 systemd[1]: Starting Load Kernel Modules...
Oct  8 10:51:12 np0005476733 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  8 10:51:12 np0005476733 kernel: Bridge firewalling registered
Oct  8 10:51:12 np0005476733 systemd-modules-load[34873]: Inserted module 'br_netfilter'
Oct  8 10:51:12 np0005476733 systemd[1]: Finished Load Kernel Modules.
Oct  8 10:51:12 np0005476733 python3.9[35028]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:51:13 np0005476733 python3.9[35151]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935072.3126929-670-258566246785249/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:14 np0005476733 python3.9[35303]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:51:18 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 10:51:18 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 10:51:19 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:51:19 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:51:19 np0005476733 systemd[1]: Reloading.
Oct  8 10:51:19 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:51:19 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:51:21 np0005476733 python3.9[37399]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:51:22 np0005476733 python3.9[38609]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  8 10:51:22 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:51:22 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:51:22 np0005476733 systemd[1]: man-db-cache-update.service: Consumed 4.762s CPU time.
Oct  8 10:51:22 np0005476733 systemd[1]: run-rae5a3aac760f4666b6b8d70756aa75a4.service: Deactivated successfully.
Oct  8 10:51:22 np0005476733 python3.9[39301]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:51:23 np0005476733 python3.9[39467]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:24 np0005476733 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  8 10:51:24 np0005476733 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  8 10:51:25 np0005476733 python3.9[39840]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:51:25 np0005476733 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  8 10:51:25 np0005476733 systemd[1]: tuned.service: Deactivated successfully.
Oct  8 10:51:25 np0005476733 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  8 10:51:25 np0005476733 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  8 10:51:25 np0005476733 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  8 10:51:26 np0005476733 python3.9[40002]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  8 10:51:29 np0005476733 python3.9[40154]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:51:29 np0005476733 systemd[1]: Reloading.
Oct  8 10:51:29 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:51:30 np0005476733 python3.9[40343]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:51:30 np0005476733 systemd[1]: Reloading.
Oct  8 10:51:30 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:51:31 np0005476733 python3.9[40532]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:32 np0005476733 python3.9[40685]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:32 np0005476733 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  8 10:51:33 np0005476733 python3.9[40838]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:35 np0005476733 python3.9[41000]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:36 np0005476733 python3.9[41153]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:51:36 np0005476733 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  8 10:51:36 np0005476733 systemd[1]: Stopped Apply Kernel Variables.
Oct  8 10:51:36 np0005476733 systemd[1]: Stopping Apply Kernel Variables...
Oct  8 10:51:36 np0005476733 systemd[1]: Starting Apply Kernel Variables...
Oct  8 10:51:36 np0005476733 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  8 10:51:36 np0005476733 systemd[1]: Finished Apply Kernel Variables.
Oct  8 10:51:36 np0005476733 systemd[1]: session-11.scope: Deactivated successfully.
Oct  8 10:51:36 np0005476733 systemd[1]: session-11.scope: Consumed 2min 11.377s CPU time.
Oct  8 10:51:36 np0005476733 systemd-logind[827]: Session 11 logged out. Waiting for processes to exit.
Oct  8 10:51:36 np0005476733 systemd-logind[827]: Removed session 11.
Oct  8 10:51:42 np0005476733 systemd-logind[827]: New session 12 of user zuul.
Oct  8 10:51:42 np0005476733 systemd[1]: Started Session 12 of User zuul.
Oct  8 10:51:43 np0005476733 python3.9[41336]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:51:44 np0005476733 python3.9[41490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:51:45 np0005476733 python3.9[41646]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:46 np0005476733 python3.9[41797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:51:48 np0005476733 python3.9[41953]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:51:49 np0005476733 python3.9[42037]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:51:51 np0005476733 python3.9[42190]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:51:52 np0005476733 python3.9[42361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:51:53 np0005476733 python3.9[42513]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:51:53 np0005476733 systemd[1]: var-lib-containers-storage-overlay-compat2652845994-merged.mount: Deactivated successfully.
Oct  8 10:51:53 np0005476733 podman[42514]: 2025-10-08 14:51:53.348953415 +0000 UTC m=+0.072641021 system refresh
Oct  8 10:51:54 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:51:54 np0005476733 python3.9[42676]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:51:55 np0005476733 python3.9[42799]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935114.3815017-199-13755043976371/.source.json follow=False _original_basename=podman_network_config.j2 checksum=ce8a9f05a56205af57dca0293f663433abb5a03a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:51:56 np0005476733 python3.9[42951]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:51:56 np0005476733 python3.9[43074]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935115.9818482-229-9226978491335/.source.conf follow=False _original_basename=registries.conf.j2 checksum=f95551851a3aad1fadf39ba40ad5808b10502fe1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:57 np0005476733 python3.9[43226]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:58 np0005476733 python3.9[43378]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:59 np0005476733 python3.9[43530]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:51:59 np0005476733 python3.9[43682]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:52:00 np0005476733 python3.9[43832]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:52:01 np0005476733 python3.9[43986]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:03 np0005476733 python3.9[44139]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:06 np0005476733 python3.9[44299]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:09 np0005476733 python3.9[44452]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:12 np0005476733 python3.9[44605]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:14 np0005476733 python3.9[44761]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:18 np0005476733 python3.9[44929]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:20 np0005476733 python3.9[45082]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:52:37 np0005476733 python3.9[45419]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:52:38 np0005476733 python3.9[45594]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:52:39 np0005476733 python3.9[45717]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759935157.7580311-507-218497320123776/.source.json _original_basename=.tn11pfvj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:52:40 np0005476733 python3.9[45869]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:52:40 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay-compat1689473256-lower\x2dmapped.mount: Deactivated successfully.
Oct  8 10:52:45 np0005476733 podman[45882]: 2025-10-08 14:52:45.691745292 +0000 UTC m=+5.378216370 image pull fa23f900391d6b2045198c4ce65355e00d82cd7d392391ca189cef278619240c 38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:52:45 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:45 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:45 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:50 np0005476733 python3.9[46184]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:52:50 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:52 np0005476733 podman[46193]: 2025-10-08 14:52:52.609026773 +0000 UTC m=+1.972603822 image pull 94361c82f6cc9b9bc202c316322eaa39cd13ed74f9cc27d918c1366be405a281 38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:52:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:54 np0005476733 python3.9[46450]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:52:54 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:59 np0005476733 podman[46463]: 2025-10-08 14:52:59.369392444 +0000 UTC m=+5.090650605 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:52:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:52:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:01 np0005476733 python3.9[46742]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:53:01 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:02 np0005476733 podman[46754]: 2025-10-08 14:53:02.248291595 +0000 UTC m=+0.502255714 image pull 4e93051232e4641dc0eac7573570c6d1a852f96d2f3e786f5899bf2d007a52e4 38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:53:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:04 np0005476733 python3.9[46988]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:53:04 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:17 np0005476733 podman[47000]: 2025-10-08 14:53:17.175530953 +0000 UTC m=+13.025846593 image pull b762169049433908bdcf83b1787c6c97bc547a073236bc2a8f405754b3d623cc 38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:53:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:23 np0005476733 python3.9[47257]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:53:23 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:25 np0005476733 podman[47270]: 2025-10-08 14:53:25.024021773 +0000 UTC m=+1.975067170 image pull 12aa1ebee6c7bc72738c39719fe13059590ab0501a869a9a0be74a5be9846d32 38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297
Oct  8 10:53:25 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:25 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:25 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:25 np0005476733 python3.9[47528]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 10:53:26 np0005476733 podman[47539]: 2025-10-08 14:53:26.978787117 +0000 UTC m=+1.088598085 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  8 10:53:26 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:28 np0005476733 systemd[1]: session-12.scope: Deactivated successfully.
Oct  8 10:53:28 np0005476733 systemd[1]: session-12.scope: Consumed 1min 39.869s CPU time.
Oct  8 10:53:28 np0005476733 systemd-logind[827]: Session 12 logged out. Waiting for processes to exit.
Oct  8 10:53:28 np0005476733 systemd-logind[827]: Removed session 12.
Oct  8 10:53:28 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:53:33 np0005476733 systemd-logind[827]: New session 13 of user zuul.
Oct  8 10:53:33 np0005476733 systemd[1]: Started Session 13 of User zuul.
Oct  8 10:53:34 np0005476733 python3.9[47839]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:53:36 np0005476733 python3.9[47995]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  8 10:53:37 np0005476733 python3.9[48148]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 10:53:38 np0005476733 python3.9[48306]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 10:53:41 np0005476733 python3.9[48466]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:53:46 np0005476733 python3.9[48550]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:53:48 np0005476733 python3.9[48717]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:53:59 np0005476733 kernel: SELinux:  Converting 2726 SID table entries...
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:53:59 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:53:59 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  8 10:53:59 np0005476733 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  8 10:54:01 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:54:01 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:54:01 np0005476733 systemd[1]: Reloading.
Oct  8 10:54:01 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:54:01 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:54:01 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:54:01 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:54:01 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:54:01 np0005476733 systemd[1]: run-rdd72a086f70c4774a65515f237cdf65a.service: Deactivated successfully.
Oct  8 10:54:07 np0005476733 python3.9[49819]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 10:54:07 np0005476733 systemd[1]: Reloading.
Oct  8 10:54:07 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:54:07 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:54:07 np0005476733 systemd[1]: Starting Open vSwitch Database Unit...
Oct  8 10:54:07 np0005476733 chown[49860]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  8 10:54:07 np0005476733 ovs-ctl[49865]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  8 10:54:07 np0005476733 ovs-ctl[49865]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  8 10:54:07 np0005476733 ovs-ctl[49865]: Starting ovsdb-server [  OK  ]
Oct  8 10:54:07 np0005476733 ovs-vsctl[49914]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  8 10:54:07 np0005476733 ovs-vsctl[49934]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ec52a299-e5bc-4227-a88e-e241833eebb2\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  8 10:54:07 np0005476733 ovs-ctl[49865]: Configuring Open vSwitch system IDs [  OK  ]
Oct  8 10:54:07 np0005476733 ovs-vsctl[49940]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  8 10:54:07 np0005476733 ovs-ctl[49865]: Enabling remote OVSDB managers [  OK  ]
Oct  8 10:54:07 np0005476733 systemd[1]: Started Open vSwitch Database Unit.
Oct  8 10:54:07 np0005476733 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  8 10:54:07 np0005476733 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  8 10:54:07 np0005476733 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  8 10:54:07 np0005476733 kernel: openvswitch: Open vSwitch switching datapath
Oct  8 10:54:07 np0005476733 ovs-ctl[49984]: Inserting openvswitch module [  OK  ]
Oct  8 10:54:08 np0005476733 ovs-ctl[49953]: Starting ovs-vswitchd [  OK  ]
Oct  8 10:54:08 np0005476733 ovs-vsctl[50002]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  8 10:54:08 np0005476733 ovs-ctl[49953]: Enabling remote OVSDB managers [  OK  ]
Oct  8 10:54:08 np0005476733 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  8 10:54:08 np0005476733 systemd[1]: Starting Open vSwitch...
Oct  8 10:54:08 np0005476733 systemd[1]: Finished Open vSwitch.
Oct  8 10:54:09 np0005476733 python3.9[50154]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:54:10 np0005476733 python3.9[50306]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  8 10:54:12 np0005476733 kernel: SELinux:  Converting 2740 SID table entries...
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 10:54:12 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 10:54:13 np0005476733 python3.9[50461]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:54:14 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  8 10:54:14 np0005476733 python3.9[50619]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:54:16 np0005476733 python3.9[50772]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:54:18 np0005476733 python3.9[51059]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 10:54:19 np0005476733 python3.9[51209]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:54:19 np0005476733 python3.9[51363]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:54:21 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:54:21 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:54:21 np0005476733 systemd[1]: Reloading.
Oct  8 10:54:21 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:54:21 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:54:22 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:54:22 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:54:22 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:54:22 np0005476733 systemd[1]: run-r60298da4ea4a43f89272a7c5f0ffe583.service: Deactivated successfully.
Oct  8 10:54:23 np0005476733 python3.9[51681]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:54:23 np0005476733 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  8 10:54:23 np0005476733 systemd[1]: Stopped Network Manager Wait Online.
Oct  8 10:54:23 np0005476733 systemd[1]: Stopping Network Manager Wait Online...
Oct  8 10:54:23 np0005476733 systemd[1]: Stopping Network Manager...
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5046] caught SIGTERM, shutting down normally.
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5059] dhcp4 (eth0): canceled DHCP transaction
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5059] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5059] dhcp4 (eth0): state changed no lease
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5061] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 10:54:23 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:54:23 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:54:23 np0005476733 NetworkManager[3949]: <info>  [1759935263.5393] exiting (success)
Oct  8 10:54:23 np0005476733 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  8 10:54:23 np0005476733 systemd[1]: Stopped Network Manager.
Oct  8 10:54:23 np0005476733 systemd[1]: NetworkManager.service: Consumed 11.140s CPU time, 4.1M memory peak, read 0B from disk, written 37.5K to disk.
Oct  8 10:54:23 np0005476733 systemd[1]: Starting Network Manager...
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6182] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:333cb165-ebfa-4717-89b0-51e1063c215b)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6182] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6230] manager[0x5565a2fb3090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 10:54:23 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 10:54:23 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6991] hostname: hostname: using hostnamed
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6991] hostname: static hostname changed from (none) to "compute-1"
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.6996] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7001] manager[0x5565a2fb3090]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7001] manager[0x5565a2fb3090]: rfkill: WWAN hardware radio set enabled
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7018] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7026] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7026] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7026] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7027] manager: Networking is enabled by state file
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7028] settings: Loaded settings plugin: keyfile (internal)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7033] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7053] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7061] dhcp: init: Using DHCP client 'internal'
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7063] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7068] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7073] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7079] device (lo): Activation: starting connection 'lo' (f95091c6-00bf-46c5-85a3-3d1e635fc62f)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7085] device (eth0): carrier: link connected
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7088] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7094] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7094] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7100] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7106] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7111] device (eth1): carrier: link connected
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7114] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7118] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (741aa56c-7d41-58ab-9957-86492a3b919f) (indicated)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7118] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7123] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7130] device (eth1): Activation: starting connection 'ci-private-network' (741aa56c-7d41-58ab-9957-86492a3b919f)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7135] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 10:54:23 np0005476733 systemd[1]: Started Network Manager.
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7143] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7144] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7146] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7148] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7150] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7153] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7155] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7157] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7163] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7166] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7172] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7182] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7199] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7205] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 10:54:23 np0005476733 systemd[1]: Starting Network Manager Wait Online...
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7779] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7789] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7791] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7793] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7801] device (lo): Activation: successful, device activated.
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7808] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7813] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7816] device (eth1): Activation: successful, device activated.
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7880] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7883] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7887] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7891] device (eth0): Activation: successful, device activated.
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7897] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 10:54:23 np0005476733 NetworkManager[51699]: <info>  [1759935263.7901] manager: startup complete
Oct  8 10:54:23 np0005476733 systemd[1]: Finished Network Manager Wait Online.
Oct  8 10:54:24 np0005476733 python3.9[51907]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:54:30 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 10:54:30 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 10:54:30 np0005476733 systemd[1]: Reloading.
Oct  8 10:54:30 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:54:30 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:54:30 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 10:54:33 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 10:54:33 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 10:54:33 np0005476733 systemd[1]: run-r5c7c2064bf2b480f8db48520c1f32c77.service: Deactivated successfully.
Oct  8 10:54:33 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:54:34 np0005476733 python3.9[52369]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:54:34 np0005476733 python3.9[52521]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:35 np0005476733 python3.9[52675]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:36 np0005476733 python3.9[52827]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:37 np0005476733 python3.9[52979]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:37 np0005476733 python3.9[53131]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:38 np0005476733 python3.9[53283]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:54:39 np0005476733 python3.9[53406]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935278.0304396-439-26284299981990/.source _original_basename=.yoo2x298 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:39 np0005476733 python3.9[53558]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:40 np0005476733 python3.9[53710]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  8 10:54:41 np0005476733 python3.9[53862]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:43 np0005476733 python3.9[54289]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  8 10:54:44 np0005476733 ansible-async_wrapper.py[54464]: Invoked with j316596879037 300 /home/zuul/.ansible/tmp/ansible-tmp-1759935284.033823-571-101198834196279/AnsiballZ_edpm_os_net_config.py _
Oct  8 10:54:44 np0005476733 ansible-async_wrapper.py[54467]: Starting module and watcher
Oct  8 10:54:44 np0005476733 ansible-async_wrapper.py[54467]: Start watching 54468 (300)
Oct  8 10:54:44 np0005476733 ansible-async_wrapper.py[54468]: Start module (54468)
Oct  8 10:54:44 np0005476733 ansible-async_wrapper.py[54464]: Return async_wrapper task started.
Oct  8 10:54:45 np0005476733 python3.9[54469]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  8 10:54:45 np0005476733 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  8 10:54:45 np0005476733 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  8 10:54:45 np0005476733 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  8 10:54:45 np0005476733 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  8 10:54:45 np0005476733 kernel: cfg80211: failed to load regulatory.db
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.8729] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.8745] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9192] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9193] audit: op="connection-add" uuid="3deebab2-0473-4027-bb29-e769b53b585f" name="br-ex-br" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9206] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9207] audit: op="connection-add" uuid="56bacf1f-9957-4271-97db-2ac4e1e65df7" name="br-ex-port" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9217] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9218] audit: op="connection-add" uuid="e1fb1495-908a-4eff-afed-cd62a5f76f6e" name="eth1-port" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9229] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9230] audit: op="connection-add" uuid="b5de9720-a183-4be0-af38-c105bb184b8b" name="vlan20-port" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9239] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9240] audit: op="connection-add" uuid="aa8d78ae-000d-41e4-b8da-f270105b55d4" name="vlan21-port" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9251] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9252] audit: op="connection-add" uuid="184a31de-6085-4029-857d-3b5608cbd745" name="vlan22-port" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9269] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9282] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9283] audit: op="connection-add" uuid="fd388079-956f-4657-8fc9-e9ead2734c7e" name="br-ex-if" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9327] audit: op="connection-update" uuid="741aa56c-7d41-58ab-9957-86492a3b919f" name="ci-private-network" args="ovs-interface.type,ipv4.method,ipv4.routing-rules,ipv4.addresses,ipv4.dns,ipv4.never-default,ipv4.routes,connection.master,connection.timestamp,connection.controller,connection.port-type,connection.slave-type,ipv6.method,ipv6.routing-rules,ipv6.addresses,ipv6.dns,ipv6.addr-gen-mode,ipv6.routes,ovs-external-ids.data" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9342] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9344] audit: op="connection-add" uuid="2a1e6c1c-565e-4c0b-9b7f-8e5aae03ec96" name="vlan20-if" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9356] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9358] audit: op="connection-add" uuid="d5a6524b-461c-4d2b-8028-e0b8952b5584" name="vlan21-if" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9372] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9374] audit: op="connection-add" uuid="59cb1232-0ba6-4abf-b2c3-b33aa7df8e14" name="vlan22-if" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9383] audit: op="connection-delete" uuid="5ec6a8e8-7a4f-32f6-a4d8-d91859011c0c" name="Wired connection 1" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9392] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9399] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9402] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (3deebab2-0473-4027-bb29-e769b53b585f)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9402] audit: op="connection-activate" uuid="3deebab2-0473-4027-bb29-e769b53b585f" name="br-ex-br" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9403] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9409] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9412] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (56bacf1f-9957-4271-97db-2ac4e1e65df7)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9413] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9418] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9421] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (e1fb1495-908a-4eff-afed-cd62a5f76f6e)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9423] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9428] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9431] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b5de9720-a183-4be0-af38-c105bb184b8b)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9433] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9438] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9441] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (aa8d78ae-000d-41e4-b8da-f270105b55d4)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9443] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9449] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9452] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (184a31de-6085-4029-857d-3b5608cbd745)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9453] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9455] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9456] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9461] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9465] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9468] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (fd388079-956f-4657-8fc9-e9ead2734c7e)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9468] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9471] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9473] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9474] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9474] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9482] device (eth1): disconnecting for new activation request.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9482] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9485] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9486] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9486] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9488] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9492] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9495] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (2a1e6c1c-565e-4c0b-9b7f-8e5aae03ec96)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9496] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9498] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9499] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9500] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9502] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9506] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9511] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (d5a6524b-461c-4d2b-8028-e0b8952b5584)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9512] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9514] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9515] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9516] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9518] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9522] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9526] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (59cb1232-0ba6-4abf-b2c3-b33aa7df8e14)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9527] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9530] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9532] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9533] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9534] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9549] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9551] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9553] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9555] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9567] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9570] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9573] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9576] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9577] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 kernel: ovs-system: entered promiscuous mode
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9588] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9592] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9596] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 systemd-udevd[54475]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 10:54:46 np0005476733 kernel: Timeout policy base is empty
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9597] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9605] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9616] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9622] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9624] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9629] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9634] dhcp4 (eth0): canceled DHCP transaction
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9634] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9634] dhcp4 (eth0): state changed no lease
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9635] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9644] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9651] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54470 uid=0 result="fail" reason="Device is not activated"
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9656] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9662] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9670] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9679] device (eth1): disconnecting for new activation request.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9680] audit: op="connection-activate" uuid="741aa56c-7d41-58ab-9957-86492a3b919f" name="ci-private-network" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9721] dhcp4 (eth0): state changed new lease, address=38.102.83.224
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9760] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54470 uid=0 result="success"
Oct  8 10:54:46 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9800] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 10:54:46 np0005476733 kernel: br-ex: entered promiscuous mode
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9902] device (eth1): Activation: starting connection 'ci-private-network' (741aa56c-7d41-58ab-9957-86492a3b919f)
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9916] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9921] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9935] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9937] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9939] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9941] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9943] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9946] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9957] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 kernel: vlan22: entered promiscuous mode
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9963] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9966] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  8 10:54:46 np0005476733 systemd-udevd[54476]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9970] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9973] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9975] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9979] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 10:54:46 np0005476733 NetworkManager[51699]: <info>  [1759935286.9993] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0007] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0015] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0022] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0027] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0031] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 kernel: vlan21: entered promiscuous mode
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0044] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0054] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0063] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0077] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0087] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0100] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 kernel: vlan20: entered promiscuous mode
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0106] device (eth1): Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0112] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0142] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0155] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  8 10:54:47 np0005476733 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0166] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0175] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0182] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0196] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0200] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0209] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0219] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0237] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0238] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0244] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0252] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0273] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0307] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0309] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 10:54:47 np0005476733 NetworkManager[51699]: <info>  [1759935287.0315] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.1457] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.2682] checkpoint[0x5565a2f89950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.2686] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.5184] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.5195] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.6878] audit: op="networking-control" arg="global-dns-configuration" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.6926] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.6955] audit: op="networking-control" arg="global-dns-configuration" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.7346] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.8378] checkpoint[0x5565a2f89a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  8 10:54:48 np0005476733 NetworkManager[51699]: <info>  [1759935288.8382] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54470 uid=0 result="success"
Oct  8 10:54:48 np0005476733 python3.9[54804]: ansible-ansible.legacy.async_status Invoked with jid=j316596879037.54464 mode=status _async_dir=/root/.ansible_async
Oct  8 10:54:48 np0005476733 ansible-async_wrapper.py[54468]: Module complete (54468)
Oct  8 10:54:49 np0005476733 ansible-async_wrapper.py[54467]: Done in kid B.
Oct  8 10:54:52 np0005476733 python3.9[54908]: ansible-ansible.legacy.async_status Invoked with jid=j316596879037.54464 mode=status _async_dir=/root/.ansible_async
Oct  8 10:54:52 np0005476733 python3.9[55008]: ansible-ansible.legacy.async_status Invoked with jid=j316596879037.54464 mode=cleanup _async_dir=/root/.ansible_async
Oct  8 10:54:53 np0005476733 python3.9[55160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:54:53 np0005476733 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 10:54:54 np0005476733 python3.9[55286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935293.1263368-625-14351882239823/.source.returncode _original_basename=.9jf1d3kn follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:54 np0005476733 python3.9[55438]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:54:55 np0005476733 python3.9[55561]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935294.400781-657-181149399200039/.source.cfg _original_basename=.6u4dfdlv follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:54:56 np0005476733 python3.9[55714]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:54:56 np0005476733 systemd[1]: Reloading Network Manager...
Oct  8 10:54:56 np0005476733 NetworkManager[51699]: <info>  [1759935296.2069] audit: op="reload" arg="0" pid=55718 uid=0 result="success"
Oct  8 10:54:56 np0005476733 NetworkManager[51699]: <info>  [1759935296.2075] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  8 10:54:56 np0005476733 systemd[1]: Reloaded Network Manager.
Oct  8 10:54:56 np0005476733 systemd[1]: session-13.scope: Deactivated successfully.
Oct  8 10:54:56 np0005476733 systemd[1]: session-13.scope: Consumed 47.997s CPU time.
Oct  8 10:54:56 np0005476733 systemd-logind[827]: Session 13 logged out. Waiting for processes to exit.
Oct  8 10:54:56 np0005476733 systemd-logind[827]: Removed session 13.
Oct  8 10:55:02 np0005476733 systemd-logind[827]: New session 14 of user zuul.
Oct  8 10:55:02 np0005476733 systemd[1]: Started Session 14 of User zuul.
Oct  8 10:55:03 np0005476733 python3.9[55902]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:55:04 np0005476733 python3.9[56056]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:55:05 np0005476733 python3.9[56246]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:55:06 np0005476733 systemd[1]: session-14.scope: Deactivated successfully.
Oct  8 10:55:06 np0005476733 systemd[1]: session-14.scope: Consumed 2.177s CPU time.
Oct  8 10:55:06 np0005476733 systemd-logind[827]: Session 14 logged out. Waiting for processes to exit.
Oct  8 10:55:06 np0005476733 systemd-logind[827]: Removed session 14.
Oct  8 10:55:06 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 10:55:11 np0005476733 systemd-logind[827]: New session 15 of user zuul.
Oct  8 10:55:11 np0005476733 systemd[1]: Started Session 15 of User zuul.
Oct  8 10:55:12 np0005476733 python3.9[56428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:55:13 np0005476733 python3.9[56582]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:55:14 np0005476733 python3.9[56738]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:55:15 np0005476733 python3.9[56823]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:55:17 np0005476733 python3.9[56976]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:55:18 np0005476733 python3.9[57168]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:19 np0005476733 python3.9[57320]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:55:19 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 10:55:20 np0005476733 python3.9[57482]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:55:20 np0005476733 python3.9[57560]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:21 np0005476733 python3.9[57712]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:55:21 np0005476733 python3.9[57790]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:55:22 np0005476733 python3.9[57942]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:55:23 np0005476733 python3.9[58094]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:55:24 np0005476733 python3.9[58246]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:55:25 np0005476733 python3.9[58398]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:55:26 np0005476733 python3.9[58550]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:55:28 np0005476733 python3.9[58703]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:55:28 np0005476733 python3.9[58857]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:55:29 np0005476733 python3.9[59009]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:55:30 np0005476733 python3.9[59161]: ansible-service_facts Invoked
Oct  8 10:55:30 np0005476733 network[59178]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 10:55:30 np0005476733 network[59179]: 'network-scripts' will be removed from distribution in near future.
Oct  8 10:55:30 np0005476733 network[59180]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 10:55:36 np0005476733 python3.9[59635]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:55:38 np0005476733 python3.9[59788]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  8 10:55:40 np0005476733 python3.9[59940]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:55:40 np0005476733 python3.9[60065]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935339.6116815-422-73031097371661/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:41 np0005476733 python3.9[60219]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:55:42 np0005476733 python3.9[60344]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935341.1589656-452-199056061408172/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:43 np0005476733 python3.9[60499]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:45 np0005476733 python3.9[60653]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:55:46 np0005476733 python3.9[60737]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:55:47 np0005476733 python3.9[60891]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:55:48 np0005476733 python3.9[60975]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:55:48 np0005476733 chronyd[835]: chronyd exiting
Oct  8 10:55:48 np0005476733 systemd[1]: Stopping NTP client/server...
Oct  8 10:55:48 np0005476733 systemd[1]: chronyd.service: Deactivated successfully.
Oct  8 10:55:48 np0005476733 systemd[1]: Stopped NTP client/server.
Oct  8 10:55:48 np0005476733 systemd[1]: Starting NTP client/server...
Oct  8 10:55:48 np0005476733 chronyd[60983]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  8 10:55:48 np0005476733 chronyd[60983]: Frequency -32.064 +/- 0.292 ppm read from /var/lib/chrony/drift
Oct  8 10:55:48 np0005476733 chronyd[60983]: Loaded seccomp filter (level 2)
Oct  8 10:55:48 np0005476733 systemd[1]: Started NTP client/server.
Oct  8 10:55:49 np0005476733 systemd[1]: session-15.scope: Deactivated successfully.
Oct  8 10:55:49 np0005476733 systemd[1]: session-15.scope: Consumed 24.294s CPU time.
Oct  8 10:55:49 np0005476733 systemd-logind[827]: Session 15 logged out. Waiting for processes to exit.
Oct  8 10:55:49 np0005476733 systemd-logind[827]: Removed session 15.
Oct  8 10:55:55 np0005476733 systemd-logind[827]: New session 16 of user zuul.
Oct  8 10:55:55 np0005476733 systemd[1]: Started Session 16 of User zuul.
Oct  8 10:55:56 np0005476733 python3.9[61162]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:55:57 np0005476733 python3.9[61318]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:58 np0005476733 python3.9[61493]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:55:58 np0005476733 python3.9[61571]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.lpersbv3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:55:59 np0005476733 python3.9[61723]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:00 np0005476733 python3.9[61846]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935359.2335548-103-139676296263248/.source _original_basename=.4z0cgsyi follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:01 np0005476733 python3.9[61998]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:56:01 np0005476733 python3.9[62150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:02 np0005476733 python3.9[62273]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935361.3313324-151-22471339087423/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:56:03 np0005476733 python3.9[62425]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:03 np0005476733 python3.9[62548]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935362.6048234-151-19465605442412/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:56:04 np0005476733 python3.9[62700]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:05 np0005476733 python3.9[62852]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:05 np0005476733 python3.9[62975]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935364.7165942-225-191227041623012/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:06 np0005476733 python3.9[63127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:06 np0005476733 python3.9[63250]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935365.939109-255-152885446525284/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:08 np0005476733 python3.9[63402]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:56:08 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:08 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:08 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:08 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:08 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:08 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:08 np0005476733 systemd[1]: Starting EDPM Container Shutdown...
Oct  8 10:56:08 np0005476733 systemd[1]: Finished EDPM Container Shutdown.
Oct  8 10:56:09 np0005476733 python3.9[63632]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:09 np0005476733 python3.9[63755]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935368.883507-301-17155975947764/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:10 np0005476733 python3.9[63907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:11 np0005476733 python3.9[64030]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935370.1891658-331-191375212420377/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:11 np0005476733 python3.9[64182]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:56:11 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:12 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:12 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:12 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:12 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:12 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:12 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 10:56:12 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 10:56:12 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 10:56:12 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 10:56:13 np0005476733 python3.9[64408]: ansible-ansible.builtin.service_facts Invoked
Oct  8 10:56:13 np0005476733 network[64425]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 10:56:13 np0005476733 network[64426]: 'network-scripts' will be removed from distribution in near future.
Oct  8 10:56:13 np0005476733 network[64427]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 10:56:17 np0005476733 python3.9[64691]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:56:18 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:18 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:18 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:18 np0005476733 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  8 10:56:19 np0005476733 iptables.init[64732]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  8 10:56:19 np0005476733 iptables.init[64732]: iptables: Flushing firewall rules: [  OK  ]
Oct  8 10:56:19 np0005476733 systemd[1]: iptables.service: Deactivated successfully.
Oct  8 10:56:19 np0005476733 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  8 10:56:20 np0005476733 python3.9[64928]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:56:21 np0005476733 python3.9[65082]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:56:21 np0005476733 systemd[1]: Reloading.
Oct  8 10:56:21 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:56:21 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:56:21 np0005476733 systemd[1]: Starting Netfilter Tables...
Oct  8 10:56:21 np0005476733 systemd[1]: Finished Netfilter Tables.
Oct  8 10:56:22 np0005476733 python3.9[65274]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:56:23 np0005476733 python3.9[65427]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:56:23 np0005476733 python3.9[65552]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935382.8624897-469-111793298617358/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:56:24 np0005476733 python3.9[65703]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:56:50 np0005476733 systemd[1]: session-16.scope: Deactivated successfully.
Oct  8 10:56:50 np0005476733 systemd[1]: session-16.scope: Consumed 17.984s CPU time.
Oct  8 10:56:50 np0005476733 systemd-logind[827]: Session 16 logged out. Waiting for processes to exit.
Oct  8 10:56:50 np0005476733 systemd-logind[827]: Removed session 16.
Oct  8 10:57:02 np0005476733 systemd-logind[827]: New session 17 of user zuul.
Oct  8 10:57:02 np0005476733 systemd[1]: Started Session 17 of User zuul.
Oct  8 10:57:04 np0005476733 python3.9[65897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:57:05 np0005476733 python3.9[66053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:05 np0005476733 python3.9[66228]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:06 np0005476733 python3.9[66306]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.gl44hhfv recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:07 np0005476733 python3.9[66458]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:07 np0005476733 python3.9[66536]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.6b70t7tt recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:08 np0005476733 python3.9[66688]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:57:09 np0005476733 python3.9[66840]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:09 np0005476733 python3.9[66918]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:57:10 np0005476733 python3.9[67070]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:10 np0005476733 python3.9[67148]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:57:11 np0005476733 python3.9[67300]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:12 np0005476733 python3.9[67452]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:12 np0005476733 python3.9[67530]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:13 np0005476733 python3.9[67682]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:13 np0005476733 python3.9[67760]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:14 np0005476733 python3.9[67912]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:57:14 np0005476733 systemd[1]: Reloading.
Oct  8 10:57:15 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:57:15 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:57:16 np0005476733 python3.9[68101]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:16 np0005476733 python3.9[68179]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:17 np0005476733 python3.9[68331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:17 np0005476733 python3.9[68409]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:18 np0005476733 python3.9[68561]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 10:57:18 np0005476733 systemd[1]: Reloading.
Oct  8 10:57:18 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 10:57:18 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 10:57:18 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 10:57:18 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 10:57:18 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 10:57:18 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 10:57:19 np0005476733 python3.9[68752]: ansible-ansible.builtin.service_facts Invoked
Oct  8 10:57:19 np0005476733 network[68769]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 10:57:19 np0005476733 network[68770]: 'network-scripts' will be removed from distribution in near future.
Oct  8 10:57:19 np0005476733 network[68771]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 10:57:26 np0005476733 python3.9[69034]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:26 np0005476733 python3.9[69112]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:27 np0005476733 python3.9[69264]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:28 np0005476733 python3.9[69416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:28 np0005476733 python3.9[69539]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935447.8030539-413-196440580398749/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:29 np0005476733 python3.9[69691]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  8 10:57:29 np0005476733 systemd[1]: Starting Time & Date Service...
Oct  8 10:57:30 np0005476733 systemd[1]: Started Time & Date Service.
Oct  8 10:57:30 np0005476733 python3.9[69847]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:31 np0005476733 python3.9[69999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:32 np0005476733 python3.9[70122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935451.0635347-483-209965702819297/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:32 np0005476733 python3.9[70274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:33 np0005476733 python3.9[70397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935452.3235037-513-250016648836259/.source.yaml _original_basename=.nxvbj9iu follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:34 np0005476733 python3.9[70549]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:34 np0005476733 python3.9[70672]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935453.6361783-543-167837304073443/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:35 np0005476733 python3.9[70824]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:57:36 np0005476733 python3.9[70977]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:57:37 np0005476733 python3[71130]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 10:57:38 np0005476733 python3.9[71282]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:38 np0005476733 python3.9[71405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935457.5835376-621-39492789009570/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:39 np0005476733 python3.9[71557]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:39 np0005476733 python3.9[71680]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935458.8211699-651-130561642685131/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:40 np0005476733 python3.9[71832]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:41 np0005476733 python3.9[71955]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935460.2170842-682-207415416326284/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:41 np0005476733 python3.9[72107]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:42 np0005476733 python3.9[72230]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935461.4661207-711-57947323937591/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:43 np0005476733 python3.9[72382]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:57:44 np0005476733 python3.9[72505]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935462.889704-741-194593602510674/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:44 np0005476733 python3.9[72657]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:45 np0005476733 python3.9[72809]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:57:46 np0005476733 python3.9[72968]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:47 np0005476733 python3.9[73121]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:47 np0005476733 python3.9[73273]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:48 np0005476733 python3.9[73425]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  8 10:57:48 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 10:57:48 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 10:57:49 np0005476733 python3.9[73579]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  8 10:57:49 np0005476733 systemd[1]: session-17.scope: Deactivated successfully.
Oct  8 10:57:49 np0005476733 systemd[1]: session-17.scope: Consumed 29.628s CPU time.
Oct  8 10:57:49 np0005476733 systemd-logind[827]: Session 17 logged out. Waiting for processes to exit.
Oct  8 10:57:49 np0005476733 systemd-logind[827]: Removed session 17.
Oct  8 10:57:55 np0005476733 systemd-logind[827]: New session 18 of user zuul.
Oct  8 10:57:55 np0005476733 systemd[1]: Started Session 18 of User zuul.
Oct  8 10:57:56 np0005476733 python3.9[73760]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  8 10:57:56 np0005476733 python3.9[73912]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:57:58 np0005476733 python3.9[74064]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:57:58 np0005476733 chronyd[60983]: Selected source 23.133.168.245 (pool.ntp.org)
Oct  8 10:57:58 np0005476733 python3.9[74216]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLQI/FtfVvMiXzK0Ldkj3KnPVRN51vUTednlUR+b4p8YeZdYdQazebQ6yoJMt+KtgcwhhF6UL6TC6ePVZkjrIpjWLQhMGHbT6KxUDgBo3nI4HASqe8sH7x7n/irFESe9923ZiUQOYoJ/PqnfiKeKDaDacEk5EsksjLcTEeGNH7qMPbbHOtmfZzdaAQ6AZVOD9MOh+fsBUoJ1RmzP2RSUF/dS2ErOs1flXSsz9MxD0wgUuLrNwPa2SPnehvDnVLb4vQWDzMxo+SM/M5sgJQsB9oibM/9rjwdkLqbO1g9KrTDqSkgKbkc2nDpVGatiOHwTlsSrQf/fEqMRznEr/UN11claK5nWhQLiiMuuUxXuiWB7mf4qKaJYscLKPdmPXun0T70xekZXVZBVpjJhunqJneI1BLHX5Np7C4nrYngDoGRu9f2BqepsTkatN93ueJz/xWlnNWixDWcD3H0lbAQPRqDyUtCXGb7OvLg9ZQmc8O4ESf6xeQ+Xomowepn8IgBBM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBua7WF6Ean0kL6nKA2S7/vHZKdujUetM3Cb7PnQ5vxZ#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEgJHRew7ACGh8d9GfWZCPepZDjta8i86w/IE8HimXwydhDD2CThcxvH8CIZ53lgfKriVcTIjfafBJXTC2GG6rc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCV2Qk5dmTjM/NanLd+k4oigtJdXfo6rV3Xd24aTXWJSXzU4ArtF343Bp8+xT1a83tTgqDO+UiC7taXE1WK8JK0cbQaJlq+MqHjDFhWxG/lejliE1xqKRN6yJGyebD1k/AUvK6My3r2WygE8sMXa3AiYRNDHnvbRlxu9VlVDRtDqjeSas4ywNGwCF+A2kgSrrnS6G+Uml6ZRqv5tIIG1CvAyRIXbq6gvIn61R4x0xhaLIcv3Y35LSIi2ySlMnfD/aaONNDeEBVbE3TF3/zd6+ICLPCzNUAbWVvyRWCZIN8POUTinBKlHhCG8FYoH7rluvJRI8RY4XSmCOmrn+jT+vbUpl83FuFg7Y6/JiystHJGBhALOaINybh1rp8223vKmx8zrlyqiS4lCB16N7+onF6lNiiG3Q6qwo4fJMY9hrTK/nIA3BsUf/ZfSz9lLekgAgGTXB6cIoCRa5KlI6W3WW6/k+ccyUA30QISBF/LL6Nwp5wvFCpEJxyjcO1WKz7trxM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEzREQJNpFOVGK5QmHO9Gqw9GuQ2zAJXFPatmqsARQ+y#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIZ5LGsNrPkgz3GtyTiV03zLNhZqhk/V3efcHY96ivHaG1QaZClKgZd6n7tnfcwLkMZoWDBR+YEZaKCtikK0/MM=#012 create=True mode=0644 path=/tmp/ansible.uhziwbzm state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:57:59 np0005476733 python3.9[74368]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.uhziwbzm' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:58:00 np0005476733 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 10:58:00 np0005476733 python3.9[74525]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.uhziwbzm state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:01 np0005476733 systemd[1]: session-18.scope: Deactivated successfully.
Oct  8 10:58:01 np0005476733 systemd[1]: session-18.scope: Consumed 3.293s CPU time.
Oct  8 10:58:01 np0005476733 systemd-logind[827]: Session 18 logged out. Waiting for processes to exit.
Oct  8 10:58:01 np0005476733 systemd-logind[827]: Removed session 18.
Oct  8 10:58:06 np0005476733 systemd-logind[827]: New session 19 of user zuul.
Oct  8 10:58:06 np0005476733 systemd[1]: Started Session 19 of User zuul.
Oct  8 10:58:07 np0005476733 python3.9[74703]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:58:09 np0005476733 python3.9[74859]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  8 10:58:10 np0005476733 python3.9[75013]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 10:58:11 np0005476733 python3.9[75166]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:58:12 np0005476733 python3.9[75319]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:58:13 np0005476733 python3.9[75473]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:58:14 np0005476733 python3.9[75628]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:14 np0005476733 systemd[1]: session-19.scope: Deactivated successfully.
Oct  8 10:58:14 np0005476733 systemd[1]: session-19.scope: Consumed 4.294s CPU time.
Oct  8 10:58:14 np0005476733 systemd-logind[827]: Session 19 logged out. Waiting for processes to exit.
Oct  8 10:58:14 np0005476733 systemd-logind[827]: Removed session 19.
Oct  8 10:58:20 np0005476733 systemd-logind[827]: New session 20 of user zuul.
Oct  8 10:58:20 np0005476733 systemd[1]: Started Session 20 of User zuul.
Oct  8 10:58:21 np0005476733 python3.9[75806]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:58:23 np0005476733 python3.9[75962]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:58:24 np0005476733 python3.9[76046]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 10:58:26 np0005476733 python3.9[76197]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:58:27 np0005476733 python3.9[76348]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 10:58:28 np0005476733 python3.9[76498]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:58:28 np0005476733 python3.9[76648]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:58:29 np0005476733 systemd[1]: session-20.scope: Deactivated successfully.
Oct  8 10:58:29 np0005476733 systemd[1]: session-20.scope: Consumed 5.725s CPU time.
Oct  8 10:58:29 np0005476733 systemd-logind[827]: Session 20 logged out. Waiting for processes to exit.
Oct  8 10:58:29 np0005476733 systemd-logind[827]: Removed session 20.
Oct  8 10:58:34 np0005476733 systemd-logind[827]: New session 21 of user zuul.
Oct  8 10:58:34 np0005476733 systemd[1]: Started Session 21 of User zuul.
Oct  8 10:58:36 np0005476733 python3.9[76826]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:58:37 np0005476733 python3.9[76982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:38 np0005476733 python3.9[77134]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:39 np0005476733 python3.9[77286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:40 np0005476733 python3.9[77409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935518.6982968-109-68022735149287/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=afc2cc6ef5a89103e19d020c23273ad5003592c5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:40 np0005476733 python3.9[77561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:41 np0005476733 python3.9[77684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935520.2673976-109-34493964417458/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=a439c28a10e9a0bcc667d378d51d4eb295d29719 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:41 np0005476733 python3.9[77836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:42 np0005476733 python3.9[77959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935521.4261973-109-69382413954980/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=82af1cdb489ea6ed15e1cf07a4ebd019e088c753 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:42 np0005476733 python3.9[78111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:43 np0005476733 python3.9[78263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:44 np0005476733 python3.9[78415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:44 np0005476733 python3.9[78538]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935523.7952344-227-218952247575289/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c45acb8dc669524bd3556964374a702fe1448138 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:45 np0005476733 python3.9[78690]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:45 np0005476733 python3.9[78813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935524.8911448-227-200534111996354/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=7252cb127ac7ea22638e78ce5424996d60309a30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:46 np0005476733 python3.9[78965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:46 np0005476733 python3.9[79088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935525.9597437-227-230594758973668/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a188617b48a4b2036c662ca622f4d8bb25ea869c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:47 np0005476733 python3.9[79240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:48 np0005476733 python3.9[79392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:48 np0005476733 python3.9[79544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:49 np0005476733 python3.9[79667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935528.351195-333-105277482593800/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=1047f25ba86ce1124c29f11b593398b3804304e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:49 np0005476733 python3.9[79819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:50 np0005476733 python3.9[79942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935529.436603-333-108593794148847/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=46744c1e565aefac4b54c72457da94907294ee40 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:51 np0005476733 python3.9[80094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:51 np0005476733 python3.9[80217]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935530.5516741-333-241321741770811/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=794492a637cf46fdf87c8d8e4af1c58cae4ec5e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:52 np0005476733 python3.9[80369]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:52 np0005476733 python3.9[80521]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:53 np0005476733 python3.9[80673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:54 np0005476733 python3.9[80796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935533.075546-440-10545994180325/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ffa5ff282e1c3c11658035f63989e10b21b27f52 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:54 np0005476733 python3.9[80948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:55 np0005476733 python3.9[81071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935534.1773798-440-280665063362391/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=46744c1e565aefac4b54c72457da94907294ee40 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:55 np0005476733 python3.9[81223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:56 np0005476733 python3.9[81346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935535.3843226-440-133713464015425/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=ffeed02205a00e07cb5a46d05fc75e792ca1dd55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:57 np0005476733 python3.9[81498]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:58:58 np0005476733 python3.9[81650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:58:59 np0005476733 python3.9[81773]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935537.9960043-563-105225505911141/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:58:59 np0005476733 python3.9[81925]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:00 np0005476733 python3.9[82077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:01 np0005476733 python3.9[82200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935540.0532475-622-152422344418935/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:02 np0005476733 python3.9[82352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:02 np0005476733 python3.9[82504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:03 np0005476733 python3.9[82627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935542.3161633-666-103586588393024/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:04 np0005476733 python3.9[82779]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:04 np0005476733 python3.9[82931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:05 np0005476733 python3.9[83054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935544.3676102-712-223103386759842/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:06 np0005476733 python3.9[83206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:06 np0005476733 python3.9[83358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:07 np0005476733 python3.9[83481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935546.3465405-758-102646089730068/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:08 np0005476733 python3.9[83633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:08 np0005476733 python3.9[83785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:09 np0005476733 python3.9[83908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935548.2369208-806-27492585037291/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:10 np0005476733 python3.9[84060]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:10 np0005476733 python3.9[84212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:11 np0005476733 python3.9[84335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935550.4292734-852-226915991210038/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b7137e5b4a35daae8ba780c026abba3353623136 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:15 np0005476733 systemd-logind[827]: Session 21 logged out. Waiting for processes to exit.
Oct  8 10:59:15 np0005476733 systemd[1]: session-21.scope: Deactivated successfully.
Oct  8 10:59:15 np0005476733 systemd[1]: session-21.scope: Consumed 28.037s CPU time.
Oct  8 10:59:15 np0005476733 systemd-logind[827]: Removed session 21.
Oct  8 10:59:21 np0005476733 systemd-logind[827]: New session 22 of user zuul.
Oct  8 10:59:21 np0005476733 systemd[1]: Started Session 22 of User zuul.
Oct  8 10:59:22 np0005476733 python3.9[84513]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:59:23 np0005476733 python3.9[84669]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:24 np0005476733 python3.9[84821]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 10:59:25 np0005476733 python3.9[84971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:59:26 np0005476733 python3.9[85123]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  8 10:59:29 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  8 10:59:29 np0005476733 python3.9[85279]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 10:59:30 np0005476733 python3.9[85363]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 10:59:33 np0005476733 python3.9[85516]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 10:59:34 np0005476733 python3[85671]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  8 10:59:35 np0005476733 python3.9[85823]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:36 np0005476733 python3.9[85975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:36 np0005476733 python3.9[86053]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:37 np0005476733 python3.9[86205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:37 np0005476733 python3.9[86283]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qmn21auu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:38 np0005476733 python3.9[86435]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:38 np0005476733 python3.9[86513]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:39 np0005476733 python3.9[86665]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:40 np0005476733 systemd[1]: packagekit.service: Deactivated successfully.
Oct  8 10:59:40 np0005476733 python3[86819]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 10:59:41 np0005476733 python3.9[86971]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:42 np0005476733 python3.9[87096]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935580.9024029-295-78582098430356/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:42 np0005476733 python3.9[87248]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:43 np0005476733 python3.9[87373]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935582.2940273-325-229310903936404/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:44 np0005476733 python3.9[87525]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:44 np0005476733 python3.9[87650]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935583.5744393-355-240787956500852/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:45 np0005476733 python3.9[87802]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:46 np0005476733 python3.9[87927]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935585.0936742-385-166454773434297/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:46 np0005476733 python3.9[88079]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 10:59:47 np0005476733 python3.9[88204]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935586.41837-415-78662519619965/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:48 np0005476733 python3.9[88356]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:49 np0005476733 python3.9[88508]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:50 np0005476733 python3.9[88663]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:50 np0005476733 python3.9[88815]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:51 np0005476733 python3.9[88968]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:59:52 np0005476733 python3.9[89122]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:52 np0005476733 python3.9[89277]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 10:59:54 np0005476733 python3.9[89427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 10:59:55 np0005476733 python3.9[89580]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:55 np0005476733 ovs-vsctl[89581]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  8 10:59:56 np0005476733 python3.9[89733]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:57 np0005476733 python3.9[89888]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 10:59:57 np0005476733 ovs-vsctl[89889]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  8 10:59:58 np0005476733 python3.9[90039]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 10:59:59 np0005476733 python3.9[90193]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:00 np0005476733 python3.9[90345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:00 np0005476733 python3.9[90423]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:01 np0005476733 python3.9[90575]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:01 np0005476733 python3.9[90653]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:02 np0005476733 python3.9[90805]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:03 np0005476733 python3.9[90957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:03 np0005476733 python3.9[91035]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:04 np0005476733 python3.9[91187]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:04 np0005476733 python3.9[91265]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:05 np0005476733 python3.9[91417]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:00:05 np0005476733 systemd[1]: Reloading.
Oct  8 11:00:05 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:00:05 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:00:06 np0005476733 python3.9[91606]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:07 np0005476733 python3.9[91684]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:07 np0005476733 python3.9[91836]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:08 np0005476733 python3.9[91914]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:09 np0005476733 python3.9[92066]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:00:09 np0005476733 systemd[1]: Reloading.
Oct  8 11:00:09 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:00:09 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:00:09 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 11:00:09 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 11:00:09 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 11:00:09 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 11:00:10 np0005476733 python3.9[92259]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:10 np0005476733 python3.9[92411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:11 np0005476733 python3.9[92534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935610.4623916-917-123034416756891/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:12 np0005476733 python3.9[92686]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:13 np0005476733 python3.9[92838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:13 np0005476733 python3.9[92961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935612.5951886-967-119071633541867/.source.json _original_basename=.76kpivo9 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:14 np0005476733 python3.9[93113]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:16 np0005476733 python3.9[93540]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  8 11:00:17 np0005476733 python3.9[93692]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:00:18 np0005476733 python3.9[93844]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 11:00:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 11:00:19 np0005476733 python3[94006]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:00:19 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 11:00:19 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 11:00:20 np0005476733 podman[94044]: 2025-10-08 15:00:20.0991216 +0000 UTC m=+0.064982664 container create 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:00:20 np0005476733 podman[94044]: 2025-10-08 15:00:20.057731501 +0000 UTC m=+0.023592615 image pull 94361c82f6cc9b9bc202c316322eaa39cd13ed74f9cc27d918c1366be405a281 38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:00:20 np0005476733 python3[94006]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:00:20 np0005476733 python3.9[94234]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:00:20 np0005476733 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 11:00:21 np0005476733 python3.9[94388]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:22 np0005476733 python3.9[94464]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:00:22 np0005476733 python3.9[94615]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759935622.2146103-1143-156692566297752/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:00:23 np0005476733 python3.9[94691]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:00:23 np0005476733 systemd[1]: Reloading.
Oct  8 11:00:23 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:00:23 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:00:24 np0005476733 python3.9[94803]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:00:24 np0005476733 systemd[1]: Reloading.
Oct  8 11:00:24 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:00:24 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:00:24 np0005476733 systemd[1]: Starting ovn_controller container...
Oct  8 11:00:24 np0005476733 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  8 11:00:24 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:00:24 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d013de5f33e0ff5678f4c633117b334df7b293f43f478018dc7fa7467349bfa0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 11:00:24 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.
Oct  8 11:00:24 np0005476733 podman[94843]: 2025-10-08 15:00:24.86777739 +0000 UTC m=+0.172338467 container init 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:00:24 np0005476733 ovn_controller[94857]: + sudo -E kolla_set_configs
Oct  8 11:00:24 np0005476733 podman[94843]: 2025-10-08 15:00:24.901667545 +0000 UTC m=+0.206228502 container start 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:00:24 np0005476733 edpm-start-podman-container[94843]: ovn_controller
Oct  8 11:00:24 np0005476733 systemd[1]: Created slice User Slice of UID 0.
Oct  8 11:00:24 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  8 11:00:24 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  8 11:00:24 np0005476733 edpm-start-podman-container[94842]: Creating additional drop-in dependency for "ovn_controller" (20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b)
Oct  8 11:00:24 np0005476733 systemd[1]: Starting User Manager for UID 0...
Oct  8 11:00:24 np0005476733 podman[94863]: 2025-10-08 15:00:24.990963366 +0000 UTC m=+0.074947975 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 11:00:25 np0005476733 systemd[1]: Reloading.
Oct  8 11:00:25 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:00:25 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:00:25 np0005476733 systemd[94900]: Queued start job for default target Main User Target.
Oct  8 11:00:25 np0005476733 systemd[94900]: Created slice User Application Slice.
Oct  8 11:00:25 np0005476733 systemd[94900]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  8 11:00:25 np0005476733 systemd[94900]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 11:00:25 np0005476733 systemd[94900]: Reached target Paths.
Oct  8 11:00:25 np0005476733 systemd[94900]: Reached target Timers.
Oct  8 11:00:25 np0005476733 systemd[94900]: Starting D-Bus User Message Bus Socket...
Oct  8 11:00:25 np0005476733 systemd[94900]: Starting Create User's Volatile Files and Directories...
Oct  8 11:00:25 np0005476733 systemd[94900]: Finished Create User's Volatile Files and Directories.
Oct  8 11:00:25 np0005476733 systemd[94900]: Listening on D-Bus User Message Bus Socket.
Oct  8 11:00:25 np0005476733 systemd[94900]: Reached target Sockets.
Oct  8 11:00:25 np0005476733 systemd[94900]: Reached target Basic System.
Oct  8 11:00:25 np0005476733 systemd[94900]: Reached target Main User Target.
Oct  8 11:00:25 np0005476733 systemd[94900]: Startup finished in 142ms.
Oct  8 11:00:25 np0005476733 systemd[1]: Started User Manager for UID 0.
Oct  8 11:00:25 np0005476733 systemd[1]: Started ovn_controller container.
Oct  8 11:00:25 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-12726c3703adeae8.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:00:25 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-12726c3703adeae8.service: Failed with result 'exit-code'.
Oct  8 11:00:25 np0005476733 systemd[1]: Started Session c1 of User root.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: INFO:__main__:Validating config file
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: INFO:__main__:Writing out command to execute
Oct  8 11:00:25 np0005476733 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: ++ cat /run_command
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + ARGS=
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + sudo kolla_copy_cacerts
Oct  8 11:00:25 np0005476733 systemd[1]: Started Session c2 of User root.
Oct  8 11:00:25 np0005476733 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + [[ ! -n '' ]]
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + . kolla_extend_start
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + umask 0022
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.4659] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.4669] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.4680] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.4686] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.4690] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 11:00:25 np0005476733 kernel: br-int: entered promiscuous mode
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  8 11:00:25 np0005476733 systemd-udevd[94992]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 11:00:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:25Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.6844] manager: (ovn-964ae1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  8 11:00:25 np0005476733 kernel: genev_sys_6081: entered promiscuous mode
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.7077] device (genev_sys_6081): carrier: link connected
Oct  8 11:00:25 np0005476733 NetworkManager[51699]: <info>  [1759935625.7081] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct  8 11:00:25 np0005476733 systemd-udevd[94994]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:00:26 np0005476733 NetworkManager[51699]: <info>  [1759935626.6063] manager: (ovn-329dd4-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct  8 11:00:27 np0005476733 python3.9[95125]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:00:27 np0005476733 ovs-vsctl[95126]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  8 11:00:27 np0005476733 python3.9[95278]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:00:27 np0005476733 ovs-vsctl[95280]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  8 11:00:28 np0005476733 python3.9[95433]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:00:28 np0005476733 ovs-vsctl[95434]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  8 11:00:29 np0005476733 systemd[1]: session-22.scope: Deactivated successfully.
Oct  8 11:00:29 np0005476733 systemd[1]: session-22.scope: Consumed 44.949s CPU time.
Oct  8 11:00:29 np0005476733 systemd-logind[827]: Session 22 logged out. Waiting for processes to exit.
Oct  8 11:00:29 np0005476733 systemd-logind[827]: Removed session 22.
Oct  8 11:00:35 np0005476733 systemd[1]: Stopping User Manager for UID 0...
Oct  8 11:00:35 np0005476733 systemd[94900]: Activating special unit Exit the Session...
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped target Main User Target.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped target Basic System.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped target Paths.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped target Sockets.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped target Timers.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 11:00:35 np0005476733 systemd[94900]: Closed D-Bus User Message Bus Socket.
Oct  8 11:00:35 np0005476733 systemd[94900]: Stopped Create User's Volatile Files and Directories.
Oct  8 11:00:35 np0005476733 systemd[94900]: Removed slice User Application Slice.
Oct  8 11:00:35 np0005476733 systemd[94900]: Reached target Shutdown.
Oct  8 11:00:35 np0005476733 systemd[94900]: Finished Exit the Session.
Oct  8 11:00:35 np0005476733 systemd[94900]: Reached target Exit the Session.
Oct  8 11:00:35 np0005476733 systemd[1]: user@0.service: Deactivated successfully.
Oct  8 11:00:35 np0005476733 systemd[1]: Stopped User Manager for UID 0.
Oct  8 11:00:35 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  8 11:00:35 np0005476733 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  8 11:00:35 np0005476733 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  8 11:00:35 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  8 11:00:35 np0005476733 systemd[1]: Removed slice User Slice of UID 0.
Oct  8 11:00:35 np0005476733 systemd-logind[827]: New session 24 of user zuul.
Oct  8 11:00:35 np0005476733 systemd[1]: Started Session 24 of User zuul.
Oct  8 11:00:36 np0005476733 python3.9[95617]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:00:38 np0005476733 python3.9[95773]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:38 np0005476733 python3.9[95925]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:39 np0005476733 python3.9[96077]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:39 np0005476733 python3.9[96229]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:40 np0005476733 python3.9[96381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:41 np0005476733 python3.9[96533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:00:42 np0005476733 python3.9[96685]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  8 11:00:43 np0005476733 python3.9[96836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:44 np0005476733 python3.9[96957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935643.33802-153-40126377798489/.source follow=False _original_basename=haproxy.j2 checksum=743f91144c5fd47dbac6807fbfd5404b55a29257 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:45 np0005476733 python3.9[97107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:46 np0005476733 python3.9[97228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935645.0056605-183-198386089103641/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:47 np0005476733 python3.9[97380]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 11:00:48 np0005476733 python3.9[97464]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 11:00:50 np0005476733 python3.9[97617]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:00:51 np0005476733 python3.9[97770]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:51 np0005476733 python3.9[97891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935650.7287047-257-124904200546416/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:52 np0005476733 python3.9[98041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:52 np0005476733 python3.9[98162]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935651.796881-257-139161407949202/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:54 np0005476733 python3.9[98312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:54 np0005476733 python3.9[98433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935653.635002-345-64306234608619/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:55 np0005476733 python3.9[98583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:55 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:55Z|00025|memory|INFO|16256 kB peak resident set size after 30.3 seconds
Oct  8 11:00:55 np0005476733 ovn_controller[94857]: 2025-10-08T15:00:55Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Oct  8 11:00:55 np0005476733 podman[98678]: 2025-10-08 15:00:55.71977242 +0000 UTC m=+0.087473896 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:00:55 np0005476733 python3.9[98715]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935654.864525-345-261190358821048/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:56 np0005476733 python3.9[98882]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:00:57 np0005476733 python3.9[99036]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:57 np0005476733 python3.9[99188]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:58 np0005476733 python3.9[99266]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:00:58 np0005476733 python3.9[99418]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:00:59 np0005476733 python3.9[99496]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:01:00 np0005476733 python3.9[99648]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:01 np0005476733 python3.9[99800]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:01 np0005476733 python3.9[99878]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:02 np0005476733 python3.9[100045]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:02 np0005476733 python3.9[100123]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:03 np0005476733 python3.9[100275]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:03 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:03 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:03 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:04 np0005476733 python3.9[100464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:05 np0005476733 python3.9[100542]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:06 np0005476733 python3.9[100694]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:06 np0005476733 python3.9[100772]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:07 np0005476733 python3.9[100924]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:07 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:07 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:07 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:07 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 11:01:07 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 11:01:07 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 11:01:07 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 11:01:08 np0005476733 python3.9[101118]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:01:09 np0005476733 python3.9[101270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:10 np0005476733 python3.9[101393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935668.9444282-647-23034696429511/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:01:10 np0005476733 python3.9[101545]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:01:11 np0005476733 python3.9[101697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:01:12 np0005476733 python3.9[101820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935671.2557144-697-29909101980936/.source.json _original_basename=.e20wk9cs follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:13 np0005476733 python3.9[101972]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:15 np0005476733 python3.9[102399]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  8 11:01:16 np0005476733 python3.9[102551]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:01:17 np0005476733 python3.9[102703]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 11:01:18 np0005476733 python3[102880]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:01:18 np0005476733 podman[102917]: 2025-10-08 15:01:18.942372329 +0000 UTC m=+0.066767858 container create aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct  8 11:01:18 np0005476733 podman[102917]: 2025-10-08 15:01:18.905976417 +0000 UTC m=+0.030371976 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:01:18 np0005476733 python3[102880]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:01:19 np0005476733 python3.9[103107]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:01:20 np0005476733 python3.9[103261]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:21 np0005476733 python3.9[103337]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:01:21 np0005476733 python3.9[103488]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759935681.237599-873-258205695045596/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:22 np0005476733 python3.9[103565]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:01:22 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:22 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:22 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:23 np0005476733 python3.9[103676]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:23 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:23 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:23 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:23 np0005476733 systemd[1]: Starting ovn_metadata_agent container...
Oct  8 11:01:23 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:01:24 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/637b3d28e5472cb4851f3d469a6f62e401481ba6d47e27fb0b8637000f5a36c8/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  8 11:01:24 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/637b3d28e5472cb4851f3d469a6f62e401481ba6d47e27fb0b8637000f5a36c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:01:24 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4.
Oct  8 11:01:24 np0005476733 podman[103718]: 2025-10-08 15:01:24.057475168 +0000 UTC m=+0.236851070 container init aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + sudo -E kolla_set_configs
Oct  8 11:01:24 np0005476733 podman[103718]: 2025-10-08 15:01:24.093932152 +0000 UTC m=+0.273308034 container start aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:01:24 np0005476733 edpm-start-podman-container[103718]: ovn_metadata_agent
Oct  8 11:01:24 np0005476733 edpm-start-podman-container[103717]: Creating additional drop-in dependency for "ovn_metadata_agent" (aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4)
Oct  8 11:01:24 np0005476733 podman[103741]: 2025-10-08 15:01:24.160970717 +0000 UTC m=+0.055023403 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Validating config file
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Copying service configuration files
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Writing out command to execute
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: ++ cat /run_command
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + CMD=neutron-ovn-metadata-agent
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + ARGS=
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + sudo kolla_copy_cacerts
Oct  8 11:01:24 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + [[ ! -n '' ]]
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + . kolla_extend_start
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: Running command: 'neutron-ovn-metadata-agent'
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + umask 0022
Oct  8 11:01:24 np0005476733 ovn_metadata_agent[103734]: + exec neutron-ovn-metadata-agent
Oct  8 11:01:24 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:24 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:24 np0005476733 systemd[1]: Started ovn_metadata_agent container.
Oct  8 11:01:24 np0005476733 systemd[1]: session-24.scope: Deactivated successfully.
Oct  8 11:01:24 np0005476733 systemd[1]: session-24.scope: Consumed 37.070s CPU time.
Oct  8 11:01:24 np0005476733 systemd-logind[827]: Session 24 logged out. Waiting for processes to exit.
Oct  8 11:01:24 np0005476733 systemd-logind[827]: Removed session 24.
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.229 103739 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.229 103739 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.229 103739 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.230 103739 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.231 103739 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.232 103739 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.233 103739 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.234 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.235 103739 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.236 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.237 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.238 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.239 103739 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.240 103739 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.241 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.242 103739 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.243 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.244 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.245 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.246 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.247 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.248 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.249 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.250 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.251 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.252 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.253 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.254 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.255 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 podman[103847]: 2025-10-08 15:01:26.256132616 +0000 UTC m=+0.087555514 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.256 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.257 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.258 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.259 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.260 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.261 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.262 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.263 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.264 103739 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.273 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.274 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.274 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.274 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.274 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.286 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ec52a299-e5bc-4227-a88e-e241833eebb2 (UUID: ec52a299-e5bc-4227-a88e-e241833eebb2) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.311 103739 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.311 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.311 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.311 103739 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.315 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.319 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.325 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ec52a299-e5bc-4227-a88e-e241833eebb2'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], external_ids={}, name=ec52a299-e5bc-4227-a88e-e241833eebb2, nb_cfg_timestamp=1759935633490, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.326 103739 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f029f401b20>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.327 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.327 103739 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.328 103739 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.328 103739 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.334 103739 DEBUG oslo_service.service [-] Started child 103873 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.338 103739 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpflre3fte/privsep.sock']#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.338 103873 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-162980'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.369 103873 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.369 103873 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.370 103873 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.374 103873 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.381 103873 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.392 103873 INFO eventlet.wsgi.server [-] (103873) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  8 11:01:26 np0005476733 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.992 103739 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.995 103739 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpflre3fte/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.854 103878 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.858 103878 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.860 103878 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  8 11:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.860 103878 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103878#033[00m
Oct  8 11:01:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:26.999 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0e6334-edad-4f29-961c-af8e90636143]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:01:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:27.562 103878 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:01:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:27.562 103878 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:01:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:27.562 103878 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.126 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[0e63e425-1104-4471-9ee1-54948d3b4b58]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.129 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, column=external_ids, values=({'neutron:ovn-metadata-id': '9018c0c8-c6aa-5f9d-9ad7-f625c993fb4c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.140 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.147 103739 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.148 103739 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.148 103739 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.148 103739 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.148 103739 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.149 103739 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.149 103739 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.149 103739 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.150 103739 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.150 103739 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.151 103739 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.152 103739 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.153 103739 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.154 103739 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.155 103739 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.156 103739 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.157 103739 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.158 103739 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.159 103739 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.160 103739 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.161 103739 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.162 103739 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.163 103739 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.164 103739 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.165 103739 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.166 103739 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.167 103739 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.168 103739 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.169 103739 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.169 103739 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.169 103739 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.169 103739 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.169 103739 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.170 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.171 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.172 103739 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.173 103739 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.174 103739 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.175 103739 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.176 103739 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.177 103739 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.178 103739 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.179 103739 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.180 103739 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.181 103739 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.182 103739 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.183 103739 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.184 103739 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.185 103739 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.186 103739 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.187 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.188 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.189 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:01:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:01:28.190 103739 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 11:01:31 np0005476733 systemd-logind[827]: New session 25 of user zuul.
Oct  8 11:01:31 np0005476733 systemd[1]: Started Session 25 of User zuul.
Oct  8 11:01:32 np0005476733 python3.9[104036]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:01:33 np0005476733 python3.9[104192]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:01:34 np0005476733 python3.9[104354]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:01:34 np0005476733 systemd[1]: Reloading.
Oct  8 11:01:34 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:01:34 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:01:36 np0005476733 python3.9[104539]: ansible-ansible.builtin.service_facts Invoked
Oct  8 11:01:36 np0005476733 network[104556]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 11:01:36 np0005476733 network[104557]: 'network-scripts' will be removed from distribution in near future.
Oct  8 11:01:36 np0005476733 network[104558]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 11:01:41 np0005476733 python3.9[104822]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:41 np0005476733 python3.9[104975]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:42 np0005476733 python3.9[105128]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:43 np0005476733 python3.9[105281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:45 np0005476733 python3.9[105434]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:46 np0005476733 python3.9[105587]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:47 np0005476733 python3.9[105740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:01:48 np0005476733 python3.9[105893]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:48 np0005476733 python3.9[106045]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:49 np0005476733 python3.9[106197]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:50 np0005476733 python3.9[106349]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:50 np0005476733 python3.9[106501]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:51 np0005476733 python3.9[106653]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:52 np0005476733 python3.9[106805]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:53 np0005476733 python3.9[106957]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:53 np0005476733 python3.9[107109]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:54 np0005476733 python3.9[107261]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:54 np0005476733 podman[107385]: 2025-10-08 15:01:54.721880284 +0000 UTC m=+0.062925349 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:01:54 np0005476733 python3.9[107432]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:55 np0005476733 python3.9[107584]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:56 np0005476733 python3.9[107736]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:56 np0005476733 podman[107860]: 2025-10-08 15:01:56.710124797 +0000 UTC m=+0.101102276 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:01:56 np0005476733 python3.9[107908]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:01:58 np0005476733 python3.9[108066]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:01:58 np0005476733 python3.9[108218]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 11:02:00 np0005476733 python3.9[108370]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:02:00 np0005476733 systemd[1]: Reloading.
Oct  8 11:02:00 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:02:00 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:02:01 np0005476733 python3.9[108558]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:01 np0005476733 python3.9[108711]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:02 np0005476733 python3.9[108864]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:03 np0005476733 python3.9[109017]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:04 np0005476733 python3.9[109170]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:04 np0005476733 python3.9[109323]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:05 np0005476733 python3.9[109476]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:02:06 np0005476733 python3.9[109629]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  8 11:02:07 np0005476733 python3.9[109782]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 11:02:09 np0005476733 python3.9[109940]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 11:02:10 np0005476733 python3.9[110100]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 11:02:11 np0005476733 python3.9[110184]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 11:02:25 np0005476733 podman[110370]: 2025-10-08 15:02:25.2670977 +0000 UTC m=+0.079593008 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:02:26.278 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:02:26.279 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:02:26.280 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:02:27 np0005476733 podman[110394]: 2025-10-08 15:02:27.287937845 +0000 UTC m=+0.117332393 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  8 11:02:36 np0005476733 kernel: SELinux:  Converting 2753 SID table entries...
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 11:02:36 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  Converting 2753 SID table entries...
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 11:02:46 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 11:02:56 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  8 11:02:56 np0005476733 podman[110437]: 2025-10-08 15:02:56.247193467 +0000 UTC m=+0.063663407 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:02:58 np0005476733 podman[110456]: 2025-10-08 15:02:58.282251705 +0000 UTC m=+0.113179365 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:03:26.282 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:03:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:03:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:03:27 np0005476733 podman[126945]: 2025-10-08 15:03:27.227776987 +0000 UTC m=+0.057195953 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 11:03:29 np0005476733 podman[127241]: 2025-10-08 15:03:29.288140443 +0000 UTC m=+0.118951411 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:03:40 np0005476733 kernel: SELinux:  Converting 2754 SID table entries...
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability open_perms=1
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability always_check_network=0
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 11:03:40 np0005476733 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 11:03:42 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 11:03:42 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  8 11:03:42 np0005476733 dbus-broker-launch[816]: Noticed file-system modification, trigger reload.
Oct  8 11:03:50 np0005476733 systemd[1]: Stopping OpenSSH server daemon...
Oct  8 11:03:50 np0005476733 systemd[1]: sshd.service: Deactivated successfully.
Oct  8 11:03:50 np0005476733 systemd[1]: Stopped OpenSSH server daemon.
Oct  8 11:03:50 np0005476733 systemd[1]: sshd.service: Consumed 1.407s CPU time, read 0B from disk, written 12.0K to disk.
Oct  8 11:03:50 np0005476733 systemd[1]: Stopped target sshd-keygen.target.
Oct  8 11:03:50 np0005476733 systemd[1]: Stopping sshd-keygen.target...
Oct  8 11:03:50 np0005476733 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 11:03:50 np0005476733 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 11:03:50 np0005476733 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 11:03:50 np0005476733 systemd[1]: Reached target sshd-keygen.target.
Oct  8 11:03:50 np0005476733 systemd[1]: Starting OpenSSH server daemon...
Oct  8 11:03:50 np0005476733 systemd[1]: Started OpenSSH server daemon.
Oct  8 11:03:52 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 11:03:52 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 11:03:52 np0005476733 systemd[1]: Reloading.
Oct  8 11:03:52 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:03:52 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:03:52 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 11:03:54 np0005476733 systemd[1]: Starting PackageKit Daemon...
Oct  8 11:03:54 np0005476733 systemd[1]: Started PackageKit Daemon.
Oct  8 11:03:57 np0005476733 podman[132866]: 2025-10-08 15:03:57.494344276 +0000 UTC m=+0.061896295 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  8 11:03:57 np0005476733 python3.9[133045]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:03:57 np0005476733 systemd[1]: Reloading.
Oct  8 11:03:58 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:03:58 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:03:59 np0005476733 python3.9[134402]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:03:59 np0005476733 systemd[1]: Reloading.
Oct  8 11:03:59 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:03:59 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:03:59 np0005476733 podman[134962]: 2025-10-08 15:03:59.715289197 +0000 UTC m=+0.174987214 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:04:00 np0005476733 python3.9[135519]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:04:00 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:00 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:00 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:01 np0005476733 python3.9[136745]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:04:01 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:01 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:01 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:02 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 11:04:02 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 11:04:02 np0005476733 systemd[1]: man-db-cache-update.service: Consumed 12.192s CPU time.
Oct  8 11:04:02 np0005476733 systemd[1]: run-r170e1f25d0e84ae0903f278c50020226.service: Deactivated successfully.
Oct  8 11:04:02 np0005476733 python3.9[137497]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:03 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:03 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:03 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:04 np0005476733 python3.9[137688]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:04 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:04 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:05 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:05 np0005476733 python3.9[137878]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:06 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:06 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:06 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:07 np0005476733 python3.9[138068]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:07 np0005476733 python3.9[138223]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:07 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:08 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:08 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:09 np0005476733 python3.9[138413]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 11:04:09 np0005476733 systemd[1]: Reloading.
Oct  8 11:04:09 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:04:09 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:04:09 np0005476733 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  8 11:04:09 np0005476733 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  8 11:04:10 np0005476733 python3.9[138606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:11 np0005476733 python3.9[138761]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:11 np0005476733 python3.9[138916]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:13 np0005476733 python3.9[139071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:14 np0005476733 python3.9[139226]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:15 np0005476733 python3.9[139381]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:16 np0005476733 python3.9[139536]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:17 np0005476733 python3.9[139691]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:17 np0005476733 python3.9[139846]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:18 np0005476733 python3.9[140001]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:19 np0005476733 python3.9[140156]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:20 np0005476733 python3.9[140311]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:21 np0005476733 python3.9[140466]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:21 np0005476733 python3.9[140621]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 11:04:22 np0005476733 python3.9[140776]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:23 np0005476733 python3.9[140928]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:23 np0005476733 python3.9[141080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:24 np0005476733 python3.9[141232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:25 np0005476733 python3.9[141384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:25 np0005476733 python3.9[141536]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:04:26.282 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:04:26.283 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:04:26.284 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:04:26 np0005476733 python3.9[141688]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:27 np0005476733 podman[141813]: 2025-10-08 15:04:27.593133301 +0000 UTC m=+0.058791424 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:04:27 np0005476733 python3.9[141814]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935866.3088784-1089-101664375252381/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:28 np0005476733 python3.9[141985]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:28 np0005476733 python3.9[142110]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935867.8930657-1089-241131994244284/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:29 np0005476733 python3.9[142262]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:29 np0005476733 podman[142359]: 2025-10-08 15:04:29.962348297 +0000 UTC m=+0.120694915 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:04:30 np0005476733 python3.9[142407]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935869.0314412-1089-208926983148976/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:30 np0005476733 python3.9[142565]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:31 np0005476733 python3.9[142690]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935870.2391207-1089-248720975903766/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:31 np0005476733 python3.9[142842]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:32 np0005476733 python3.9[142967]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935871.4877768-1089-27720084344970/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:33 np0005476733 python3.9[143119]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:33 np0005476733 python3.9[143244]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935872.6929483-1089-218318173611877/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:34 np0005476733 python3.9[143396]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:34 np0005476733 python3.9[143519]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935873.7862244-1089-112946140925535/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:35 np0005476733 python3.9[143671]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:35 np0005476733 python3.9[143796]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759935874.8930914-1089-243745613889338/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:36 np0005476733 python3.9[143948]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  8 11:04:37 np0005476733 python3.9[144101]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:38 np0005476733 python3.9[144253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:39 np0005476733 python3.9[144405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:40 np0005476733 python3.9[144557]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:40 np0005476733 python3.9[144709]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:41 np0005476733 python3.9[144861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:42 np0005476733 python3.9[145013]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:42 np0005476733 python3.9[145165]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:43 np0005476733 python3.9[145317]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:44 np0005476733 python3.9[145469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:45 np0005476733 python3.9[145621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:45 np0005476733 python3.9[145773]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:46 np0005476733 python3.9[145925]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:47 np0005476733 python3.9[146077]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:48 np0005476733 python3.9[146229]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:48 np0005476733 python3.9[146352]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935887.4970636-1531-22846558815224/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:49 np0005476733 python3.9[146504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:50 np0005476733 python3.9[146627]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935889.0753815-1531-113887028712077/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:50 np0005476733 python3.9[146779]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:51 np0005476733 python3.9[146902]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935890.3162744-1531-122965108413148/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:52 np0005476733 python3.9[147054]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:52 np0005476733 python3.9[147177]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935891.6925447-1531-181455579688679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:53 np0005476733 python3.9[147329]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:54 np0005476733 python3.9[147452]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935892.986874-1531-226850912973074/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:54 np0005476733 python3.9[147604]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:55 np0005476733 python3.9[147727]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935894.2239716-1531-238249216287588/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:55 np0005476733 python3.9[147879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:56 np0005476733 python3.9[148002]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935895.4908938-1531-125850819571023/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:57 np0005476733 python3.9[148154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:57 np0005476733 podman[148277]: 2025-10-08 15:04:57.709204376 +0000 UTC m=+0.051259545 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:04:57 np0005476733 python3.9[148278]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935896.7881377-1531-83191577647553/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:58 np0005476733 python3.9[148448]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:04:59 np0005476733 python3.9[148571]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935898.056471-1531-197521310565056/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:04:59 np0005476733 python3.9[148723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:00 np0005476733 podman[148818]: 2025-10-08 15:05:00.219246584 +0000 UTC m=+0.096072374 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:05:00 np0005476733 python3.9[148867]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935899.2914314-1531-113185946420832/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:00 np0005476733 python3.9[149025]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:01 np0005476733 python3.9[149148]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935900.5334864-1531-47148506549461/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:02 np0005476733 python3.9[149300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:02 np0005476733 python3.9[149423]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935901.7194312-1531-178719524542172/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:03 np0005476733 python3.9[149575]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:04 np0005476733 python3.9[149698]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935902.9831696-1531-109074446697490/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:04 np0005476733 python3.9[149850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:05 np0005476733 python3.9[149973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935904.269199-1531-146741593852600/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:06 np0005476733 python3.9[150123]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:05:07 np0005476733 python3.9[150278]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  8 11:05:08 np0005476733 dbus-broker-launch[817]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  8 11:05:09 np0005476733 python3.9[150434]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:09 np0005476733 python3.9[150586]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:10 np0005476733 python3.9[150738]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:11 np0005476733 python3.9[150890]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:11 np0005476733 python3.9[151042]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:12 np0005476733 python3.9[151194]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:13 np0005476733 python3.9[151346]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:14 np0005476733 python3.9[151498]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:14 np0005476733 python3.9[151650]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:15 np0005476733 python3.9[151802]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:16 np0005476733 python3.9[151954]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:05:16 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:16 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:16 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:16 np0005476733 systemd[1]: Starting libvirt logging daemon socket...
Oct  8 11:05:16 np0005476733 systemd[1]: Listening on libvirt logging daemon socket.
Oct  8 11:05:16 np0005476733 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  8 11:05:16 np0005476733 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  8 11:05:16 np0005476733 systemd[1]: Starting libvirt logging daemon...
Oct  8 11:05:16 np0005476733 systemd[1]: Started libvirt logging daemon.
Oct  8 11:05:17 np0005476733 python3.9[152149]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:05:17 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:17 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:17 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:17 np0005476733 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  8 11:05:17 np0005476733 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  8 11:05:17 np0005476733 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  8 11:05:17 np0005476733 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  8 11:05:17 np0005476733 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  8 11:05:17 np0005476733 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  8 11:05:17 np0005476733 systemd[1]: Starting libvirt nodedev daemon...
Oct  8 11:05:17 np0005476733 systemd[1]: Started libvirt nodedev daemon.
Oct  8 11:05:18 np0005476733 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  8 11:05:18 np0005476733 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  8 11:05:18 np0005476733 python3.9[152365]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:05:18 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:18 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:18 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:19 np0005476733 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  8 11:05:19 np0005476733 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  8 11:05:19 np0005476733 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  8 11:05:19 np0005476733 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  8 11:05:19 np0005476733 systemd[1]: Starting libvirt proxy daemon...
Oct  8 11:05:19 np0005476733 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  8 11:05:19 np0005476733 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  8 11:05:19 np0005476733 systemd[1]: Started libvirt proxy daemon.
Oct  8 11:05:20 np0005476733 python3.9[152584]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:05:20 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:20 np0005476733 setroubleshoot[152215]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 681c654e-8377-4095-8b0b-254add968526
Oct  8 11:05:20 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:20 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:20 np0005476733 setroubleshoot[152215]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  8 11:05:20 np0005476733 setroubleshoot[152215]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 681c654e-8377-4095-8b0b-254add968526
Oct  8 11:05:20 np0005476733 setroubleshoot[152215]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  8 11:05:20 np0005476733 systemd[1]: Listening on libvirt locking daemon socket.
Oct  8 11:05:20 np0005476733 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  8 11:05:20 np0005476733 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  8 11:05:20 np0005476733 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  8 11:05:20 np0005476733 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  8 11:05:20 np0005476733 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  8 11:05:20 np0005476733 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  8 11:05:20 np0005476733 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  8 11:05:20 np0005476733 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  8 11:05:20 np0005476733 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  8 11:05:20 np0005476733 systemd[1]: Starting libvirt QEMU daemon...
Oct  8 11:05:20 np0005476733 systemd[1]: Started libvirt QEMU daemon.
Oct  8 11:05:21 np0005476733 python3.9[152797]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:05:21 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:21 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:21 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:21 np0005476733 systemd[1]: Starting libvirt secret daemon socket...
Oct  8 11:05:21 np0005476733 systemd[1]: Listening on libvirt secret daemon socket.
Oct  8 11:05:21 np0005476733 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  8 11:05:21 np0005476733 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  8 11:05:21 np0005476733 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  8 11:05:21 np0005476733 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  8 11:05:21 np0005476733 systemd[1]: Starting libvirt secret daemon...
Oct  8 11:05:21 np0005476733 systemd[1]: Started libvirt secret daemon.
Oct  8 11:05:24 np0005476733 python3.9[153007]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:25 np0005476733 python3.9[153159]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 11:05:26 np0005476733 python3.9[153311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:05:26.283 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:05:26.285 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:05:26.285 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:05:26 np0005476733 python3.9[153434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935925.7477844-2221-276089073934564/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:27 np0005476733 python3.9[153586]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:28 np0005476733 podman[153663]: 2025-10-08 15:05:28.254973045 +0000 UTC m=+0.073528201 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:05:28 np0005476733 python3.9[153757]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:29 np0005476733 python3.9[153835]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:29 np0005476733 python3.9[153987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:30 np0005476733 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  8 11:05:30 np0005476733 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.035s CPU time.
Oct  8 11:05:30 np0005476733 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  8 11:05:30 np0005476733 podman[154065]: 2025-10-08 15:05:30.405058289 +0000 UTC m=+0.094804195 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:05:30 np0005476733 python3.9[154066]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5pm18b3z recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:31 np0005476733 python3.9[154240]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:31 np0005476733 python3.9[154318]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:32 np0005476733 python3.9[154470]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:05:33 np0005476733 python3[154623]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 11:05:34 np0005476733 python3.9[154775]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:34 np0005476733 python3.9[154853]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:35 np0005476733 python3.9[155005]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:35 np0005476733 python3.9[155083]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:36 np0005476733 python3.9[155235]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:37 np0005476733 python3.9[155313]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:38 np0005476733 python3.9[155465]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:38 np0005476733 python3.9[155543]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:39 np0005476733 python3.9[155695]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:40 np0005476733 python3.9[155820]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759935938.9214776-2471-163743928128727/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:40 np0005476733 python3.9[155972]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:41 np0005476733 python3.9[156124]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:05:42 np0005476733 python3.9[156279]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:43 np0005476733 python3.9[156431]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:05:44 np0005476733 python3.9[156584]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:05:44 np0005476733 python3.9[156738]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:05:45 np0005476733 python3.9[156893]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:46 np0005476733 python3.9[157045]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:46 np0005476733 python3.9[157168]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935945.7588217-2615-119037328232292/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:47 np0005476733 python3.9[157320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:48 np0005476733 python3.9[157443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935947.0969307-2645-172098342399098/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:48 np0005476733 python3.9[157595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:05:49 np0005476733 python3.9[157718]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935948.4308424-2675-155422425466278/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:05:50 np0005476733 python3.9[157870]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:05:50 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:50 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:50 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:50 np0005476733 systemd[1]: Reached target edpm_libvirt.target.
Oct  8 11:05:51 np0005476733 python3.9[158061]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  8 11:05:51 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:51 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:51 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:51 np0005476733 systemd[1]: Reloading.
Oct  8 11:05:52 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:05:52 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:05:52 np0005476733 systemd[1]: session-25.scope: Deactivated successfully.
Oct  8 11:05:52 np0005476733 systemd[1]: session-25.scope: Consumed 3min 34.152s CPU time.
Oct  8 11:05:52 np0005476733 systemd-logind[827]: Session 25 logged out. Waiting for processes to exit.
Oct  8 11:05:52 np0005476733 systemd-logind[827]: Removed session 25.
Oct  8 11:05:58 np0005476733 systemd-logind[827]: New session 26 of user zuul.
Oct  8 11:05:58 np0005476733 systemd[1]: Started Session 26 of User zuul.
Oct  8 11:05:58 np0005476733 podman[158159]: 2025-10-08 15:05:58.835344104 +0000 UTC m=+0.104995506 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:05:59 np0005476733 python3.9[158329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:06:00 np0005476733 podman[158457]: 2025-10-08 15:06:00.945468435 +0000 UTC m=+0.100931526 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:06:01 np0005476733 python3.9[158506]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:01 np0005476733 python3.9[158663]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:02 np0005476733 python3.9[158815]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:03 np0005476733 python3.9[158967]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 11:06:03 np0005476733 python3.9[159119]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:04 np0005476733 python3.9[159271]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:05 np0005476733 python3.9[159425]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:06:05 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:06 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:06 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:07 np0005476733 python3.9[159615]: ansible-ansible.builtin.service_facts Invoked
Oct  8 11:06:07 np0005476733 network[159632]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 11:06:07 np0005476733 network[159633]: 'network-scripts' will be removed from distribution in near future.
Oct  8 11:06:07 np0005476733 network[159634]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 11:06:11 np0005476733 python3.9[159907]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:06:12 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:12 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:12 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:13 np0005476733 python3.9[160093]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:14 np0005476733 python3.9[160245]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297 name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.437315387 +0000 UTC m=+0.063868571 container create 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:06:14 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 11:06:14 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.4669] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered blocking state
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 11:06:14 np0005476733 kernel: veth0: entered allmulticast mode
Oct  8 11:06:14 np0005476733 kernel: veth0: entered promiscuous mode
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered blocking state
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered forwarding state
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.4944] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.403188467 +0000 UTC m=+0.029741741 image pull fa23f900391d6b2045198c4ce65355e00d82cd7d392391ca189cef278619240c 38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.4963] device (veth0): carrier: link connected
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.4969] device (podman0): carrier: link connected
Oct  8 11:06:14 np0005476733 systemd-udevd[160314]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:06:14 np0005476733 systemd-udevd[160311]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5367] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5375] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5388] device (podman0): Activation: starting connection 'podman0' (7c666c3d-cf4e-46be-97a3-f20912f1fdbd)
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5390] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5395] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5399] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5404] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 11:06:14 np0005476733 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5749] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5753] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.5761] device (podman0): Activation: successful, device activated.
Oct  8 11:06:14 np0005476733 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  8 11:06:14 np0005476733 systemd[1]: Started libpod-conmon-50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0.scope.
Oct  8 11:06:14 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.872353726 +0000 UTC m=+0.498907000 container init 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.886572959 +0000 UTC m=+0.513126163 container start 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.891127675 +0000 UTC m=+0.517680879 container attach 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:06:14 np0005476733 iscsid_config[160440]: iqn.1994-05.com.redhat:4634321a1aa1#015
Oct  8 11:06:14 np0005476733 systemd[1]: libpod-50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0.scope: Deactivated successfully.
Oct  8 11:06:14 np0005476733 podman[160281]: 2025-10-08 15:06:14.896183916 +0000 UTC m=+0.522737130 container died 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 11:06:14 np0005476733 kernel: veth0 (unregistering): left allmulticast mode
Oct  8 11:06:14 np0005476733 kernel: veth0 (unregistering): left promiscuous mode
Oct  8 11:06:14 np0005476733 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 11:06:14 np0005476733 NetworkManager[51699]: <info>  [1759935974.9611] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:06:15 np0005476733 systemd[1]: run-netns-netns\x2d55015723\x2d64b3\x2dd009\x2d74ea\x2d4acea2e76adb.mount: Deactivated successfully.
Oct  8 11:06:15 np0005476733 systemd[1]: var-lib-containers-storage-overlay-92d98eeb7b9c5147a7a1ba5cfc9601064a91f6e7f737ad74bd59071c9771b737-merged.mount: Deactivated successfully.
Oct  8 11:06:15 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0-userdata-shm.mount: Deactivated successfully.
Oct  8 11:06:15 np0005476733 podman[160281]: 2025-10-08 15:06:15.355424187 +0000 UTC m=+0.981977361 container remove 50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid_config, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:06:15 np0005476733 python3.9[160245]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297 /usr/sbin/iscsi-iname
Oct  8 11:06:15 np0005476733 systemd[1]: libpod-conmon-50f5da91dd5c9f316d97db6e5fa3a8cdbe4b7671d4d93d69b1d6433f029879c0.scope: Deactivated successfully.
Oct  8 11:06:15 np0005476733 python3.9[160245]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  8 11:06:16 np0005476733 python3.9[160683]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:16 np0005476733 python3.9[160806]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935975.68346-219-34319826146608/.source.iscsi _original_basename=._yho1pca follow=False checksum=ec8e1debf9ad7ded72be7b4f1d389c6a5371da53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:17 np0005476733 python3.9[160958]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:18 np0005476733 python3.9[161108]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:19 np0005476733 python3.9[161262]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:20 np0005476733 python3.9[161414]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:20 np0005476733 python3.9[161566]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:21 np0005476733 python3.9[161644]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:22 np0005476733 python3.9[161796]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:22 np0005476733 python3.9[161874]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:23 np0005476733 python3.9[162026]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:24 np0005476733 python3.9[162178]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:24 np0005476733 python3.9[162256]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:25 np0005476733 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 11:06:25 np0005476733 python3.9[162408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:25 np0005476733 python3.9[162486]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:06:26.284 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:06:26.285 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:06:26.286 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:06:26 np0005476733 python3.9[162638]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:06:26 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:26 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:26 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:27 np0005476733 python3.9[162826]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:28 np0005476733 python3.9[162904]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:28 np0005476733 python3.9[163056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:29 np0005476733 podman[163091]: 2025-10-08 15:06:29.279331782 +0000 UTC m=+0.088327432 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:06:29 np0005476733 python3.9[163152]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:30 np0005476733 python3.9[163304]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:06:30 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:30 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:30 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:30 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 11:06:30 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 11:06:30 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 11:06:30 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 11:06:31 np0005476733 podman[163401]: 2025-10-08 15:06:31.311292755 +0000 UTC m=+0.135638845 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct  8 11:06:31 np0005476733 python3.9[163526]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:32 np0005476733 python3.9[163678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:33 np0005476733 python3.9[163801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759935991.8392773-527-112024706959753/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:33 np0005476733 python3.9[163953]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:06:34 np0005476733 python3.9[164105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:35 np0005476733 python3.9[164228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759935994.176769-577-248936704638622/.source.json _original_basename=.kqt6cpsz follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:36 np0005476733 python3.9[164380]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:38 np0005476733 python3.9[164807]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  8 11:06:39 np0005476733 python3.9[164959]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:06:40 np0005476733 python3.9[165111]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 11:06:42 np0005476733 python3[165290]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:06:42 np0005476733 podman[165325]: 2025-10-08 15:06:42.5475069 +0000 UTC m=+0.048135926 container create 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:06:42 np0005476733 podman[165325]: 2025-10-08 15:06:42.525618017 +0000 UTC m=+0.026247063 image pull fa23f900391d6b2045198c4ce65355e00d82cd7d392391ca189cef278619240c 38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:06:42 np0005476733 python3[165290]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:06:43 np0005476733 python3.9[165515]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:44 np0005476733 python3.9[165669]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:44 np0005476733 python3.9[165745]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:45 np0005476733 python3.9[165896]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936004.7895792-753-153679564839568/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:46 np0005476733 python3.9[165972]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:06:46 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:46 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:46 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:47 np0005476733 python3.9[166084]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:06:47 np0005476733 systemd[1]: Reloading.
Oct  8 11:06:47 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:06:47 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:06:47 np0005476733 systemd[1]: Starting iscsid container...
Oct  8 11:06:47 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:06:47 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c76851bce8f8bd520d069ab485ea795045f038d0e9ebad2de6f44dd4b8f4f195/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  8 11:06:47 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c76851bce8f8bd520d069ab485ea795045f038d0e9ebad2de6f44dd4b8f4f195/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:06:47 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c76851bce8f8bd520d069ab485ea795045f038d0e9ebad2de6f44dd4b8f4f195/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:06:47 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435.
Oct  8 11:06:47 np0005476733 podman[166123]: 2025-10-08 15:06:47.571210808 +0000 UTC m=+0.146706258 container init 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:06:47 np0005476733 iscsid[166138]: + sudo -E kolla_set_configs
Oct  8 11:06:47 np0005476733 podman[166123]: 2025-10-08 15:06:47.601831681 +0000 UTC m=+0.177327121 container start 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:06:47 np0005476733 podman[166123]: iscsid
Oct  8 11:06:47 np0005476733 systemd[1]: Started iscsid container.
Oct  8 11:06:47 np0005476733 systemd[1]: Created slice User Slice of UID 0.
Oct  8 11:06:47 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  8 11:06:47 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  8 11:06:47 np0005476733 systemd[1]: Starting User Manager for UID 0...
Oct  8 11:06:47 np0005476733 podman[166144]: 2025-10-08 15:06:47.699212315 +0000 UTC m=+0.074881483 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 11:06:47 np0005476733 systemd[1]: 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435-19114c2fdb78132a.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:06:47 np0005476733 systemd[1]: 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435-19114c2fdb78132a.service: Failed with result 'exit-code'.
Oct  8 11:06:47 np0005476733 systemd[166159]: Queued start job for default target Main User Target.
Oct  8 11:06:47 np0005476733 systemd[166159]: Created slice User Application Slice.
Oct  8 11:06:47 np0005476733 systemd[166159]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  8 11:06:47 np0005476733 systemd[166159]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 11:06:47 np0005476733 systemd[166159]: Reached target Paths.
Oct  8 11:06:47 np0005476733 systemd[166159]: Reached target Timers.
Oct  8 11:06:47 np0005476733 systemd[166159]: Starting D-Bus User Message Bus Socket...
Oct  8 11:06:47 np0005476733 systemd[166159]: Starting Create User's Volatile Files and Directories...
Oct  8 11:06:47 np0005476733 systemd[166159]: Listening on D-Bus User Message Bus Socket.
Oct  8 11:06:47 np0005476733 systemd[166159]: Reached target Sockets.
Oct  8 11:06:47 np0005476733 systemd[166159]: Finished Create User's Volatile Files and Directories.
Oct  8 11:06:47 np0005476733 systemd[166159]: Reached target Basic System.
Oct  8 11:06:47 np0005476733 systemd[166159]: Reached target Main User Target.
Oct  8 11:06:47 np0005476733 systemd[166159]: Startup finished in 134ms.
Oct  8 11:06:47 np0005476733 systemd[1]: Started User Manager for UID 0.
Oct  8 11:06:47 np0005476733 systemd[1]: Started Session c3 of User root.
Oct  8 11:06:47 np0005476733 iscsid[166138]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:06:47 np0005476733 iscsid[166138]: INFO:__main__:Validating config file
Oct  8 11:06:47 np0005476733 iscsid[166138]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:06:47 np0005476733 iscsid[166138]: INFO:__main__:Writing out command to execute
Oct  8 11:06:47 np0005476733 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  8 11:06:47 np0005476733 iscsid[166138]: ++ cat /run_command
Oct  8 11:06:47 np0005476733 iscsid[166138]: + CMD='/usr/sbin/iscsid -f'
Oct  8 11:06:47 np0005476733 iscsid[166138]: + ARGS=
Oct  8 11:06:47 np0005476733 iscsid[166138]: + sudo kolla_copy_cacerts
Oct  8 11:06:47 np0005476733 systemd[1]: Started Session c4 of User root.
Oct  8 11:06:47 np0005476733 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  8 11:06:47 np0005476733 iscsid[166138]: + [[ ! -n '' ]]
Oct  8 11:06:47 np0005476733 iscsid[166138]: + . kolla_extend_start
Oct  8 11:06:47 np0005476733 iscsid[166138]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  8 11:06:47 np0005476733 iscsid[166138]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  8 11:06:47 np0005476733 iscsid[166138]: Running command: '/usr/sbin/iscsid -f'
Oct  8 11:06:47 np0005476733 iscsid[166138]: + umask 0022
Oct  8 11:06:47 np0005476733 iscsid[166138]: + exec /usr/sbin/iscsid -f
Oct  8 11:06:48 np0005476733 kernel: Loading iSCSI transport class v2.0-870.
Oct  8 11:06:48 np0005476733 python3.9[166343]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:06:49 np0005476733 python3.9[166495]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:50 np0005476733 python3.9[166647]: ansible-ansible.builtin.service_facts Invoked
Oct  8 11:06:50 np0005476733 network[166664]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 11:06:50 np0005476733 network[166665]: 'network-scripts' will be removed from distribution in near future.
Oct  8 11:06:50 np0005476733 network[166666]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 11:06:54 np0005476733 python3.9[166940]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 11:06:55 np0005476733 python3.9[167092]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  8 11:06:56 np0005476733 python3.9[167248]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:06:57 np0005476733 python3.9[167371]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936015.899993-901-73624490856634/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:58 np0005476733 python3.9[167523]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:06:58 np0005476733 systemd[1]: Stopping User Manager for UID 0...
Oct  8 11:06:58 np0005476733 systemd[166159]: Activating special unit Exit the Session...
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped target Main User Target.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped target Basic System.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped target Paths.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped target Sockets.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped target Timers.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 11:06:58 np0005476733 systemd[166159]: Closed D-Bus User Message Bus Socket.
Oct  8 11:06:58 np0005476733 systemd[166159]: Stopped Create User's Volatile Files and Directories.
Oct  8 11:06:58 np0005476733 systemd[166159]: Removed slice User Application Slice.
Oct  8 11:06:58 np0005476733 systemd[166159]: Reached target Shutdown.
Oct  8 11:06:58 np0005476733 systemd[166159]: Finished Exit the Session.
Oct  8 11:06:58 np0005476733 systemd[166159]: Reached target Exit the Session.
Oct  8 11:06:58 np0005476733 systemd[1]: user@0.service: Deactivated successfully.
Oct  8 11:06:58 np0005476733 systemd[1]: Stopped User Manager for UID 0.
Oct  8 11:06:58 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  8 11:06:58 np0005476733 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  8 11:06:58 np0005476733 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  8 11:06:58 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  8 11:06:58 np0005476733 systemd[1]: Removed slice User Slice of UID 0.
Oct  8 11:06:58 np0005476733 python3.9[167678]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:06:58 np0005476733 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  8 11:06:58 np0005476733 systemd[1]: Stopped Load Kernel Modules.
Oct  8 11:06:58 np0005476733 systemd[1]: Stopping Load Kernel Modules...
Oct  8 11:06:58 np0005476733 systemd[1]: Starting Load Kernel Modules...
Oct  8 11:06:58 np0005476733 systemd[1]: Finished Load Kernel Modules.
Oct  8 11:06:59 np0005476733 podman[167806]: 2025-10-08 15:06:59.574719947 +0000 UTC m=+0.087106816 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:06:59 np0005476733 python3.9[167850]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:00 np0005476733 python3.9[168006]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:01 np0005476733 python3.9[168158]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:02 np0005476733 podman[168282]: 2025-10-08 15:07:02.063430679 +0000 UTC m=+0.153164855 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:07:02 np0005476733 python3.9[168327]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:02 np0005476733 python3.9[168462]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936021.56581-1017-274760414490696/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:03 np0005476733 python3.9[168614]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:07:04 np0005476733 python3.9[168767]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:05 np0005476733 python3.9[168919]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:06 np0005476733 python3.9[169071]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:07 np0005476733 python3.9[169223]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:07 np0005476733 python3.9[169375]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:08 np0005476733 python3.9[169527]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:09 np0005476733 python3.9[169679]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:09 np0005476733 python3.9[169831]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:10 np0005476733 python3.9[169985]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:11 np0005476733 python3.9[170137]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:12 np0005476733 python3.9[170289]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:12 np0005476733 python3.9[170367]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:13 np0005476733 python3.9[170519]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:13 np0005476733 python3.9[170597]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:14 np0005476733 python3.9[170749]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:15 np0005476733 python3.9[170901]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:15 np0005476733 python3.9[170979]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:16 np0005476733 python3.9[171131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:17 np0005476733 python3.9[171209]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:17 np0005476733 python3.9[171361]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:07:17 np0005476733 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  8 11:07:17 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:18 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:18 np0005476733 podman[171364]: 2025-10-08 15:07:18.118268708 +0000 UTC m=+0.113595015 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:07:18 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:19 np0005476733 python3.9[171572]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:19 np0005476733 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  8 11:07:19 np0005476733 python3.9[171651]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:20 np0005476733 python3.9[171803]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:20 np0005476733 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  8 11:07:21 np0005476733 python3.9[171882]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:21 np0005476733 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  8 11:07:21 np0005476733 python3.9[172034]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:07:21 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:22 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:22 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:22 np0005476733 systemd[1]: Starting Create netns directory...
Oct  8 11:07:22 np0005476733 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 11:07:22 np0005476733 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 11:07:22 np0005476733 systemd[1]: Finished Create netns directory.
Oct  8 11:07:23 np0005476733 python3.9[172228]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:24 np0005476733 python3.9[172380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:24 np0005476733 python3.9[172503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936043.5159204-1431-249245477658037/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:25 np0005476733 python3.9[172655]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:07:26.285 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:07:26.286 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:07:26.287 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:07:26 np0005476733 python3.9[172807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:26 np0005476733 python3.9[172930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936045.8893216-1481-120792254299339/.source.json _original_basename=.3sfk5q10 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:27 np0005476733 python3.9[173082]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:29 np0005476733 podman[173481]: 2025-10-08 15:07:29.753428719 +0000 UTC m=+0.076739633 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:07:29 np0005476733 python3.9[173527]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  8 11:07:30 np0005476733 python3.9[173682]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:07:31 np0005476733 python3.9[173834]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 11:07:32 np0005476733 podman[173885]: 2025-10-08 15:07:32.317500858 +0000 UTC m=+0.138598007 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:07:32 np0005476733 python3[174038]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:07:33 np0005476733 podman[174073]: 2025-10-08 15:07:33.15688616 +0000 UTC m=+0.064207181 container create 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:07:33 np0005476733 podman[174073]: 2025-10-08 15:07:33.121250887 +0000 UTC m=+0.028571968 image pull 4e93051232e4641dc0eac7573570c6d1a852f96d2f3e786f5899bf2d007a52e4 38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:07:33 np0005476733 python3[174038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:07:34 np0005476733 python3.9[174264]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:34 np0005476733 python3.9[174418]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:35 np0005476733 python3.9[174494]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:36 np0005476733 python3.9[174645]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936055.5129552-1657-47672107332854/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:36 np0005476733 python3.9[174721]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:07:36 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:36 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:36 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:37 np0005476733 python3.9[174832]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:07:37 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:37 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:37 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:38 np0005476733 systemd[1]: Starting multipathd container...
Oct  8 11:07:38 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:07:38 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14258f4a6fb6238d9e1a7e2f41b50d70728c615983ef57725ea11e62748986f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 11:07:38 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14258f4a6fb6238d9e1a7e2f41b50d70728c615983ef57725ea11e62748986f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:07:38 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.
Oct  8 11:07:38 np0005476733 podman[174871]: 2025-10-08 15:07:38.236971139 +0000 UTC m=+0.155966605 container init 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  8 11:07:38 np0005476733 multipathd[174885]: + sudo -E kolla_set_configs
Oct  8 11:07:38 np0005476733 podman[174871]: 2025-10-08 15:07:38.265045879 +0000 UTC m=+0.184041325 container start 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:07:38 np0005476733 podman[174871]: multipathd
Oct  8 11:07:38 np0005476733 systemd[1]: Started multipathd container.
Oct  8 11:07:38 np0005476733 multipathd[174885]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:07:38 np0005476733 multipathd[174885]: INFO:__main__:Validating config file
Oct  8 11:07:38 np0005476733 multipathd[174885]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:07:38 np0005476733 multipathd[174885]: INFO:__main__:Writing out command to execute
Oct  8 11:07:38 np0005476733 multipathd[174885]: ++ cat /run_command
Oct  8 11:07:38 np0005476733 multipathd[174885]: + CMD='/usr/sbin/multipathd -d'
Oct  8 11:07:38 np0005476733 multipathd[174885]: + ARGS=
Oct  8 11:07:38 np0005476733 multipathd[174885]: + sudo kolla_copy_cacerts
Oct  8 11:07:38 np0005476733 multipathd[174885]: + [[ ! -n '' ]]
Oct  8 11:07:38 np0005476733 multipathd[174885]: + . kolla_extend_start
Oct  8 11:07:38 np0005476733 multipathd[174885]: Running command: '/usr/sbin/multipathd -d'
Oct  8 11:07:38 np0005476733 multipathd[174885]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  8 11:07:38 np0005476733 multipathd[174885]: + umask 0022
Oct  8 11:07:38 np0005476733 multipathd[174885]: + exec /usr/sbin/multipathd -d
Oct  8 11:07:38 np0005476733 multipathd[174885]: 2962.109806 | --------start up--------
Oct  8 11:07:38 np0005476733 multipathd[174885]: 2962.109824 | read /etc/multipath.conf
Oct  8 11:07:38 np0005476733 podman[174894]: 2025-10-08 15:07:38.394209634 +0000 UTC m=+0.116342005 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd)
Oct  8 11:07:38 np0005476733 multipathd[174885]: 2962.121390 | path checkers start up
Oct  8 11:07:38 np0005476733 systemd[1]: 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-26bf71b5fd55de1a.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:07:38 np0005476733 systemd[1]: 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-26bf71b5fd55de1a.service: Failed with result 'exit-code'.
Oct  8 11:07:39 np0005476733 python3.9[175076]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:07:40 np0005476733 python3.9[175230]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:07:41 np0005476733 python3.9[175395]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:07:41 np0005476733 systemd[1]: Stopping multipathd container...
Oct  8 11:07:41 np0005476733 multipathd[174885]: 2965.133012 | exit (signal)
Oct  8 11:07:41 np0005476733 multipathd[174885]: 2965.133196 | --------shut down-------
Oct  8 11:07:41 np0005476733 systemd[1]: libpod-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.scope: Deactivated successfully.
Oct  8 11:07:41 np0005476733 podman[175399]: 2025-10-08 15:07:41.4418925 +0000 UTC m=+0.070342488 container died 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:07:41 np0005476733 systemd[1]: 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-26bf71b5fd55de1a.timer: Deactivated successfully.
Oct  8 11:07:41 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.
Oct  8 11:07:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-userdata-shm.mount: Deactivated successfully.
Oct  8 11:07:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay-14258f4a6fb6238d9e1a7e2f41b50d70728c615983ef57725ea11e62748986f1-merged.mount: Deactivated successfully.
Oct  8 11:07:41 np0005476733 podman[175399]: 2025-10-08 15:07:41.492492421 +0000 UTC m=+0.120942409 container cleanup 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 11:07:41 np0005476733 podman[175399]: multipathd
Oct  8 11:07:41 np0005476733 podman[175428]: multipathd
Oct  8 11:07:41 np0005476733 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  8 11:07:41 np0005476733 systemd[1]: Stopped multipathd container.
Oct  8 11:07:41 np0005476733 systemd[1]: Starting multipathd container...
Oct  8 11:07:41 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:07:41 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14258f4a6fb6238d9e1a7e2f41b50d70728c615983ef57725ea11e62748986f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 11:07:41 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14258f4a6fb6238d9e1a7e2f41b50d70728c615983ef57725ea11e62748986f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:07:41 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.
Oct  8 11:07:41 np0005476733 podman[175440]: 2025-10-08 15:07:41.723946363 +0000 UTC m=+0.121326523 container init 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:07:41 np0005476733 multipathd[175456]: + sudo -E kolla_set_configs
Oct  8 11:07:41 np0005476733 podman[175440]: 2025-10-08 15:07:41.755962733 +0000 UTC m=+0.153342893 container start 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  8 11:07:41 np0005476733 podman[175440]: multipathd
Oct  8 11:07:41 np0005476733 systemd[1]: Started multipathd container.
Oct  8 11:07:41 np0005476733 multipathd[175456]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:07:41 np0005476733 multipathd[175456]: INFO:__main__:Validating config file
Oct  8 11:07:41 np0005476733 multipathd[175456]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:07:41 np0005476733 multipathd[175456]: INFO:__main__:Writing out command to execute
Oct  8 11:07:41 np0005476733 multipathd[175456]: ++ cat /run_command
Oct  8 11:07:41 np0005476733 multipathd[175456]: + CMD='/usr/sbin/multipathd -d'
Oct  8 11:07:41 np0005476733 multipathd[175456]: + ARGS=
Oct  8 11:07:41 np0005476733 multipathd[175456]: + sudo kolla_copy_cacerts
Oct  8 11:07:41 np0005476733 multipathd[175456]: + [[ ! -n '' ]]
Oct  8 11:07:41 np0005476733 multipathd[175456]: + . kolla_extend_start
Oct  8 11:07:41 np0005476733 multipathd[175456]: Running command: '/usr/sbin/multipathd -d'
Oct  8 11:07:41 np0005476733 multipathd[175456]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  8 11:07:41 np0005476733 multipathd[175456]: + umask 0022
Oct  8 11:07:41 np0005476733 multipathd[175456]: + exec /usr/sbin/multipathd -d
Oct  8 11:07:41 np0005476733 multipathd[175456]: 2965.599723 | --------start up--------
Oct  8 11:07:41 np0005476733 multipathd[175456]: 2965.599738 | read /etc/multipath.conf
Oct  8 11:07:41 np0005476733 multipathd[175456]: 2965.606687 | path checkers start up
Oct  8 11:07:41 np0005476733 podman[175463]: 2025-10-08 15:07:41.884977586 +0000 UTC m=+0.116095147 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:07:41 np0005476733 systemd[1]: 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-720465d737f0d8ff.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:07:41 np0005476733 systemd[1]: 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031-720465d737f0d8ff.service: Failed with result 'exit-code'.
Oct  8 11:07:42 np0005476733 python3.9[175647]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:43 np0005476733 python3.9[175799]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 11:07:44 np0005476733 python3.9[175951]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  8 11:07:44 np0005476733 kernel: Key type psk registered
Oct  8 11:07:45 np0005476733 python3.9[176112]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:07:45 np0005476733 python3.9[176235]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936064.5838497-1817-181729459036000/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:46 np0005476733 python3.9[176387]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:47 np0005476733 python3.9[176539]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:07:47 np0005476733 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  8 11:07:47 np0005476733 systemd[1]: Stopped Load Kernel Modules.
Oct  8 11:07:47 np0005476733 systemd[1]: Stopping Load Kernel Modules...
Oct  8 11:07:47 np0005476733 systemd[1]: Starting Load Kernel Modules...
Oct  8 11:07:47 np0005476733 systemd[1]: Finished Load Kernel Modules.
Oct  8 11:07:48 np0005476733 python3.9[176695]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 11:07:49 np0005476733 podman[176751]: 2025-10-08 15:07:49.176860346 +0000 UTC m=+0.086604282 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:07:49 np0005476733 python3.9[176799]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 11:07:56 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:56 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:56 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:56 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:56 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:56 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:56 np0005476733 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  8 11:07:56 np0005476733 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  8 11:07:57 np0005476733 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 11:07:57 np0005476733 systemd[1]: Starting man-db-cache-update.service...
Oct  8 11:07:57 np0005476733 systemd[1]: Reloading.
Oct  8 11:07:57 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:07:57 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:07:57 np0005476733 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 11:07:59 np0005476733 python3.9[178252]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:07:59 np0005476733 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 11:07:59 np0005476733 systemd[1]: Finished man-db-cache-update.service.
Oct  8 11:07:59 np0005476733 systemd[1]: man-db-cache-update.service: Consumed 1.760s CPU time.
Oct  8 11:07:59 np0005476733 systemd[1]: run-r2d2daa5b588b4019a98e4afe120c4695.service: Deactivated successfully.
Oct  8 11:07:59 np0005476733 python3.9[178403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:08:00 np0005476733 podman[178408]: 2025-10-08 15:08:00.248072244 +0000 UTC m=+0.065425023 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:08:01 np0005476733 python3.9[178579]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:02 np0005476733 python3.9[178731]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:08:02 np0005476733 systemd[1]: Reloading.
Oct  8 11:08:02 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:08:02 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:08:02 np0005476733 podman[178733]: 2025-10-08 15:08:02.605802747 +0000 UTC m=+0.099925215 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  8 11:08:03 np0005476733 python3.9[178942]: ansible-ansible.builtin.service_facts Invoked
Oct  8 11:08:03 np0005476733 network[178959]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 11:08:03 np0005476733 network[178960]: 'network-scripts' will be removed from distribution in near future.
Oct  8 11:08:03 np0005476733 network[178961]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 11:08:08 np0005476733 python3.9[179238]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:09 np0005476733 python3.9[179391]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:10 np0005476733 python3.9[179544]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:11 np0005476733 python3.9[179697]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:12 np0005476733 podman[179822]: 2025-10-08 15:08:12.031070556 +0000 UTC m=+0.100757436 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:08:12 np0005476733 python3.9[179865]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:13 np0005476733 python3.9[180023]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:14 np0005476733 python3.9[180176]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:14 np0005476733 python3.9[180329]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:08:15 np0005476733 python3.9[180482]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:16 np0005476733 python3.9[180634]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:17 np0005476733 python3.9[180786]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:18 np0005476733 python3.9[180938]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:18 np0005476733 python3.9[181090]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:19 np0005476733 python3.9[181242]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:19 np0005476733 podman[181366]: 2025-10-08 15:08:19.822133691 +0000 UTC m=+0.052029667 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 11:08:20 np0005476733 python3.9[181413]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:20 np0005476733 python3.9[181566]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:21 np0005476733 python3.9[181718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:21 np0005476733 python3.9[181870]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:22 np0005476733 python3.9[182022]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:23 np0005476733 python3.9[182174]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:23 np0005476733 python3.9[182326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:24 np0005476733 python3.9[182478]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:25 np0005476733 python3.9[182630]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:25 np0005476733 python3.9[182782]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:08:26.287 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:08:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:08:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:08:26 np0005476733 python3.9[182934]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:27 np0005476733 python3.9[183086]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 11:08:28 np0005476733 python3.9[183238]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:08:28 np0005476733 systemd[1]: Reloading.
Oct  8 11:08:28 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:08:28 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:08:29 np0005476733 python3.9[183425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:31 np0005476733 podman[183550]: 2025-10-08 15:08:31.20387237 +0000 UTC m=+0.070771601 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:08:31 np0005476733 python3.9[183594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:32 np0005476733 python3.9[183749]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:32 np0005476733 python3.9[183902]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:33 np0005476733 podman[183904]: 2025-10-08 15:08:33.141421953 +0000 UTC m=+0.138885447 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:08:34 np0005476733 python3.9[184081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:34 np0005476733 python3.9[184235]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:35 np0005476733 python3.9[184388]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:36 np0005476733 python3.9[184541]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:08:38 np0005476733 python3.9[184694]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:38 np0005476733 python3.9[184846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:39 np0005476733 python3.9[184998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:40 np0005476733 python3.9[185150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:41 np0005476733 python3.9[185302]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:41 np0005476733 python3.9[185454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:42 np0005476733 podman[185571]: 2025-10-08 15:08:42.265503062 +0000 UTC m=+0.090428200 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:08:42 np0005476733 python3.9[185625]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:43 np0005476733 python3.9[185778]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:43 np0005476733 python3.9[185930]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:44 np0005476733 python3.9[186082]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:45 np0005476733 python3.9[186234]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:46 np0005476733 python3.9[186386]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:50 np0005476733 podman[186411]: 2025-10-08 15:08:50.229153018 +0000 UTC m=+0.057363220 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:08:50 np0005476733 python3.9[186558]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  8 11:08:51 np0005476733 python3.9[186711]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 11:08:52 np0005476733 python3.9[186869]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 11:08:54 np0005476733 systemd-logind[827]: New session 28 of user zuul.
Oct  8 11:08:54 np0005476733 systemd[1]: Started Session 28 of User zuul.
Oct  8 11:08:54 np0005476733 systemd[1]: session-28.scope: Deactivated successfully.
Oct  8 11:08:54 np0005476733 systemd-logind[827]: Session 28 logged out. Waiting for processes to exit.
Oct  8 11:08:54 np0005476733 systemd-logind[827]: Removed session 28.
Oct  8 11:08:54 np0005476733 python3.9[187055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:08:55 np0005476733 python3.9[187176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936134.5058837-2935-44115052006710/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:56 np0005476733 python3.9[187326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:08:56 np0005476733 python3.9[187402]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:57 np0005476733 python3.9[187552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:08:58 np0005476733 python3.9[187673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936137.060341-2935-136006703639807/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:08:58 np0005476733 python3.9[187823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:08:59 np0005476733 python3.9[187944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936138.3205326-2935-91986837427916/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:00 np0005476733 python3.9[188094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:00 np0005476733 python3.9[188215]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936139.576326-2935-90374467086924/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:01 np0005476733 podman[188339]: 2025-10-08 15:09:01.476671672 +0000 UTC m=+0.091160172 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 11:09:01 np0005476733 python3.9[188388]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:02 np0005476733 python3.9[188540]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:03 np0005476733 python3.9[188692]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:03 np0005476733 podman[188816]: 2025-10-08 15:09:03.744231354 +0000 UTC m=+0.116914478 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:09:03 np0005476733 python3.9[188862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:04 np0005476733 python3.9[188992]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759936143.350459-3120-167618000629138/.source _original_basename=.wq1ny3mi follow=False checksum=fef71a0169b08c9ef76ab05ebe470c90c1f49369 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  8 11:09:05 np0005476733 python3.9[189144]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:06 np0005476733 python3.9[189296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:07 np0005476733 python3.9[189417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936145.870398-3172-223823947597319/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=00a758acd58b5e8ca5cd337b0d8cb96111dc331d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:07 np0005476733 python3.9[189567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:08 np0005476733 python3.9[189688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936147.2629793-3202-135253159359043/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=f190cbefa653e127f1a1f2e3167429b2a68e2a1b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:09 np0005476733 python3.9[189840]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  8 11:09:10 np0005476733 python3.9[189992]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:09:11 np0005476733 python3[190144]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:09:11 np0005476733 podman[190181]: 2025-10-08 15:09:11.396247553 +0000 UTC m=+0.082624470 container create 31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:09:11 np0005476733 podman[190181]: 2025-10-08 15:09:11.342627514 +0000 UTC m=+0.029004451 image pull b762169049433908bdcf83b1787c6c97bc547a073236bc2a8f405754b3d623cc 38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:09:11 np0005476733 python3[190144]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  8 11:09:12 np0005476733 python3.9[190371]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:13 np0005476733 podman[190497]: 2025-10-08 15:09:13.270621651 +0000 UTC m=+0.104544201 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:09:13 np0005476733 python3.9[190545]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  8 11:09:14 np0005476733 python3.9[190698]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:09:15 np0005476733 python3[190850]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:09:15 np0005476733 podman[190883]: 2025-10-08 15:09:15.574037252 +0000 UTC m=+0.063128474 container create 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS)
Oct  8 11:09:15 np0005476733 podman[190883]: 2025-10-08 15:09:15.533947888 +0000 UTC m=+0.023039130 image pull b762169049433908bdcf83b1787c6c97bc547a073236bc2a8f405754b3d623cc 38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:09:15 np0005476733 python3[190850]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297 kolla_start
Oct  8 11:09:16 np0005476733 python3.9[191073]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:17 np0005476733 python3.9[191227]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:18 np0005476733 python3.9[191378]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936157.3775542-3386-181122377710944/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:18 np0005476733 python3.9[191454]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:09:18 np0005476733 systemd[1]: Reloading.
Oct  8 11:09:18 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:09:18 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:09:19 np0005476733 python3.9[191565]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:09:19 np0005476733 systemd[1]: Reloading.
Oct  8 11:09:19 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:09:19 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:09:20 np0005476733 systemd[1]: Starting nova_compute container...
Oct  8 11:09:20 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:09:20 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:20 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:20 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:20 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:20 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:20 np0005476733 podman[191604]: 2025-10-08 15:09:20.179664836 +0000 UTC m=+0.113737676 container init 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 11:09:20 np0005476733 podman[191604]: 2025-10-08 15:09:20.187784406 +0000 UTC m=+0.121857216 container start 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=nova_compute)
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + sudo -E kolla_set_configs
Oct  8 11:09:20 np0005476733 podman[191604]: nova_compute
Oct  8 11:09:20 np0005476733 systemd[1]: Started nova_compute container.
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Validating config file
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying service configuration files
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Deleting /etc/ceph
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Creating directory /etc/ceph
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /etc/ceph
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Writing out command to execute
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:20 np0005476733 nova_compute[191620]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 11:09:20 np0005476733 nova_compute[191620]: ++ cat /run_command
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + CMD=nova-compute
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + ARGS=
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + sudo kolla_copy_cacerts
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + [[ ! -n '' ]]
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + . kolla_extend_start
Oct  8 11:09:20 np0005476733 nova_compute[191620]: Running command: 'nova-compute'
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + echo 'Running command: '\''nova-compute'\'''
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + umask 0022
Oct  8 11:09:20 np0005476733 nova_compute[191620]: + exec nova-compute
Oct  8 11:09:20 np0005476733 podman[191628]: 2025-10-08 15:09:20.365970528 +0000 UTC m=+0.096379421 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:09:21 np0005476733 python3.9[191802]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.340 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.340 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.340 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.341 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.483 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:09:22 np0005476733 nova_compute[191620]: 2025-10-08 15:09:22.508 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:09:22 np0005476733 python3.9[191954]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.195 2 INFO nova.virt.driver [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.316 2 INFO nova.compute.provider_config [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.330 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.331 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.331 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.331 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.331 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.332 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.332 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.332 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.332 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.332 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.333 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.334 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.335 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.335 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.335 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.335 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.335 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.336 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.336 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.336 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.336 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.336 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.337 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.337 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.338 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.338 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.338 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.338 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.338 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.339 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.339 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.339 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.339 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.340 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.341 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.342 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.343 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.344 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.344 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.344 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.344 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.344 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.345 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.346 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.347 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.347 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.347 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.347 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.347 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.348 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.349 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.350 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.351 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.352 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.352 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.352 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.352 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.352 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.353 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.353 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.353 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.353 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.353 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.354 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.355 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.356 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.357 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.358 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.358 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.358 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.358 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.358 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.359 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.360 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.361 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.362 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.363 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.363 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.363 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.363 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.363 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.364 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.365 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.366 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.367 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.368 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.369 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.370 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.371 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.372 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.373 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.374 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.375 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.376 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.377 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.378 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.379 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.379 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.379 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.379 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.379 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.380 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.380 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.380 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.380 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.380 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.381 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.381 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.381 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.381 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.381 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.382 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.382 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.382 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.382 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.382 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.383 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.384 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 python3.9[192106]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.385 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.386 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.387 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.388 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.389 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.389 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.389 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.389 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.389 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.390 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.390 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.390 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.390 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.390 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.391 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.391 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.391 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.391 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.392 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.392 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.392 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.392 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.392 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.393 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.393 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.393 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.393 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.393 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.394 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.395 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.395 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.395 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.395 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.396 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.397 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.398 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.399 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.400 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.401 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.402 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.403 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.404 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.405 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.405 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.405 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.405 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.405 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.406 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.407 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.408 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.409 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.409 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.409 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.409 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.409 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.410 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.410 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.410 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.410 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.410 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.411 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.411 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.411 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.411 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.411 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.412 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 WARNING oslo_config.cfg [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  8 11:09:23 np0005476733 nova_compute[191620]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  8 11:09:23 np0005476733 nova_compute[191620]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  8 11:09:23 np0005476733 nova_compute[191620]: and ``live_migration_inbound_addr`` respectively.
Oct  8 11:09:23 np0005476733 nova_compute[191620]: ).  Its value may be silently ignored in the future.#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.413 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.414 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.415 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.416 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.417 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.417 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.417 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.417 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.417 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.418 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.418 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.418 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.418 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.418 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.419 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.419 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.419 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.419 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.419 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.420 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.421 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.422 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.422 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.422 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.422 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.422 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.423 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.423 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.423 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.423 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.423 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.424 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.425 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.426 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.427 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.427 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.427 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.427 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.427 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.428 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.428 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.428 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.428 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.428 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.429 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.429 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.429 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.429 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.429 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.430 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.430 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.430 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.430 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.430 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.431 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.432 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.433 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.434 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.434 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.434 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.434 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.434 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.435 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.435 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.435 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.435 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.435 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.436 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.436 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.436 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.436 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.436 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.437 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.437 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.437 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.437 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.437 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.438 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.438 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.438 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.438 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.438 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.439 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.439 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.439 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.439 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.439 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.440 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.441 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.442 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.442 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.442 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.442 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.442 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.443 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.444 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.445 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.445 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.445 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.445 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.445 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.446 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.447 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.448 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.448 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.448 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.448 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.448 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.449 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.450 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.451 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.452 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.453 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.454 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.455 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.455 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.455 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.455 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.456 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.456 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.456 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.456 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.456 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.457 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.457 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.457 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.457 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.458 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.458 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.458 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.458 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.459 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.459 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.459 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.459 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
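The `vnc.*` values logged above would correspond to a `nova.conf` `[vnc]` section along these lines. This is a reconstruction from the logged effective values only, not the actual file on the host; options left at their defaults (the `vencrypt_*` settings, proxy host/port) are omitted, as a deployed config typically would:

```ini
[vnc]
enabled = True
auth_schemes = none
novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html
server_listen = ::0
server_proxyclient_address = 192.168.122.101
```

Note that `log_opt_values` prints the merged result of all config sources (files, defaults, overrides), so a dump like this cannot distinguish an explicitly-set default from an untouched one.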
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.460 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.460 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.460 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.460 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.460 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.461 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.461 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.461 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.461 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.462 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.462 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.462 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.462 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.462 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.463 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.463 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.463 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.463 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.463 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.464 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.464 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.464 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.464 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.464 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.465 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.465 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.465 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.465 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.465 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.466 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.466 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.466 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.466 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.466 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.467 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.467 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.467 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.467 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.468 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.468 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.468 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.468 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.468 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.469 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.469 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.469 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.469 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.469 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.470 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.471 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.472 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.472 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.472 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.472 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.472 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.473 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.474 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.474 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.474 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.474 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.474 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.475 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.475 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.475 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.475 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.475 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.476 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.476 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.476 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.476 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.476 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.477 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.477 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.477 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.477 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.478 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.479 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.480 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.480 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.480 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.480 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.480 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.481 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.482 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.482 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.482 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.482 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.482 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.483 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.483 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.483 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.483 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.483 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.484 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.484 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.484 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.484 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.484 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.485 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.485 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.485 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.485 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.485 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.486 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.486 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.486 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.486 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.486 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.487 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.488 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.488 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.488 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.488 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.488 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.489 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.490 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.490 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.490 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.490 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.490 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.491 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.492 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.492 2 DEBUG oslo_service.service [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.493 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.549 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.550 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.550 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.550 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  8 11:09:23 np0005476733 systemd[1]: Starting libvirt QEMU daemon...
Oct  8 11:09:23 np0005476733 systemd[1]: Started libvirt QEMU daemon.
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.625 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7eff7a8981f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.628 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7eff7a8981f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.630 2 INFO nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.676 2 WARNING nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  8 11:09:23 np0005476733 nova_compute[191620]: 2025-10-08 15:09:23.677 2 DEBUG nova.virt.libvirt.volume.mount [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  8 11:09:24 np0005476733 python3.9[192310]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 11:09:24 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.477 2 INFO nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Libvirt host capabilities <capabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <host>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <uuid>e18df060-3a53-4792-ac3f-8aebcc82fccc</uuid>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <arch>x86_64</arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model>EPYC-Rome-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <vendor>AMD</vendor>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <microcode version='16777317'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <signature family='23' model='49' stepping='0'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='x2apic'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='tsc-deadline'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='osxsave'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='hypervisor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='tsc_adjust'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='spec-ctrl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='stibp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='arch-capabilities'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='cmp_legacy'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='topoext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='virt-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='lbrv'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='tsc-scale'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='vmcb-clean'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='pause-filter'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='pfthreshold'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='svme-addr-chk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='rdctl-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='skip-l1dfl-vmentry'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='mds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature name='pschange-mc-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <pages unit='KiB' size='4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <pages unit='KiB' size='2048'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <pages unit='KiB' size='1048576'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <power_management>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <suspend_mem/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <suspend_disk/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <suspend_hybrid/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </power_management>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <iommu support='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <migration_features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <live/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <uri_transports>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <uri_transport>tcp</uri_transport>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <uri_transport>rdma</uri_transport>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </uri_transports>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </migration_features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <topology>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <cells num='1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <cell id='0'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <memory unit='KiB'>16109340</memory>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <pages unit='KiB' size='4'>4027335</pages>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <pages unit='KiB' size='2048'>0</pages>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <distances>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <sibling id='0' value='10'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          </distances>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          <cpus num='8'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:          </cpus>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        </cell>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </cells>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </topology>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <cache>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </cache>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <secmodel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model>selinux</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <doi>0</doi>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </secmodel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <secmodel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model>dac</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <doi>0</doi>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </secmodel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </host>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <guest>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <os_type>hvm</os_type>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <arch name='i686'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <wordsize>32</wordsize>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <domain type='qemu'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <domain type='kvm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <pae/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <nonpae/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <acpi default='on' toggle='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <apic default='on' toggle='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <cpuselection/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <deviceboot/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <disksnapshot default='on' toggle='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <externalSnapshot/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </guest>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <guest>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <os_type>hvm</os_type>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <arch name='x86_64'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <wordsize>64</wordsize>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <domain type='qemu'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <domain type='kvm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <acpi default='on' toggle='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <apic default='on' toggle='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <cpuselection/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <deviceboot/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <disksnapshot default='on' toggle='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <externalSnapshot/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </guest>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 
Oct  8 11:09:24 np0005476733 nova_compute[191620]: </capabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.486 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.510 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  8 11:09:24 np0005476733 nova_compute[191620]: <domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <domain>kvm</domain>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <arch>i686</arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <vcpu max='240'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <iothreads supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <os supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='firmware'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <loader supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>rom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pflash</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='readonly'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>yes</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='secure'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </loader>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </os>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='maximumMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <vendor>AMD</vendor>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='succor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='custom' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-128'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-256'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-512'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <memoryBacking supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='sourceType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>file</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>anonymous</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>memfd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </memoryBacking>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <disk supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='diskDevice'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>disk</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cdrom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>floppy</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>lun</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ide</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>fdc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>sata</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </disk>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <graphics supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vnc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egl-headless</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>dbus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </graphics>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <video supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='modelType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vga</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cirrus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>none</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>bochs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ramfb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </video>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hostdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='mode'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>subsystem</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='startupPolicy'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>mandatory</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>requisite</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>optional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='subsysType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pci</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='capsType'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='pciBackend'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hostdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <rng supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>random</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </rng>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <filesystem supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='driverType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>path</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>handle</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtiofs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </filesystem>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <tpm supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-tis</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-crb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emulator</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>external</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendVersion'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>2.0</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </tpm>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <redirdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </redirdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <channel supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pty</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>unix</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </channel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <crypto supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>qemu</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </crypto>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <interface supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>passt</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </interface>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <panic supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>isa</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>hyperv</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </panic>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <gic supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <genid supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backup supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <async-teardown supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <ps2 supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sev supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sgx supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hyperv supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='features'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>relaxed</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vapic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>spinlocks</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vpindex</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>runtime</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>synic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>stimer</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reset</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vendor_id</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>frequencies</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reenlightenment</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tlbflush</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ipi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>avic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emsr_bitmap</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>xmm_input</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hyperv>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <launchSecurity supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: </domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.517 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  8 11:09:24 np0005476733 nova_compute[191620]: <domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <domain>kvm</domain>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <arch>i686</arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <vcpu max='4096'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <iothreads supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <os supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='firmware'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <loader supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>rom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pflash</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='readonly'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>yes</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='secure'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </loader>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </os>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='maximumMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <vendor>AMD</vendor>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='succor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='custom' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-128'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-256'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-512'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <memoryBacking supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='sourceType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>file</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>anonymous</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>memfd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </memoryBacking>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <disk supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='diskDevice'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>disk</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cdrom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>floppy</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>lun</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>fdc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>sata</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </disk>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <graphics supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vnc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egl-headless</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>dbus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </graphics>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <video supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='modelType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vga</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cirrus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>none</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>bochs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ramfb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </video>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hostdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='mode'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>subsystem</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='startupPolicy'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>mandatory</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>requisite</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>optional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='subsysType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pci</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='capsType'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='pciBackend'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hostdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <rng supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>random</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </rng>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <filesystem supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='driverType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>path</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>handle</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtiofs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </filesystem>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <tpm supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-tis</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-crb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emulator</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>external</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendVersion'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>2.0</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </tpm>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <redirdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </redirdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <channel supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pty</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>unix</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </channel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <crypto supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>qemu</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </crypto>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <interface supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>passt</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </interface>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <panic supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>isa</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>hyperv</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </panic>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <gic supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <genid supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backup supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <async-teardown supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <ps2 supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sev supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sgx supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hyperv supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='features'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>relaxed</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vapic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>spinlocks</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vpindex</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>runtime</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>synic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>stimer</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reset</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vendor_id</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>frequencies</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reenlightenment</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tlbflush</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ipi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>avic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emsr_bitmap</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>xmm_input</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hyperv>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <launchSecurity supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: </domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.562 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.567 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  8 11:09:24 np0005476733 nova_compute[191620]: <domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <domain>kvm</domain>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <arch>x86_64</arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <vcpu max='240'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <iothreads supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <os supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='firmware'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <loader supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>rom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pflash</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='readonly'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>yes</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='secure'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </loader>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </os>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='maximumMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <vendor>AMD</vendor>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='succor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='custom' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-128'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-256'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-512'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <memoryBacking supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='sourceType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>file</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>anonymous</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>memfd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </memoryBacking>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <disk supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='diskDevice'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>disk</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cdrom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>floppy</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>lun</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ide</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>fdc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>sata</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </disk>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <graphics supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vnc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egl-headless</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>dbus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </graphics>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <video supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='modelType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vga</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cirrus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>none</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>bochs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ramfb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </video>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hostdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='mode'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>subsystem</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='startupPolicy'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>mandatory</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>requisite</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>optional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='subsysType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pci</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='capsType'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='pciBackend'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hostdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <rng supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>random</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </rng>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <filesystem supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='driverType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>path</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>handle</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtiofs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </filesystem>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <tpm supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-tis</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-crb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emulator</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>external</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendVersion'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>2.0</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </tpm>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <redirdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </redirdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <channel supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pty</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>unix</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </channel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <crypto supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>qemu</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </crypto>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <interface supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>passt</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </interface>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <panic supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>isa</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>hyperv</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </panic>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <gic supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <genid supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backup supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <async-teardown supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <ps2 supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sev supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sgx supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hyperv supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='features'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>relaxed</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vapic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>spinlocks</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vpindex</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>runtime</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>synic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>stimer</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reset</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vendor_id</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>frequencies</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reenlightenment</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tlbflush</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ipi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>avic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emsr_bitmap</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>xmm_input</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hyperv>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <launchSecurity supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: </domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.634 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  8 11:09:24 np0005476733 nova_compute[191620]: <domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <domain>kvm</domain>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <arch>x86_64</arch>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <vcpu max='4096'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <iothreads supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <os supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='firmware'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>efi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <loader supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>rom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pflash</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='readonly'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>yes</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='secure'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>yes</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>no</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </loader>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </os>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='maximumMigratable'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>on</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>off</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <vendor>AMD</vendor>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='succor'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <mode name='custom' supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Denverton-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='auto-ibrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amd-psfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='stibp-always-on'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='EPYC-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-128'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-256'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx10-512'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='prefetchiti'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Haswell-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512er'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512pf'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fma4'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tbm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xop'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='amx-tile'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-bf16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-fp16'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bitalg'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrc'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fzrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='la57'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='taa-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xfd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ifma'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cmpccxadd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fbsdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='fsrs'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ibrs-all'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mcdt-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pbrsb-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='psdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='serialize'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vaes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='hle'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='rtm'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512bw'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512cd'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512dq'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512f'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='avx512vl'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='invpcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pcid'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='pku'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='mpx'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='core-capability'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='split-lock-detect'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='cldemote'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='erms'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='gfni'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdir64b'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='movdiri'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='xsaves'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='athlon-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='core2duo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='coreduo-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='n270-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='ss'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <blockers model='phenom-v1'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnow'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <feature name='3dnowext'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </blockers>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </mode>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </cpu>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <memoryBacking supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <enum name='sourceType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>file</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>anonymous</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <value>memfd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </memoryBacking>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <disk supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='diskDevice'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>disk</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cdrom</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>floppy</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>lun</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>fdc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>sata</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </disk>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <graphics supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vnc</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egl-headless</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>dbus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </graphics>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <video supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='modelType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vga</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>cirrus</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>none</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>bochs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ramfb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </video>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hostdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='mode'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>subsystem</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='startupPolicy'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>mandatory</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>requisite</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>optional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='subsysType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pci</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>scsi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='capsType'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='pciBackend'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hostdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <rng supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtio-non-transitional</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>random</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>egd</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </rng>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <filesystem supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='driverType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>path</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>handle</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>virtiofs</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </filesystem>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <tpm supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-tis</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tpm-crb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emulator</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>external</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendVersion'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>2.0</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </tpm>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <redirdev supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='bus'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>usb</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </redirdev>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <channel supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>pty</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>unix</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </channel>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <crypto supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='type'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>qemu</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendModel'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>builtin</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </crypto>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <interface supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='backendType'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>default</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>passt</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </interface>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <panic supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='model'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>isa</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>hyperv</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </panic>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </devices>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  <features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <gic supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <genid supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <backup supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <async-teardown supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <ps2 supported='yes'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sev supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <sgx supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <hyperv supported='yes'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      <enum name='features'>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>relaxed</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vapic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>spinlocks</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vpindex</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>runtime</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>synic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>stimer</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reset</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>vendor_id</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>frequencies</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>reenlightenment</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>tlbflush</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>ipi</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>avic</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>emsr_bitmap</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:        <value>xmm_input</value>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:      </enum>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    </hyperv>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:    <launchSecurity supported='no'/>
Oct  8 11:09:24 np0005476733 nova_compute[191620]:  </features>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: </domainCapabilities>
Oct  8 11:09:24 np0005476733 nova_compute[191620]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.692 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.692 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.693 2 DEBUG nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.693 2 INFO nova.virt.libvirt.host [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Secure Boot support detected#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.694 2 INFO nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.695 2 INFO nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.705 2 DEBUG nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.785 2 INFO nova.virt.node [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Determined node identity 94652b61-be28-442d-a9f4-cded63837444 from /var/lib/nova/compute_id#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.813 2 WARNING nova.compute.manager [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Compute nodes ['94652b61-be28-442d-a9f4-cded63837444'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.852 2 INFO nova.compute.manager [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.898 2 WARNING nova.compute.manager [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.899 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.899 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.899 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:09:24 np0005476733 nova_compute[191620]: 2025-10-08 15:09:24.899 2 DEBUG nova.compute.resource_tracker [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:09:24 np0005476733 systemd[1]: Starting libvirt nodedev daemon...
Oct  8 11:09:24 np0005476733 systemd[1]: Started libvirt nodedev daemon.
Oct  8 11:09:25 np0005476733 python3.9[192496]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.255 2 WARNING nova.virt.libvirt.driver [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.256 2 DEBUG nova.compute.resource_tracker [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14313MB free_disk=113.38857650756836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.256 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.256 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:09:25 np0005476733 systemd[1]: Stopping nova_compute container...
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.287 2 WARNING nova.compute.resource_tracker [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] No compute node record for compute-1.ctlplane.example.com:94652b61-be28-442d-a9f4-cded63837444: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 94652b61-be28-442d-a9f4-cded63837444 could not be found.#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.339 2 DEBUG oslo_concurrency.lockutils [None req-b393900e-d279-4529-8a23-823dcd0641b8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.339 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.340 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.340 2 DEBUG oslo_concurrency.lockutils [None req-7d0d2448-4442-4bca-bfcf-3544f6eb5110 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:09:25 np0005476733 nova_compute[191620]: 2025-10-08 15:09:25.343 2 INFO oslo_messaging._drivers.amqpdriver [-] No calling threads waiting for msg_id : 606068ea253343eb8ce928103f4d6c8b#033[00m
Oct  8 11:09:25 np0005476733 virtqemud[192152]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  8 11:09:25 np0005476733 virtqemud[192152]: hostname: compute-1
Oct  8 11:09:25 np0005476733 virtqemud[192152]: End of file while reading data: Input/output error
Oct  8 11:09:25 np0005476733 systemd[1]: libpod-910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709.scope: Deactivated successfully.
Oct  8 11:09:25 np0005476733 systemd[1]: libpod-910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709.scope: Consumed 3.315s CPU time.
Oct  8 11:09:25 np0005476733 podman[192522]: 2025-10-08 15:09:25.798795605 +0000 UTC m=+0.506970761 container died 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 11:09:25 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709-userdata-shm.mount: Deactivated successfully.
Oct  8 11:09:25 np0005476733 systemd[1]: var-lib-containers-storage-overlay-14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f-merged.mount: Deactivated successfully.
Oct  8 11:09:25 np0005476733 podman[192522]: 2025-10-08 15:09:25.93279419 +0000 UTC m=+0.640969386 container cleanup 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:09:25 np0005476733 podman[192522]: nova_compute
Oct  8 11:09:26 np0005476733 podman[192550]: nova_compute
Oct  8 11:09:26 np0005476733 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  8 11:09:26 np0005476733 systemd[1]: Stopped nova_compute container.
Oct  8 11:09:26 np0005476733 systemd[1]: Starting nova_compute container...
Oct  8 11:09:26 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:09:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14061a6874000439d64edae3d4adc2d905ef3fe62ab8f6c4d4a3876b3c23f60f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:26 np0005476733 podman[192564]: 2025-10-08 15:09:26.169403974 +0000 UTC m=+0.111230276 container init 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:09:26 np0005476733 podman[192564]: 2025-10-08 15:09:26.174871559 +0000 UTC m=+0.116697841 container start 910e1d8c0b2f657640c399cfa71f1bab27eaa9421889a23eb046ca97072e3709 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:09:26 np0005476733 podman[192564]: nova_compute
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + sudo -E kolla_set_configs
Oct  8 11:09:26 np0005476733 systemd[1]: Started nova_compute container.
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Validating config file
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying service configuration files
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /etc/ceph
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Creating directory /etc/ceph
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /etc/ceph
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Writing out command to execute
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:26 np0005476733 nova_compute[192580]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 11:09:26 np0005476733 nova_compute[192580]: ++ cat /run_command
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + CMD=nova-compute
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + ARGS=
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + sudo kolla_copy_cacerts
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + [[ ! -n '' ]]
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + . kolla_extend_start
Oct  8 11:09:26 np0005476733 nova_compute[192580]: Running command: 'nova-compute'
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + echo 'Running command: '\''nova-compute'\'''
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + umask 0022
Oct  8 11:09:26 np0005476733 nova_compute[192580]: + exec nova-compute
Oct  8 11:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:09:26.287 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:09:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:09:26.288 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:09:27 np0005476733 python3.9[192743]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 11:09:27 np0005476733 systemd[1]: Started libpod-conmon-31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79.scope.
Oct  8 11:09:27 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:09:27 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93ca443151e065a72a9ac8eb03a78dbb1ff022fc6439be6931b0ce787e53147/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:27 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93ca443151e065a72a9ac8eb03a78dbb1ff022fc6439be6931b0ce787e53147/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:27 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a93ca443151e065a72a9ac8eb03a78dbb1ff022fc6439be6931b0ce787e53147/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  8 11:09:27 np0005476733 podman[192769]: 2025-10-08 15:09:27.561538497 +0000 UTC m=+0.193812284 container init 31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute_init, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 11:09:27 np0005476733 podman[192769]: 2025-10-08 15:09:27.569404488 +0000 UTC m=+0.201678275 container start 31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251001, container_name=nova_compute_init, org.label-schema.license=GPLv2)
Oct  8 11:09:27 np0005476733 python3.9[192743]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Applying nova statedir ownership
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  8 11:09:27 np0005476733 nova_compute_init[192790]: INFO:nova_statedir:Nova statedir ownership complete
Oct  8 11:09:27 np0005476733 systemd[1]: libpod-31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79.scope: Deactivated successfully.
Oct  8 11:09:27 np0005476733 podman[192804]: 2025-10-08 15:09:27.673257627 +0000 UTC m=+0.028857666 container died 31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute_init, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:09:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79-userdata-shm.mount: Deactivated successfully.
Oct  8 11:09:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay-a93ca443151e065a72a9ac8eb03a78dbb1ff022fc6439be6931b0ce787e53147-merged.mount: Deactivated successfully.
Oct  8 11:09:27 np0005476733 podman[192804]: 2025-10-08 15:09:27.756311819 +0000 UTC m=+0.111911838 container cleanup 31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297, name=nova_compute_init, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-nova-compute:b78cfc68a577b1553523c8a70a34e297', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:09:27 np0005476733 systemd[1]: libpod-conmon-31e738f45454e43ba382b5f96b2b751bde4955ade11c50c4e988b9f3df101b79.scope: Deactivated successfully.
Oct  8 11:09:28 np0005476733 systemd[1]: session-26.scope: Deactivated successfully.
Oct  8 11:09:28 np0005476733 systemd[1]: session-26.scope: Consumed 2min 32.798s CPU time.
Oct  8 11:09:28 np0005476733 systemd-logind[827]: Session 26 logged out. Waiting for processes to exit.
Oct  8 11:09:28 np0005476733 systemd-logind[827]: Removed session 26.
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.381 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.382 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.382 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.382 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.559 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:09:28 np0005476733 nova_compute[192580]: 2025-10-08 15:09:28.574 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.095 2 INFO nova.virt.driver [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.230 2 INFO nova.compute.provider_config [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.243 2 DEBUG oslo_concurrency.lockutils [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.243 2 DEBUG oslo_concurrency.lockutils [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.244 2 DEBUG oslo_concurrency.lockutils [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.244 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.244 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.244 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.244 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.245 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.246 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.247 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.248 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.249 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.250 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.251 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.252 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.252 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.252 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.252 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.252 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.253 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.254 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.255 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.256 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.257 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.258 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.259 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.260 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.261 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.262 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.263 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.264 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.264 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.264 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.264 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.264 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.265 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.265 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.265 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.265 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.265 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.266 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.267 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.268 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.269 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.270 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.271 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.272 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.273 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.274 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.275 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.276 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.276 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.276 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.276 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.276 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.277 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.277 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.277 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.277 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.277 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.278 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.279 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.280 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.281 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.282 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.283 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.283 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.283 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.283 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.283 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.284 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.285 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.286 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.287 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.288 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.289 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.290 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.291 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.292 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.293 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.294 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.295 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.296 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.297 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.298 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.299 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.299 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.299 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.299 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.299 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.300 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.301 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.302 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.303 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.304 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.305 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.305 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.305 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.305 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.305 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.306 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.306 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.306 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.306 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.306 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.307 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.308 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.309 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.309 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.309 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.309 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.309 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.310 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.311 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.312 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.313 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.313 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.313 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.313 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.313 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.314 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.315 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.316 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.317 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.317 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.317 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.317 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.318 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.318 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.318 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.318 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.318 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.319 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.320 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.321 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 WARNING oslo_config.cfg [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  8 11:09:29 np0005476733 nova_compute[192580]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  8 11:09:29 np0005476733 nova_compute[192580]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  8 11:09:29 np0005476733 nova_compute[192580]: and ``live_migration_inbound_addr`` respectively.
Oct  8 11:09:29 np0005476733 nova_compute[192580]: ).  Its value may be silently ignored in the future.#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.322 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.323 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.324 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.325 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.326 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.327 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.328 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.329 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.330 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.331 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.332 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.333 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.334 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.335 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.336 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.337 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.338 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.339 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.340 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.341 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.342 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.343 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.344 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.344 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.344 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.344 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.344 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.345 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.346 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.347 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.348 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.349 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.350 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.351 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.352 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.353 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.354 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.355 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.355 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.355 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.355 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.355 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.356 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.357 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.358 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.359 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.359 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.359 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.359 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.359 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.360 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.361 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.361 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.361 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.361 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.361 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.362 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.362 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.362 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.362 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.363 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.363 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.363 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.363 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.364 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.364 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.364 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.364 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.364 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.365 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.366 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.367 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.367 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.367 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.367 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.367 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.368 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.369 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.370 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.371 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.372 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.372 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.372 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.372 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.372 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.373 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.374 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.375 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.376 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.376 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.376 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.376 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.376 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.377 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.378 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.379 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.380 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.381 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.382 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.383 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.384 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.385 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.386 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.387 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.388 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.389 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.389 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.389 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.389 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.389 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.390 2 DEBUG oslo_service.service [None req-25f1740f-178e-4c0f-bfc5-11469881e8eb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.391 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.405 2 INFO nova.virt.node [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Determined node identity 94652b61-be28-442d-a9f4-cded63837444 from /var/lib/nova/compute_id#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.406 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.407 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.407 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.407 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.418 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f0a716174c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.420 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f0a716174c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.421 2 INFO nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.429 2 INFO nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Libvirt host capabilities <capabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <host>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <uuid>e18df060-3a53-4792-ac3f-8aebcc82fccc</uuid>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <arch>x86_64</arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model>EPYC-Rome-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <vendor>AMD</vendor>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <microcode version='16777317'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <signature family='23' model='49' stepping='0'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='x2apic'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='tsc-deadline'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='osxsave'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='hypervisor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='tsc_adjust'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='spec-ctrl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='stibp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='arch-capabilities'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='cmp_legacy'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='topoext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='virt-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='lbrv'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='tsc-scale'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='vmcb-clean'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='pause-filter'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='pfthreshold'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='svme-addr-chk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='rdctl-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='skip-l1dfl-vmentry'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='mds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature name='pschange-mc-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <pages unit='KiB' size='4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <pages unit='KiB' size='2048'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <pages unit='KiB' size='1048576'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <power_management>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <suspend_mem/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <suspend_disk/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <suspend_hybrid/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </power_management>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <iommu support='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <migration_features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <live/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <uri_transports>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <uri_transport>tcp</uri_transport>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <uri_transport>rdma</uri_transport>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </uri_transports>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </migration_features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <topology>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <cells num='1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <cell id='0'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <memory unit='KiB'>16109340</memory>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <pages unit='KiB' size='4'>4027335</pages>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <pages unit='KiB' size='2048'>0</pages>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <distances>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <sibling id='0' value='10'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          </distances>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          <cpus num='8'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:          </cpus>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        </cell>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </cells>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </topology>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <cache>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </cache>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <secmodel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model>selinux</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <doi>0</doi>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </secmodel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <secmodel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model>dac</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <doi>0</doi>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </secmodel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </host>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <guest>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <os_type>hvm</os_type>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <arch name='i686'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <wordsize>32</wordsize>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <domain type='qemu'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <domain type='kvm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <pae/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <nonpae/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <acpi default='on' toggle='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <apic default='on' toggle='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <cpuselection/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <deviceboot/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <disksnapshot default='on' toggle='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <externalSnapshot/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </guest>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <guest>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <os_type>hvm</os_type>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <arch name='x86_64'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <wordsize>64</wordsize>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <domain type='qemu'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <domain type='kvm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <acpi default='on' toggle='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <apic default='on' toggle='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <cpuselection/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <deviceboot/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <disksnapshot default='on' toggle='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <externalSnapshot/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </guest>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 
Oct  8 11:09:29 np0005476733 nova_compute[192580]: </capabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: #033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.435 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.441 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  8 11:09:29 np0005476733 nova_compute[192580]: <domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <domain>kvm</domain>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <arch>i686</arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <vcpu max='4096'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <iothreads supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <os supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='firmware'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <loader supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>rom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pflash</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='readonly'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>yes</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='secure'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </loader>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='maximumMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <vendor>AMD</vendor>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='succor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='custom' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-128'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-256'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-512'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <memoryBacking supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='sourceType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>file</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>anonymous</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>memfd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </memoryBacking>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <disk supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='diskDevice'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>disk</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cdrom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>floppy</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>lun</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>fdc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>sata</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <graphics supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vnc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egl-headless</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>dbus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </graphics>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <video supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='modelType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vga</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cirrus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>none</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>bochs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ramfb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hostdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='mode'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>subsystem</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='startupPolicy'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>mandatory</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>requisite</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>optional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='subsysType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pci</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='capsType'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='pciBackend'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hostdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <rng supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>random</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <filesystem supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='driverType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>path</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>handle</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtiofs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </filesystem>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <tpm supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-tis</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-crb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emulator</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>external</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendVersion'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>2.0</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </tpm>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <redirdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </redirdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <channel supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pty</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>unix</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </channel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <crypto supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>qemu</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </crypto>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <interface supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>passt</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <panic supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>isa</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>hyperv</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </panic>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <gic supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <genid supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backup supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <async-teardown supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <ps2 supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sev supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sgx supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hyperv supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='features'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>relaxed</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vapic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>spinlocks</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vpindex</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>runtime</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>synic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>stimer</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reset</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vendor_id</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>frequencies</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reenlightenment</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tlbflush</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ipi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>avic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emsr_bitmap</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>xmm_input</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hyperv>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <launchSecurity supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: </domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.445 2 DEBUG nova.virt.libvirt.volume.mount [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
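The `<domainCapabilities>` document logged above pairs each non-usable custom CPU model with a `<blockers>` element listing the host features that prevent it. A minimal sketch (not Nova's actual code; in practice the XML would come from libvirt's `getDomainCapabilities()` call) of extracting that mapping with the standard library:

```python
# Parse a trimmed domainCapabilities document and map each custom CPU
# model to the feature names blocking it (empty list = usable as-is).
import xml.etree.ElementTree as ET

DOMAIN_CAPS_XML = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
      <model usable='no' vendor='Intel'>Broadwell-v1</model>
      <blockers model='Broadwell-v1'>
        <feature name='erms'/>
        <feature name='rtm'/>
      </blockers>
    </mode>
  </cpu>
</domainCapabilities>
"""

def model_blockers(caps_xml: str) -> dict:
    """Return {model_name: [blocking feature names]} for custom CPU models."""
    root = ET.fromstring(caps_xml)
    mode = root.find(".//cpu/mode[@name='custom']")
    result = {model.text: [] for model in mode.findall('model')}
    for blockers in mode.findall('blockers'):
        result[blockers.get('model')] = [
            f.get('name') for f in blockers.findall('feature')
        ]
    return result

blockers = model_blockers(DOMAIN_CAPS_XML)
print(blockers['EPYC'])          # -> []
print(blockers['Broadwell-v1'])  # -> ['erms', 'rtm']
```

This matches what the dump shows for this AMD EPYC-Rome host: Intel models such as `Broadwell-v1` are unusable because features like `erms`, `hle`, and `rtm` are absent, while the EPYC family models carry no blockers.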
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.448 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  8 11:09:29 np0005476733 nova_compute[192580]: <domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <domain>kvm</domain>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <arch>i686</arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <vcpu max='240'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <iothreads supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <os supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='firmware'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <loader supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>rom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pflash</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='readonly'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>yes</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='secure'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </loader>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='maximumMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <vendor>AMD</vendor>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='succor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='custom' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-128'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-256'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-512'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <memoryBacking supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='sourceType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>file</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>anonymous</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>memfd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </memoryBacking>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <disk supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='diskDevice'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>disk</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cdrom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>floppy</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>lun</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ide</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>fdc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>sata</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <graphics supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vnc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egl-headless</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>dbus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </graphics>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <video supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='modelType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vga</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cirrus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>none</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>bochs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ramfb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hostdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='mode'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>subsystem</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='startupPolicy'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>mandatory</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>requisite</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>optional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='subsysType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pci</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='capsType'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='pciBackend'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hostdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <rng supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>random</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <filesystem supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='driverType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>path</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>handle</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtiofs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </filesystem>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <tpm supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-tis</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-crb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emulator</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>external</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendVersion'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>2.0</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </tpm>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <redirdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </redirdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <channel supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pty</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>unix</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </channel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <crypto supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>qemu</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </crypto>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <interface supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>passt</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <panic supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>isa</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>hyperv</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </panic>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <gic supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <genid supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backup supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <async-teardown supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <ps2 supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sev supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sgx supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hyperv supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='features'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>relaxed</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vapic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>spinlocks</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vpindex</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>runtime</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>synic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>stimer</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reset</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vendor_id</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>frequencies</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reenlightenment</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tlbflush</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ipi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>avic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emsr_bitmap</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>xmm_input</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hyperv>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <launchSecurity supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: </domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.473 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.477 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  8 11:09:29 np0005476733 nova_compute[192580]: <domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <domain>kvm</domain>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <arch>x86_64</arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <vcpu max='4096'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <iothreads supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <os supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='firmware'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>efi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <loader supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>rom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pflash</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='readonly'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>yes</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='secure'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>yes</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </loader>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='maximumMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <vendor>AMD</vendor>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='succor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='custom' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-128'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-256'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-512'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <memoryBacking supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='sourceType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>file</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>anonymous</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>memfd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </memoryBacking>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <disk supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='diskDevice'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>disk</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cdrom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>floppy</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>lun</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>fdc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>sata</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <graphics supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vnc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egl-headless</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>dbus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </graphics>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <video supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='modelType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vga</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cirrus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>none</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>bochs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ramfb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hostdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='mode'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>subsystem</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='startupPolicy'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>mandatory</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>requisite</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>optional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='subsysType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pci</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='capsType'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='pciBackend'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hostdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <rng supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>random</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <filesystem supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='driverType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>path</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>handle</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtiofs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </filesystem>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <tpm supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-tis</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-crb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emulator</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>external</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendVersion'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>2.0</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </tpm>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <redirdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </redirdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <channel supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pty</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>unix</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </channel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <crypto supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>qemu</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </crypto>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <interface supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>passt</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <panic supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>isa</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>hyperv</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </panic>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <gic supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <genid supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backup supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <async-teardown supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <ps2 supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sev supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sgx supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hyperv supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='features'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>relaxed</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vapic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>spinlocks</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vpindex</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>runtime</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>synic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>stimer</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reset</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vendor_id</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>frequencies</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reenlightenment</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tlbflush</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ipi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>avic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emsr_bitmap</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>xmm_input</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hyperv>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <launchSecurity supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: </domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.537 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  8 11:09:29 np0005476733 nova_compute[192580]: <domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <domain>kvm</domain>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <arch>x86_64</arch>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <vcpu max='240'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <iothreads supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <os supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='firmware'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <loader supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>rom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pflash</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='readonly'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>yes</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='secure'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>no</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </loader>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-passthrough' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='hostPassthroughMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='maximum' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='maximumMigratable'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>on</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>off</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='host-model' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <vendor>AMD</vendor>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='x2apic'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='hypervisor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='stibp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='overflow-recov'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='succor'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lbrv'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='tsc-scale'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='flushbyasid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pause-filter'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pfthreshold'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rdctl-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='mds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='gds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='require' name='rfds-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <feature policy='disable' name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <mode name='custom' supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Broadwell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Cooperlake-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Denverton-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Dhyana-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='auto-ibrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Milan-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amd-psfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='no-nested-data-bp'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='null-sel-clr-base'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='stibp-always-on'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-Rome-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='EPYC-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='GraniteRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-128'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-256'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx10-512'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='prefetchiti'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Haswell-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v6'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Icelake-Server-v7'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='IvyBridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='KnightsMill-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4fmaps'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-4vnniw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512er'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512pf'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G4-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Opteron_G5-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fma4'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tbm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xop'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SapphireRapids-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='amx-tile'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-bf16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-fp16'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512-vpopcntdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bitalg'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vbmi2'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrc'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fzrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='la57'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='taa-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='tsx-ldtrk'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xfd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='SierraForest-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ifma'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-ne-convert'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx-vnni-int8'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='bus-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cmpccxadd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fbsdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='fsrs'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ibrs-all'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mcdt-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pbrsb-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='psdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='sbdr-ssdp-no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='serialize'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vaes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='vpclmulqdq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Client-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='hle'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='rtm'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Skylake-Server-v5'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512bw'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512cd'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512dq'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512f'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='avx512vl'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='invpcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pcid'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='pku'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='mpx'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v2'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v3'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='core-capability'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='split-lock-detect'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='Snowridge-v4'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='cldemote'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='erms'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='gfni'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdir64b'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='movdiri'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='xsaves'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='athlon-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='core2duo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='coreduo-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='n270-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='ss'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <blockers model='phenom-v1'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnow'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <feature name='3dnowext'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </blockers>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </mode>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <memoryBacking supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <enum name='sourceType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>file</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>anonymous</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <value>memfd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </memoryBacking>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <disk supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='diskDevice'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>disk</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cdrom</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>floppy</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>lun</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ide</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>fdc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>sata</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <graphics supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vnc</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egl-headless</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>dbus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </graphics>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <video supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='modelType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vga</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>cirrus</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>none</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>bochs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ramfb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hostdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='mode'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>subsystem</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='startupPolicy'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>mandatory</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>requisite</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>optional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='subsysType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pci</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>scsi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='capsType'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='pciBackend'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hostdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <rng supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtio-non-transitional</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>random</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>egd</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <filesystem supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='driverType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>path</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>handle</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>virtiofs</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </filesystem>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <tpm supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-tis</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tpm-crb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emulator</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>external</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendVersion'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>2.0</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </tpm>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <redirdev supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='bus'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>usb</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </redirdev>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <channel supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>pty</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>unix</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </channel>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <crypto supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='type'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>qemu</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendModel'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>builtin</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </crypto>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <interface supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='backendType'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>default</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>passt</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <panic supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='model'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>isa</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>hyperv</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </panic>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <gic supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <vmcoreinfo supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <genid supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backingStoreInput supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <backup supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <async-teardown supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <ps2 supported='yes'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sev supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <sgx supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <hyperv supported='yes'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      <enum name='features'>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>relaxed</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vapic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>spinlocks</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vpindex</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>runtime</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>synic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>stimer</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reset</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>vendor_id</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>frequencies</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>reenlightenment</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>tlbflush</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>ipi</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>avic</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>emsr_bitmap</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:        <value>xmm_input</value>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:      </enum>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    </hyperv>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:    <launchSecurity supported='no'/>
Oct  8 11:09:29 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: </domainCapabilities>
Oct  8 11:09:29 np0005476733 nova_compute[192580]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.600 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.601 2 INFO nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Secure Boot support detected#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.602 2 INFO nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.603 2 INFO nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.611 2 DEBUG nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.633 2 INFO nova.virt.node [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Determined node identity 94652b61-be28-442d-a9f4-cded63837444 from /var/lib/nova/compute_id#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.650 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Verified node 94652b61-be28-442d-a9f4-cded63837444 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Oct  8 11:09:29 np0005476733 nova_compute[192580]: 2025-10-08 15:09:29.672 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.103 2 ERROR nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Could not retrieve compute node resource provider 94652b61-be28-442d-a9f4-cded63837444 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 94652b61-be28-442d-a9f4-cded63837444: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '94652b61-be28-442d-a9f4-cded63837444' not found: No resource provider with uuid 94652b61-be28-442d-a9f4-cded63837444 found  ", "request_id": "req-1d71ef14-fca9-4bf1-9be9-ce3ddd2f7fac"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 94652b61-be28-442d-a9f4-cded63837444: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '94652b61-be28-442d-a9f4-cded63837444' not found: No resource provider with uuid 94652b61-be28-442d-a9f4-cded63837444 found  ", "request_id": "req-1d71ef14-fca9-4bf1-9be9-ce3ddd2f7fac"}]}#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.123 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.123 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.124 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.124 2 DEBUG nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.328 2 WARNING nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.329 2 DEBUG nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14286MB free_disk=113.38779830932617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.329 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.329 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.562 2 ERROR nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 94652b61-be28-442d-a9f4-cded63837444: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '94652b61-be28-442d-a9f4-cded63837444' not found: No resource provider with uuid 94652b61-be28-442d-a9f4-cded63837444 found  ", "request_id": "req-6705935d-69ff-48ee-9fb3-c9e179912df9"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 94652b61-be28-442d-a9f4-cded63837444: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '94652b61-be28-442d-a9f4-cded63837444' not found: No resource provider with uuid 94652b61-be28-442d-a9f4-cded63837444 found  ", "request_id": "req-6705935d-69ff-48ee-9fb3-c9e179912df9"}]}#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.563 2 DEBUG nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.563 2 DEBUG nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.647 2 INFO nova.scheduler.client.report [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [req-d8b4238d-bb63-4bd7-9db0-9b812540455f] Created resource provider record via placement API for resource provider with UUID 94652b61-be28-442d-a9f4-cded63837444 and name compute-1.ctlplane.example.com.#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.681 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  8 11:09:30 np0005476733 nova_compute[192580]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.681 2 INFO nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.682 2 DEBUG nova.compute.provider_tree [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.682 2 DEBUG nova.virt.libvirt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.750 2 DEBUG nova.scheduler.client.report [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Updated inventory for provider 94652b61-be28-442d-a9f4-cded63837444 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.750 2 DEBUG nova.compute.provider_tree [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Updating resource provider 94652b61-be28-442d-a9f4-cded63837444 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.751 2 DEBUG nova.compute.provider_tree [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.878 2 DEBUG nova.compute.provider_tree [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Updating resource provider 94652b61-be28-442d-a9f4-cded63837444 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.938 2 DEBUG nova.compute.resource_tracker [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.939 2 DEBUG oslo_concurrency.lockutils [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:09:30 np0005476733 nova_compute[192580]: 2025-10-08 15:09:30.939 2 DEBUG nova.service [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  8 11:09:31 np0005476733 nova_compute[192580]: 2025-10-08 15:09:31.019 2 DEBUG nova.service [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  8 11:09:31 np0005476733 nova_compute[192580]: 2025-10-08 15:09:31.019 2 DEBUG nova.servicegroup.drivers.db [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  8 11:09:32 np0005476733 podman[192877]: 2025-10-08 15:09:32.266297478 +0000 UTC m=+0.089619764 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:09:34 np0005476733 systemd-logind[827]: New session 29 of user zuul.
Oct  8 11:09:34 np0005476733 systemd[1]: Started Session 29 of User zuul.
Oct  8 11:09:34 np0005476733 podman[192897]: 2025-10-08 15:09:34.18382004 +0000 UTC m=+0.109126509 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:09:35 np0005476733 python3.9[193073]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 11:09:36 np0005476733 python3.9[193229]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:09:36 np0005476733 systemd[1]: Reloading.
Oct  8 11:09:37 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:09:37 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:09:38 np0005476733 python3.9[193413]: ansible-ansible.builtin.service_facts Invoked
Oct  8 11:09:38 np0005476733 network[193430]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 11:09:38 np0005476733 network[193431]: 'network-scripts' will be removed from distribution in near future.
Oct  8 11:09:38 np0005476733 network[193432]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 11:09:42 np0005476733 python3.9[193709]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:09:43 np0005476733 podman[193834]: 2025-10-08 15:09:43.785410747 +0000 UTC m=+0.116717001 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:09:44 np0005476733 python3.9[193875]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:44 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 11:09:44 np0005476733 python3.9[194035]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:45 np0005476733 python3.9[194187]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:09:46 np0005476733 python3.9[194339]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 11:09:47 np0005476733 python3.9[194491]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:09:47 np0005476733 systemd[1]: Reloading.
Oct  8 11:09:47 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:09:47 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:09:48 np0005476733 python3.9[194678]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:09:49 np0005476733 python3.9[194831]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:50 np0005476733 python3.9[194981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:09:51 np0005476733 podman[195107]: 2025-10-08 15:09:51.069152821 +0000 UTC m=+0.066126119 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:09:51 np0005476733 python3.9[195148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:51 np0005476733 python3.9[195274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936190.7390394-247-35906076049266/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:09:52 np0005476733 nova_compute[192580]: 2025-10-08 15:09:52.021 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:09:52 np0005476733 nova_compute[192580]: 2025-10-08 15:09:52.053 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:09:52 np0005476733 python3.9[195426]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  8 11:09:53 np0005476733 python3.9[195578]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  8 11:09:54 np0005476733 python3.9[195731]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 11:09:55 np0005476733 python3.9[195889]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 11:09:57 np0005476733 python3.9[196047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:57 np0005476733 auditd[706]: Audit daemon rotating log files
Oct  8 11:09:57 np0005476733 python3.9[196168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759936196.5597389-383-192452349884892/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:58 np0005476733 python3.9[196318]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:58 np0005476733 python3.9[196439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759936197.7402203-383-164211454911110/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:09:59 np0005476733 python3.9[196589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:09:59 np0005476733 python3.9[196710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759936198.9409277-383-158927331290383/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:00 np0005476733 python3.9[196860]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:10:01 np0005476733 python3.9[197012]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:10:02 np0005476733 python3.9[197164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:02 np0005476733 podman[197259]: 2025-10-08 15:10:02.516280848 +0000 UTC m=+0.083935398 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:10:02 np0005476733 python3.9[197299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936201.575213-501-103037347830296/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:03 np0005476733 python3.9[197452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:03 np0005476733 python3.9[197528]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:04 np0005476733 python3.9[197678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:04 np0005476733 podman[197773]: 2025-10-08 15:10:04.847563072 +0000 UTC m=+0.091174330 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 11:10:04 np0005476733 python3.9[197818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936203.9159274-501-183315213315207/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=31ec6ddda44bb33b6a2cbc5a2ccd63ac4f8d5ac3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:05 np0005476733 python3.9[197976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:06 np0005476733 python3.9[198097]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936205.1734288-501-252643586681544/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:06 np0005476733 python3.9[198247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:07 np0005476733 python3.9[198368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936206.3716142-501-270500603740654/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:08 np0005476733 python3.9[198518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:08 np0005476733 python3.9[198639]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936207.618267-501-188553430350515/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:09 np0005476733 python3.9[198789]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:09 np0005476733 python3.9[198910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936208.7913406-501-89376479077386/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:10 np0005476733 python3.9[199060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:10 np0005476733 python3.9[199181]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936209.9288511-501-210830108526105/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:11 np0005476733 python3.9[199331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:12 np0005476733 python3.9[199452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936211.045896-501-280264490973551/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:12 np0005476733 python3.9[199602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:13 np0005476733 python3.9[199723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936212.1871545-501-276328916273102/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:13 np0005476733 podman[199847]: 2025-10-08 15:10:13.926035697 +0000 UTC m=+0.062302615 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:10:14 np0005476733 python3.9[199884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:14 np0005476733 python3.9[200014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936213.4244058-501-112464850803976/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:15 np0005476733 python3.9[200164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:16 np0005476733 python3.9[200240]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:16 np0005476733 python3.9[200390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:17 np0005476733 python3.9[200466]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:18 np0005476733 python3.9[200616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:18 np0005476733 python3.9[200692]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:19 np0005476733 python3.9[200844]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:20 np0005476733 python3.9[200996]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:20 np0005476733 python3.9[201148]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:21 np0005476733 podman[201233]: 2025-10-08 15:10:21.236813876 +0000 UTC m=+0.062044117 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:10:21 np0005476733 python3.9[201322]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:10:21 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:21 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:21 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:21 np0005476733 systemd[1]: Listening on Podman API Socket.
Oct  8 11:10:22 np0005476733 python3.9[201513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:23 np0005476733 python3.9[201636]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936222.3575017-945-199896165895628/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:24 np0005476733 python3.9[201712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:24 np0005476733 python3.9[201835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936222.3575017-945-199896165895628/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:25 np0005476733 python3.9[201987]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  8 11:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:10:26.289 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:10:26.290 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:10:26.290 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:10:26 np0005476733 python3.9[202139]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:10:28 np0005476733 python3[202291]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:10:28 np0005476733 podman[202325]: 2025-10-08 15:10:28.312006022 +0000 UTC m=+0.045965473 container create 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 11:10:28 np0005476733 podman[202325]: 2025-10-08 15:10:28.286764934 +0000 UTC m=+0.020724395 image pull 12aa1ebee6c7bc72738c39719fe13059590ab0501a869a9a0be74a5be9846d32 38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:10:28 np0005476733 python3[202291]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z 38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297 kolla_start
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.606 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.629 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.789 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.789 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14296MB free_disk=113.38803482055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.790 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.790 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.846 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.846 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.865 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.881 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.883 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:10:28 np0005476733 nova_compute[192580]: 2025-10-08 15:10:28.884 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:10:29 np0005476733 python3.9[202515]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:10:30 np0005476733 python3.9[202669]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:30 np0005476733 python3.9[202820]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936230.0876286-1073-199916439296313/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:31 np0005476733 python3.9[202896]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:10:31 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:31 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:31 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:32 np0005476733 python3.9[203007]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:10:32 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:32 np0005476733 podman[203010]: 2025-10-08 15:10:32.632040393 +0000 UTC m=+0.064966171 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 11:10:32 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:32 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:32 np0005476733 systemd[1]: Starting ceilometer_agent_compute container...
Oct  8 11:10:33 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:33 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.
Oct  8 11:10:33 np0005476733 podman[203065]: 2025-10-08 15:10:33.068201528 +0000 UTC m=+0.167029670 container init 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + sudo -E kolla_set_configs
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: sudo: unable to send audit message: Operation not permitted
Oct  8 11:10:33 np0005476733 podman[203065]: 2025-10-08 15:10:33.108550259 +0000 UTC m=+0.207378371 container start 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:10:33 np0005476733 podman[203065]: ceilometer_agent_compute
Oct  8 11:10:33 np0005476733 systemd[1]: Started ceilometer_agent_compute container.
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Validating config file
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Copying service configuration files
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: INFO:__main__:Writing out command to execute
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: ++ cat /run_command
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + ARGS=
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + sudo kolla_copy_cacerts
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: sudo: unable to send audit message: Operation not permitted
Oct  8 11:10:33 np0005476733 podman[203087]: 2025-10-08 15:10:33.177623501 +0000 UTC m=+0.055374544 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + [[ ! -n '' ]]
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + . kolla_extend_start
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + umask 0022
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  8 11:10:33 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-5807cfc67731086a.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:10:33 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-5807cfc67731086a.service: Failed with result 'exit-code'.
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.959 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.960 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.961 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.962 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.963 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.964 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.965 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.966 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.967 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.968 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.969 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.970 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.971 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:33 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.972 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.992 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.994 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:33.996 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  8 11:10:34 np0005476733 python3.9[203261]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.103 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 11:10:34 np0005476733 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.171 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.181 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.181 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.181 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.182 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.183 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.184 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.185 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.186 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.187 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.188 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.189 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.190 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.191 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.192 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.193 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.194 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.195 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.196 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.197 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.198 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.199 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.200 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.203 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.211 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.215 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.272 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.272 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.272 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct  8 11:10:34 np0005476733 virtqemud[192152]: End of file while reading data: Input/output error
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203080]: 2025-10-08 15:10:34.293 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct  8 11:10:34 np0005476733 virtqemud[192152]: End of file while reading data: Input/output error
Oct  8 11:10:34 np0005476733 systemd[1]: libpod-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope: Deactivated successfully.
Oct  8 11:10:34 np0005476733 systemd[1]: libpod-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope: Consumed 1.312s CPU time.
Oct  8 11:10:34 np0005476733 conmon[203080]: conmon 7f939ebfa387d0072674 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope/container/memory.events
Oct  8 11:10:34 np0005476733 podman[203268]: 2025-10-08 15:10:34.450388253 +0000 UTC m=+0.328103107 container died 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:10:34 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-5807cfc67731086a.timer: Deactivated successfully.
Oct  8 11:10:34 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.
Oct  8 11:10:34 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-userdata-shm.mount: Deactivated successfully.
Oct  8 11:10:34 np0005476733 systemd[1]: var-lib-containers-storage-overlay-2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342-merged.mount: Deactivated successfully.
Oct  8 11:10:34 np0005476733 podman[203268]: 2025-10-08 15:10:34.513123122 +0000 UTC m=+0.390837996 container cleanup 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  8 11:10:34 np0005476733 podman[203268]: ceilometer_agent_compute
Oct  8 11:10:34 np0005476733 podman[203301]: ceilometer_agent_compute
Oct  8 11:10:34 np0005476733 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  8 11:10:34 np0005476733 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  8 11:10:34 np0005476733 systemd[1]: Starting ceilometer_agent_compute container...
Oct  8 11:10:34 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:34 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:34 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:34 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:34 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d4b2f8226843d5d7a7c3557b59eae6c04ed92c48d7c618ab1f7f86e4e8a7342/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:34 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.
Oct  8 11:10:34 np0005476733 podman[203314]: 2025-10-08 15:10:34.741409801 +0000 UTC m=+0.123895518 container init 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + sudo -E kolla_set_configs
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: sudo: unable to send audit message: Operation not permitted
Oct  8 11:10:34 np0005476733 podman[203314]: 2025-10-08 15:10:34.776842975 +0000 UTC m=+0.159328612 container start 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 11:10:34 np0005476733 podman[203314]: ceilometer_agent_compute
Oct  8 11:10:34 np0005476733 systemd[1]: Started ceilometer_agent_compute container.
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Validating config file
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Copying service configuration files
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: INFO:__main__:Writing out command to execute
Oct  8 11:10:34 np0005476733 podman[203337]: 2025-10-08 15:10:34.829979476 +0000 UTC m=+0.046017073 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: ++ cat /run_command
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + ARGS=
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + sudo kolla_copy_cacerts
Oct  8 11:10:34 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:10:34 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Failed with result 'exit-code'.
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: sudo: unable to send audit message: Operation not permitted
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + [[ ! -n '' ]]
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + . kolla_extend_start
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + umask 0022
Oct  8 11:10:34 np0005476733 ceilometer_agent_compute[203330]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  8 11:10:35 np0005476733 podman[203460]: 2025-10-08 15:10:35.304288153 +0000 UTC m=+0.125263081 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:10:35 np0005476733 python3.9[203538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.778 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.779 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.780 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.781 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.782 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.783 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.784 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.785 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.786 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.787 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.788 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.789 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.790 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.791 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.792 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.793 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.794 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.812 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.813 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.814 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.825 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.966 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.967 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.968 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.969 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.970 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.971 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.972 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.973 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.973 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.973 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.973 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.974 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.975 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.976 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.977 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.978 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.979 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.980 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.981 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.982 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.983 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.984 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.985 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.986 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.987 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.989 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:35.995 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:10:36.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:10:36 np0005476733 python3.9[203664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936234.9989297-1137-15648479598215/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:36 np0005476733 python3.9[203819]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  8 11:10:37 np0005476733 python3.9[203971]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:10:38 np0005476733 python3[204123]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:10:39 np0005476733 podman[204158]: 2025-10-08 15:10:39.041865494 +0000 UTC m=+0.068072661 container create d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Oct  8 11:10:39 np0005476733 podman[204158]: 2025-10-08 15:10:39.005171369 +0000 UTC m=+0.031378586 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  8 11:10:39 np0005476733 python3[204123]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume 
/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  8 11:10:39 np0005476733 python3.9[204348]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:10:40 np0005476733 python3.9[204502]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:41 np0005476733 python3.9[204653]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936240.8423412-1243-169909058637855/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:42 np0005476733 python3.9[204729]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:10:42 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:42 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:42 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:43 np0005476733 python3.9[204839]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:10:43 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:43 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:43 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:43 np0005476733 systemd[1]: Starting node_exporter container...
Oct  8 11:10:43 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:43 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49dfdb5e8ac3780abb488157361210ffdd7a2bef9cc597a0b7fc87aafc5b3529/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:43 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49dfdb5e8ac3780abb488157361210ffdd7a2bef9cc597a0b7fc87aafc5b3529/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:43 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.
Oct  8 11:10:43 np0005476733 podman[204879]: 2025-10-08 15:10:43.570993869 +0000 UTC m=+0.135349644 container init d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.599Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.599Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.600Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=arp
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=bcache
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=bonding
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=cpu
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=edac
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=filefd
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=netclass
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=netdev
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=netstat
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=nfs
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=nvme
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=softnet
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=systemd
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=xfs
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.601Z caller=node_exporter.go:117 level=info collector=zfs
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.602Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  8 11:10:43 np0005476733 node_exporter[204895]: ts=2025-10-08T15:10:43.603Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  8 11:10:43 np0005476733 podman[204879]: 2025-10-08 15:10:43.603604164 +0000 UTC m=+0.167959839 container start d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:10:43 np0005476733 podman[204879]: node_exporter
Oct  8 11:10:43 np0005476733 systemd[1]: Started node_exporter container.
Oct  8 11:10:43 np0005476733 podman[204900]: 2025-10-08 15:10:43.661526178 +0000 UTC m=+0.049215457 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:10:44 np0005476733 podman[205028]: 2025-10-08 15:10:44.273824332 +0000 UTC m=+0.084966911 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:10:44 np0005476733 python3.9[205099]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:10:44 np0005476733 systemd[1]: Stopping node_exporter container...
Oct  8 11:10:44 np0005476733 systemd[1]: libpod-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.scope: Deactivated successfully.
Oct  8 11:10:44 np0005476733 podman[205103]: 2025-10-08 15:10:44.807959234 +0000 UTC m=+0.062305865 container died d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:10:44 np0005476733 systemd[1]: d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7-6cb376bdbbc470b4.timer: Deactivated successfully.
Oct  8 11:10:44 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.
Oct  8 11:10:44 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7-userdata-shm.mount: Deactivated successfully.
Oct  8 11:10:44 np0005476733 systemd[1]: var-lib-containers-storage-overlay-49dfdb5e8ac3780abb488157361210ffdd7a2bef9cc597a0b7fc87aafc5b3529-merged.mount: Deactivated successfully.
Oct  8 11:10:44 np0005476733 podman[205103]: 2025-10-08 15:10:44.916904643 +0000 UTC m=+0.171251284 container cleanup d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:10:44 np0005476733 podman[205103]: node_exporter
Oct  8 11:10:44 np0005476733 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 11:10:44 np0005476733 podman[205132]: node_exporter
Oct  8 11:10:44 np0005476733 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  8 11:10:44 np0005476733 systemd[1]: Stopped node_exporter container.
Oct  8 11:10:45 np0005476733 systemd[1]: Starting node_exporter container...
Oct  8 11:10:45 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:45 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49dfdb5e8ac3780abb488157361210ffdd7a2bef9cc597a0b7fc87aafc5b3529/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:45 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49dfdb5e8ac3780abb488157361210ffdd7a2bef9cc597a0b7fc87aafc5b3529/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:45 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.
Oct  8 11:10:45 np0005476733 podman[205143]: 2025-10-08 15:10:45.176426903 +0000 UTC m=+0.155978056 container init d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.195Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.195Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.195Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.196Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.196Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.197Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.197Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.197Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.197Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=arp
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=bcache
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=bonding
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=cpu
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=edac
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=filefd
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=netclass
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=netdev
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=netstat
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=nfs
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=nvme
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=softnet
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=systemd
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=xfs
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.198Z caller=node_exporter.go:117 level=info collector=zfs
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.199Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  8 11:10:45 np0005476733 node_exporter[205158]: ts=2025-10-08T15:10:45.200Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  8 11:10:45 np0005476733 podman[205143]: 2025-10-08 15:10:45.211154914 +0000 UTC m=+0.190706057 container start d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:10:45 np0005476733 podman[205143]: node_exporter
Oct  8 11:10:45 np0005476733 systemd[1]: Started node_exporter container.
Oct  8 11:10:45 np0005476733 podman[205167]: 2025-10-08 15:10:45.323893944 +0000 UTC m=+0.094648311 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:10:45 np0005476733 python3.9[205342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:46 np0005476733 python3.9[205465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936245.451253-1307-6778782750503/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:47 np0005476733 python3.9[205617]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  8 11:10:48 np0005476733 python3.9[205769]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:10:49 np0005476733 python3[205921]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:10:50 np0005476733 podman[205935]: 2025-10-08 15:10:50.535101209 +0000 UTC m=+1.290106129 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  8 11:10:50 np0005476733 podman[206031]: 2025-10-08 15:10:50.684840262 +0000 UTC m=+0.049902568 container create 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:10:50 np0005476733 podman[206031]: 2025-10-08 15:10:50.659562323 +0000 UTC m=+0.024624599 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  8 11:10:50 np0005476733 python3[205921]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  8 11:10:51 np0005476733 podman[206193]: 2025-10-08 15:10:51.394775104 +0000 UTC m=+0.055396636 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:10:51 np0005476733 python3.9[206241]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:10:52 np0005476733 python3.9[206395]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:53 np0005476733 python3.9[206546]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936252.477747-1413-207234297934615/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:10:53 np0005476733 python3.9[206622]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:10:53 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:53 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:53 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:54 np0005476733 python3.9[206733]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:10:54 np0005476733 systemd[1]: Reloading.
Oct  8 11:10:54 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:10:54 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:10:55 np0005476733 systemd[1]: Starting podman_exporter container...
Oct  8 11:10:55 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:55 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb2deaec4de7cd8628a03b54b8897ca930af5c12a28b617e19412d9a2d45566e/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:55 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb2deaec4de7cd8628a03b54b8897ca930af5c12a28b617e19412d9a2d45566e/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:55 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.
Oct  8 11:10:55 np0005476733 podman[206772]: 2025-10-08 15:10:55.318557613 +0000 UTC m=+0.179053621 container init 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.341Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.341Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.341Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.341Z caller=handler.go:105 level=info collector=container
Oct  8 11:10:55 np0005476733 podman[206772]: 2025-10-08 15:10:55.348433404 +0000 UTC m=+0.208929322 container start 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:10:55 np0005476733 podman[206772]: podman_exporter
Oct  8 11:10:55 np0005476733 systemd[1]: Starting Podman API Service...
Oct  8 11:10:55 np0005476733 systemd[1]: Started Podman API Service.
Oct  8 11:10:55 np0005476733 systemd[1]: Started podman_exporter container.
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="Setting parallel job count to 25"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="Using sqlite as database backend"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  8 11:10:55 np0005476733 podman[206798]: @ - - [08/Oct/2025:15:10:55 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  8 11:10:55 np0005476733 podman[206798]: time="2025-10-08T15:10:55Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 11:10:55 np0005476733 podman[206796]: 2025-10-08 15:10:55.412395011 +0000 UTC m=+0.051051943 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:10:55 np0005476733 systemd[1]: 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e-1567256bc267ac18.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:10:55 np0005476733 podman[206798]: @ - - [08/Oct/2025:15:10:55 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22437 "" "Go-http-client/1.1"
Oct  8 11:10:55 np0005476733 systemd[1]: 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e-1567256bc267ac18.service: Failed with result 'exit-code'.
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.420Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.420Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  8 11:10:55 np0005476733 podman_exporter[206787]: ts=2025-10-08T15:10:55.421Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  8 11:10:56 np0005476733 python3.9[206986]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:10:56 np0005476733 systemd[1]: Stopping podman_exporter container...
Oct  8 11:10:56 np0005476733 systemd[1]: libpod-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.scope: Deactivated successfully.
Oct  8 11:10:56 np0005476733 podman[206798]: @ - - [08/Oct/2025:15:10:55 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct  8 11:10:56 np0005476733 podman[206990]: 2025-10-08 15:10:56.368139088 +0000 UTC m=+0.050915769 container died 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:10:56 np0005476733 systemd[1]: 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e-1567256bc267ac18.timer: Deactivated successfully.
Oct  8 11:10:56 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.
Oct  8 11:10:56 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e-userdata-shm.mount: Deactivated successfully.
Oct  8 11:10:56 np0005476733 systemd[1]: var-lib-containers-storage-overlay-eb2deaec4de7cd8628a03b54b8897ca930af5c12a28b617e19412d9a2d45566e-merged.mount: Deactivated successfully.
Oct  8 11:10:56 np0005476733 podman[206990]: 2025-10-08 15:10:56.552296292 +0000 UTC m=+0.235072953 container cleanup 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:10:56 np0005476733 podman[206990]: podman_exporter
Oct  8 11:10:56 np0005476733 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 11:10:56 np0005476733 podman[207019]: podman_exporter
Oct  8 11:10:56 np0005476733 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  8 11:10:56 np0005476733 systemd[1]: Stopped podman_exporter container.
Oct  8 11:10:56 np0005476733 systemd[1]: Starting podman_exporter container...
Oct  8 11:10:56 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:10:56 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb2deaec4de7cd8628a03b54b8897ca930af5c12a28b617e19412d9a2d45566e/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:56 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb2deaec4de7cd8628a03b54b8897ca930af5c12a28b617e19412d9a2d45566e/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:10:56 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.
Oct  8 11:10:56 np0005476733 podman[207033]: 2025-10-08 15:10:56.797648586 +0000 UTC m=+0.146498574 container init 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.815Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.815Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.815Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.815Z caller=handler.go:105 level=info collector=container
Oct  8 11:10:56 np0005476733 podman[206798]: @ - - [08/Oct/2025:15:10:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  8 11:10:56 np0005476733 podman[206798]: time="2025-10-08T15:10:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 11:10:56 np0005476733 podman[207033]: 2025-10-08 15:10:56.827668651 +0000 UTC m=+0.176518619 container start 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:10:56 np0005476733 podman[207033]: podman_exporter
Oct  8 11:10:56 np0005476733 systemd[1]: Started podman_exporter container.
Oct  8 11:10:56 np0005476733 podman[206798]: @ - - [08/Oct/2025:15:10:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22439 "" "Go-http-client/1.1"
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.841Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.842Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  8 11:10:56 np0005476733 podman_exporter[207047]: ts=2025-10-08T15:10:56.842Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  8 11:10:56 np0005476733 podman[207057]: 2025-10-08 15:10:56.898643335 +0000 UTC m=+0.052311535 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:10:57 np0005476733 python3.9[207232]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:10:58 np0005476733 python3.9[207355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759936257.1325908-1477-172373217823461/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 11:10:59 np0005476733 python3.9[207507]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  8 11:11:00 np0005476733 python3.9[207659]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 11:11:00 np0005476733 python3[207811]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 11:11:03 np0005476733 podman[207884]: 2025-10-08 15:11:03.222075711 +0000 UTC m=+0.130911562 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:11:03 np0005476733 podman[207824]: 2025-10-08 15:11:03.347697122 +0000 UTC m=+2.251527043 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  8 11:11:03 np0005476733 podman[207942]: 2025-10-08 15:11:03.496850551 +0000 UTC m=+0.050415143 container create ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:11:03 np0005476733 podman[207942]: 2025-10-08 15:11:03.467904369 +0000 UTC m=+0.021468981 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  8 11:11:03 np0005476733 python3[207811]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  8 11:11:04 np0005476733 python3.9[208132]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:11:05 np0005476733 podman[208258]: 2025-10-08 15:11:05.120930808 +0000 UTC m=+0.051166907 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  8 11:11:05 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:11:05 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Failed with result 'exit-code'.
Oct  8 11:11:05 np0005476733 python3.9[208305]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:05 np0005476733 podman[208428]: 2025-10-08 15:11:05.961622053 +0000 UTC m=+0.100073180 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Oct  8 11:11:06 np0005476733 python3.9[208474]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759936265.40434-1583-252909906603764/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:06 np0005476733 python3.9[208556]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 11:11:06 np0005476733 systemd[1]: Reloading.
Oct  8 11:11:06 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:11:06 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:11:07 np0005476733 python3.9[208666]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 11:11:07 np0005476733 systemd[1]: Reloading.
Oct  8 11:11:07 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 11:11:07 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 11:11:08 np0005476733 systemd[1]: Starting openstack_network_exporter container...
Oct  8 11:11:08 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:11:08 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:08 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:08 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:08 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.
Oct  8 11:11:08 np0005476733 podman[208707]: 2025-10-08 15:11:08.268213386 +0000 UTC m=+0.177229121 container init ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *bridge.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *coverage.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *datapath.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *iface.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *memory.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *ovnnorthd.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *ovn.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *ovsdbserver.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *pmd_perf.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *pmd_rxq.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: INFO    15:11:08 main.go:48: registering *vswitch.Collector
Oct  8 11:11:08 np0005476733 openstack_network_exporter[208723]: NOTICE  15:11:08 main.go:76: listening on https://:9105/metrics
Oct  8 11:11:08 np0005476733 podman[208707]: 2025-10-08 15:11:08.307472949 +0000 UTC m=+0.216488704 container start ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm)
Oct  8 11:11:08 np0005476733 podman[208707]: openstack_network_exporter
Oct  8 11:11:08 np0005476733 systemd[1]: Started openstack_network_exporter container.
Oct  8 11:11:08 np0005476733 podman[208733]: 2025-10-08 15:11:08.427572713 +0000 UTC m=+0.108736539 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Oct  8 11:11:09 np0005476733 python3.9[208906]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 11:11:09 np0005476733 systemd[1]: Stopping openstack_network_exporter container...
Oct  8 11:11:09 np0005476733 systemd[1]: libpod-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.scope: Deactivated successfully.
Oct  8 11:11:09 np0005476733 podman[208910]: 2025-10-08 15:11:09.365111754 +0000 UTC m=+0.057430229 container died ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct  8 11:11:09 np0005476733 systemd[1]: ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253-12cd56eed285e9af.timer: Deactivated successfully.
Oct  8 11:11:09 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.
Oct  8 11:11:09 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253-userdata-shm.mount: Deactivated successfully.
Oct  8 11:11:09 np0005476733 systemd[1]: var-lib-containers-storage-overlay-716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207-merged.mount: Deactivated successfully.
Oct  8 11:11:09 np0005476733 podman[208910]: 2025-10-08 15:11:09.967580825 +0000 UTC m=+0.659899300 container cleanup ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:11:09 np0005476733 podman[208910]: openstack_network_exporter
Oct  8 11:11:09 np0005476733 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 11:11:10 np0005476733 podman[208940]: openstack_network_exporter
Oct  8 11:11:10 np0005476733 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  8 11:11:10 np0005476733 systemd[1]: Stopped openstack_network_exporter container.
Oct  8 11:11:10 np0005476733 systemd[1]: Starting openstack_network_exporter container...
Oct  8 11:11:10 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:11:10 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:10 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:10 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716c527616300efca96bcdad043ef8b7c3dafc9b8630f862e8c5a31f2ba16207/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 11:11:10 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.
Oct  8 11:11:10 np0005476733 podman[208953]: 2025-10-08 15:11:10.19585937 +0000 UTC m=+0.124940261 container init ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *bridge.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *coverage.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *datapath.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *iface.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *memory.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *ovnnorthd.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *ovn.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *ovsdbserver.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *pmd_perf.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *pmd_rxq.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: INFO    15:11:10 main.go:48: registering *vswitch.Collector
Oct  8 11:11:10 np0005476733 openstack_network_exporter[208967]: NOTICE  15:11:10 main.go:76: listening on https://:9105/metrics
Oct  8 11:11:10 np0005476733 podman[208953]: 2025-10-08 15:11:10.222493046 +0000 UTC m=+0.151573937 container start ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:11:10 np0005476733 podman[208953]: openstack_network_exporter
Oct  8 11:11:10 np0005476733 systemd[1]: Started openstack_network_exporter container.
Oct  8 11:11:10 np0005476733 podman[208979]: 2025-10-08 15:11:10.298002045 +0000 UTC m=+0.062561613 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct  8 11:11:11 np0005476733 python3.9[209151]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 11:11:15 np0005476733 podman[209176]: 2025-10-08 15:11:15.225188483 +0000 UTC m=+0.058377689 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:11:16 np0005476733 podman[209196]: 2025-10-08 15:11:16.245160557 +0000 UTC m=+0.069132985 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:11:22 np0005476733 podman[209220]: 2025-10-08 15:11:22.223971516 +0000 UTC m=+0.054817714 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:11:26.290 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:11:26.291 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:11:26.291 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:11:27 np0005476733 podman[209241]: 2025-10-08 15:11:27.227951107 +0000 UTC m=+0.054924478 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:11:28 np0005476733 nova_compute[192580]: 2025-10-08 15:11:28.871 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:11:28 np0005476733 nova_compute[192580]: 2025-10-08 15:11:28.886 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:11:28 np0005476733 nova_compute[192580]: 2025-10-08 15:11:28.887 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.615 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.836 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.838 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14051MB free_disk=113.22233581542969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.838 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.839 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.896 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.896 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.916 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.931 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.933 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 11:11:29 np0005476733 nova_compute[192580]: 2025-10-08 15:11:29.934 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.927 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.928 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.929 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.929 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.944 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.944 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.945 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.946 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:11:30 np0005476733 nova_compute[192580]: 2025-10-08 15:11:30.946 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 11:11:34 np0005476733 podman[209266]: 2025-10-08 15:11:34.23731685 +0000 UTC m=+0.067637347 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:11:35 np0005476733 podman[209285]: 2025-10-08 15:11:35.234584812 +0000 UTC m=+0.056817039 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:11:35 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 11:11:35 np0005476733 systemd[1]: 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33-49a315aee70518c2.service: Failed with result 'exit-code'.
Oct  8 11:11:36 np0005476733 podman[209304]: 2025-10-08 15:11:36.242840128 +0000 UTC m=+0.076425660 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller)
Oct  8 11:11:40 np0005476733 podman[209430]: 2025-10-08 15:11:40.734746964 +0000 UTC m=+0.075292413 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct  8 11:11:40 np0005476733 python3.9[209479]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  8 11:11:41 np0005476733 python3.9[209644]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:41 np0005476733 systemd[1]: Started libpod-conmon-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope.
Oct  8 11:11:41 np0005476733 podman[209645]: 2025-10-08 15:11:41.932160236 +0000 UTC m=+0.198739294 container exec 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:11:42 np0005476733 podman[209665]: 2025-10-08 15:11:42.008394078 +0000 UTC m=+0.059927219 container exec_died 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 11:11:42 np0005476733 podman[209645]: 2025-10-08 15:11:42.068350257 +0000 UTC m=+0.334929265 container exec_died 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  8 11:11:42 np0005476733 systemd[1]: libpod-conmon-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope: Deactivated successfully.
Oct  8 11:11:43 np0005476733 python3.9[209828]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:43 np0005476733 systemd[1]: Started libpod-conmon-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope.
Oct  8 11:11:43 np0005476733 podman[209829]: 2025-10-08 15:11:43.369599688 +0000 UTC m=+0.096554997 container exec 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:11:43 np0005476733 podman[209829]: 2025-10-08 15:11:43.408563161 +0000 UTC m=+0.135518500 container exec_died 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:11:43 np0005476733 systemd[1]: libpod-conmon-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope: Deactivated successfully.
Oct  8 11:11:44 np0005476733 python3.9[210014]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:45 np0005476733 python3.9[210166]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  8 11:11:45 np0005476733 podman[210303]: 2025-10-08 15:11:45.768280794 +0000 UTC m=+0.079421285 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Oct  8 11:11:45 np0005476733 python3.9[210344]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:46 np0005476733 systemd[1]: Started libpod-conmon-aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4.scope.
Oct  8 11:11:46 np0005476733 podman[210352]: 2025-10-08 15:11:46.033857198 +0000 UTC m=+0.086329169 container exec aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  8 11:11:46 np0005476733 podman[210352]: 2025-10-08 15:11:46.077956337 +0000 UTC m=+0.130428278 container exec_died aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:11:46 np0005476733 systemd[1]: libpod-conmon-aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4.scope: Deactivated successfully.
Oct  8 11:11:46 np0005476733 podman[210508]: 2025-10-08 15:11:46.630956366 +0000 UTC m=+0.050683011 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:11:46 np0005476733 python3.9[210560]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:46 np0005476733 systemd[1]: Started libpod-conmon-aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4.scope.
Oct  8 11:11:46 np0005476733 podman[210561]: 2025-10-08 15:11:46.916882415 +0000 UTC m=+0.071269063 container exec aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:11:46 np0005476733 podman[210561]: 2025-10-08 15:11:46.947162709 +0000 UTC m=+0.101549357 container exec_died aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 11:11:47 np0005476733 systemd[1]: libpod-conmon-aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4.scope: Deactivated successfully.
Oct  8 11:11:47 np0005476733 python3.9[210745]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:48 np0005476733 python3.9[210897]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  8 11:11:49 np0005476733 python3.9[211063]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:49 np0005476733 systemd[1]: Started libpod-conmon-72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435.scope.
Oct  8 11:11:49 np0005476733 podman[211064]: 2025-10-08 15:11:49.680090238 +0000 UTC m=+0.111181958 container exec 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:11:49 np0005476733 podman[211064]: 2025-10-08 15:11:49.715787546 +0000 UTC m=+0.146879306 container exec_died 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:11:49 np0005476733 systemd[1]: libpod-conmon-72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435.scope: Deactivated successfully.
Oct  8 11:11:50 np0005476733 python3.9[211248]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:50 np0005476733 systemd[1]: Started libpod-conmon-72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435.scope.
Oct  8 11:11:50 np0005476733 podman[211249]: 2025-10-08 15:11:50.571816815 +0000 UTC m=+0.069460746 container exec 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 11:11:50 np0005476733 podman[211249]: 2025-10-08 15:11:50.605480268 +0000 UTC m=+0.103124179 container exec_died 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:11:50 np0005476733 systemd[1]: libpod-conmon-72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435.scope: Deactivated successfully.
Oct  8 11:11:51 np0005476733 python3.9[211432]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:52 np0005476733 python3.9[211584]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  8 11:11:52 np0005476733 podman[211721]: 2025-10-08 15:11:52.784932081 +0000 UTC m=+0.082904248 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:11:52 np0005476733 python3.9[211766]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:53 np0005476733 systemd[1]: Started libpod-conmon-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.scope.
Oct  8 11:11:53 np0005476733 podman[211769]: 2025-10-08 15:11:53.06558093 +0000 UTC m=+0.082676862 container exec 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd)
Oct  8 11:11:53 np0005476733 podman[211769]: 2025-10-08 15:11:53.099567773 +0000 UTC m=+0.116663675 container exec_died 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 11:11:53 np0005476733 systemd[1]: libpod-conmon-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.scope: Deactivated successfully.
Oct  8 11:11:53 np0005476733 python3.9[211953]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:53 np0005476733 systemd[1]: Started libpod-conmon-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.scope.
Oct  8 11:11:53 np0005476733 podman[211954]: 2025-10-08 15:11:53.968750735 +0000 UTC m=+0.074689354 container exec 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:11:53 np0005476733 podman[211954]: 2025-10-08 15:11:53.976309398 +0000 UTC m=+0.082247987 container exec_died 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:11:54 np0005476733 systemd[1]: libpod-conmon-3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031.scope: Deactivated successfully.
Oct  8 11:11:54 np0005476733 python3.9[212138]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:55 np0005476733 python3.9[212290]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  8 11:11:56 np0005476733 python3.9[212455]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:56 np0005476733 systemd[1]: Started libpod-conmon-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope.
Oct  8 11:11:56 np0005476733 podman[212456]: 2025-10-08 15:11:56.35373281 +0000 UTC m=+0.086512464 container exec 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 11:11:56 np0005476733 podman[212456]: 2025-10-08 15:11:56.390457702 +0000 UTC m=+0.123216785 container exec_died 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:11:56 np0005476733 systemd[1]: libpod-conmon-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope: Deactivated successfully.
Oct  8 11:11:57 np0005476733 python3.9[212640]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:57 np0005476733 systemd[1]: Started libpod-conmon-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope.
Oct  8 11:11:57 np0005476733 podman[212641]: 2025-10-08 15:11:57.204784879 +0000 UTC m=+0.088424486 container exec 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 11:11:57 np0005476733 podman[212641]: 2025-10-08 15:11:57.235382003 +0000 UTC m=+0.119021610 container exec_died 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:11:57 np0005476733 systemd[1]: libpod-conmon-7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33.scope: Deactivated successfully.
Oct  8 11:11:57 np0005476733 podman[212673]: 2025-10-08 15:11:57.384858272 +0000 UTC m=+0.083465026 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:11:58 np0005476733 python3.9[212851]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:11:58 np0005476733 python3.9[213003]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  8 11:11:59 np0005476733 python3.9[213168]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:11:59 np0005476733 systemd[1]: Started libpod-conmon-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.scope.
Oct  8 11:11:59 np0005476733 podman[213169]: 2025-10-08 15:11:59.858172552 +0000 UTC m=+0.082392810 container exec d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:11:59 np0005476733 podman[213169]: 2025-10-08 15:11:59.889079551 +0000 UTC m=+0.113299789 container exec_died d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:11:59 np0005476733 systemd[1]: libpod-conmon-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.scope: Deactivated successfully.
Oct  8 11:12:00 np0005476733 python3.9[213353]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:12:00 np0005476733 systemd[1]: Started libpod-conmon-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.scope.
Oct  8 11:12:00 np0005476733 podman[213354]: 2025-10-08 15:12:00.77816765 +0000 UTC m=+0.089732344 container exec d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:12:00 np0005476733 podman[213354]: 2025-10-08 15:12:00.809570956 +0000 UTC m=+0.121135610 container exec_died d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:12:00 np0005476733 systemd[1]: libpod-conmon-d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7.scope: Deactivated successfully.
Oct  8 11:12:01 np0005476733 python3.9[213538]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:02 np0005476733 python3.9[213690]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  8 11:12:03 np0005476733 python3.9[213855]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:12:03 np0005476733 systemd[1]: Started libpod-conmon-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.scope.
Oct  8 11:12:03 np0005476733 podman[213856]: 2025-10-08 15:12:03.298764781 +0000 UTC m=+0.074142096 container exec 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:12:03 np0005476733 podman[213856]: 2025-10-08 15:12:03.329102402 +0000 UTC m=+0.104479737 container exec_died 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:12:03 np0005476733 systemd[1]: libpod-conmon-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.scope: Deactivated successfully.
Oct  8 11:12:04 np0005476733 python3.9[214039]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:12:04 np0005476733 systemd[1]: Started libpod-conmon-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.scope.
Oct  8 11:12:04 np0005476733 podman[214040]: 2025-10-08 15:12:04.148946944 +0000 UTC m=+0.074506137 container exec 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:12:04 np0005476733 podman[214040]: 2025-10-08 15:12:04.182393864 +0000 UTC m=+0.107953057 container exec_died 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:12:04 np0005476733 systemd[1]: libpod-conmon-9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e.scope: Deactivated successfully.
Oct  8 11:12:04 np0005476733 podman[214194]: 2025-10-08 15:12:04.701423834 +0000 UTC m=+0.057877475 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:12:04 np0005476733 python3.9[214241]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:05 np0005476733 podman[214365]: 2025-10-08 15:12:05.445156918 +0000 UTC m=+0.077837073 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  8 11:12:05 np0005476733 python3.9[214413]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  8 11:12:06 np0005476733 python3.9[214579]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:12:06 np0005476733 systemd[1]: Started libpod-conmon-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.scope.
Oct  8 11:12:06 np0005476733 podman[214580]: 2025-10-08 15:12:06.41335047 +0000 UTC m=+0.082857274 container exec ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:12:06 np0005476733 podman[214580]: 2025-10-08 15:12:06.424459616 +0000 UTC m=+0.093966410 container exec_died ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct  8 11:12:06 np0005476733 systemd[1]: libpod-conmon-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.scope: Deactivated successfully.
Oct  8 11:12:06 np0005476733 podman[214597]: 2025-10-08 15:12:06.538550799 +0000 UTC m=+0.120746067 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 11:12:07 np0005476733 python3.9[214790]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 11:12:07 np0005476733 systemd[1]: Started libpod-conmon-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.scope.
Oct  8 11:12:07 np0005476733 podman[214791]: 2025-10-08 15:12:07.242860551 +0000 UTC m=+0.081506891 container exec ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Oct  8 11:12:07 np0005476733 podman[214791]: 2025-10-08 15:12:07.273712749 +0000 UTC m=+0.112359119 container exec_died ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter)
Oct  8 11:12:07 np0005476733 systemd[1]: libpod-conmon-ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253.scope: Deactivated successfully.
Oct  8 11:12:08 np0005476733 python3.9[214974]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:09 np0005476733 python3.9[215126]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:09 np0005476733 python3.9[215278]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:10 np0005476733 python3.9[215401]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759936329.2143042-2211-194497483677480/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:11 np0005476733 podman[215525]: 2025-10-08 15:12:11.073115678 +0000 UTC m=+0.082387239 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm)
Oct  8 11:12:11 np0005476733 python3.9[215570]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:11 np0005476733 python3.9[215726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:12 np0005476733 python3.9[215804]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:13 np0005476733 python3.9[215956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:13 np0005476733 python3.9[216034]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.f0p1m7if recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:14 np0005476733 python3.9[216186]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:14 np0005476733 python3.9[216264]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:15 np0005476733 python3.9[216416]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:12:16 np0005476733 podman[216541]: 2025-10-08 15:12:16.19807203 +0000 UTC m=+0.056377696 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Oct  8 11:12:16 np0005476733 python3[216589]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 11:12:17 np0005476733 podman[216713]: 2025-10-08 15:12:17.001071452 +0000 UTC m=+0.046977565 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:12:17 np0005476733 python3.9[216765]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:17 np0005476733 python3.9[216843]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:18 np0005476733 python3.9[216995]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:18 np0005476733 python3.9[217073]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:19 np0005476733 python3.9[217225]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:20 np0005476733 python3.9[217303]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:22 np0005476733 python3.9[217455]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:22 np0005476733 python3.9[217533]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:23 np0005476733 podman[217558]: 2025-10-08 15:12:23.217821194 +0000 UTC m=+0.054032391 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:12:23 np0005476733 python3.9[217707]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 11:12:24 np0005476733 python3.9[217832]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759936343.1973312-2461-244195774161295/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:25 np0005476733 python3.9[217984]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:12:26.291 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:12:26.292 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:12:26.292 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:12:26 np0005476733 python3.9[218136]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:12:27 np0005476733 python3.9[218291]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:28 np0005476733 podman[218412]: 2025-10-08 15:12:28.25765799 +0000 UTC m=+0.073231165 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:12:28 np0005476733 python3.9[218467]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:12:28 np0005476733 nova_compute[192580]: 2025-10-08 15:12:28.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:29 np0005476733 python3.9[218620]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.631 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.789 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.790 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14088MB free_disk=113.22129821777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.790 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.791 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.848 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.849 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.869 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.885 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:12:29 np0005476733 nova_compute[192580]: 2025-10-08 15:12:29.887 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:12:30 np0005476733 python3.9[218774]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.879 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.880 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.880 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.880 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.898 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.899 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.899 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:30 np0005476733 nova_compute[192580]: 2025-10-08 15:12:30.900 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:31 np0005476733 python3.9[218929]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 11:12:31 np0005476733 nova_compute[192580]: 2025-10-08 15:12:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:31 np0005476733 nova_compute[192580]: 2025-10-08 15:12:31.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:12:31 np0005476733 nova_compute[192580]: 2025-10-08 15:12:31.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:12:31 np0005476733 systemd[1]: session-29.scope: Deactivated successfully.
Oct  8 11:12:31 np0005476733 systemd-logind[827]: Session 29 logged out. Waiting for processes to exit.
Oct  8 11:12:31 np0005476733 systemd[1]: session-29.scope: Consumed 1min 51.387s CPU time.
Oct  8 11:12:31 np0005476733 systemd-logind[827]: Removed session 29.
Oct  8 11:12:35 np0005476733 podman[218954]: 2025-10-08 15:12:35.26224675 +0000 UTC m=+0.078726142 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:12:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:12:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:12:36 np0005476733 podman[218973]: 2025-10-08 15:12:36.246464296 +0000 UTC m=+0.066821451 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  8 11:12:37 np0005476733 podman[218994]: 2025-10-08 15:12:37.28346431 +0000 UTC m=+0.116052506 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:12:41 np0005476733 podman[219020]: 2025-10-08 15:12:41.231633851 +0000 UTC m=+0.060124436 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:12:47 np0005476733 podman[219045]: 2025-10-08 15:12:47.23313718 +0000 UTC m=+0.052717689 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:12:47 np0005476733 podman[219044]: 2025-10-08 15:12:47.263581535 +0000 UTC m=+0.085280992 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:12:54 np0005476733 podman[219086]: 2025-10-08 15:12:54.220756164 +0000 UTC m=+0.052941756 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:12:59 np0005476733 podman[219107]: 2025-10-08 15:12:59.218774882 +0000 UTC m=+0.047576224 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:13:05 np0005476733 systemd[1]: packagekit.service: Deactivated successfully.
Oct  8 11:13:05 np0005476733 podman[219131]: 2025-10-08 15:13:05.40428316 +0000 UTC m=+0.086012768 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:13:07 np0005476733 podman[219150]: 2025-10-08 15:13:07.230011013 +0000 UTC m=+0.059126006 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 11:13:08 np0005476733 podman[219171]: 2025-10-08 15:13:08.278138488 +0000 UTC m=+0.102074051 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:13:12 np0005476733 podman[219198]: 2025-10-08 15:13:12.235045362 +0000 UTC m=+0.065619814 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct  8 11:13:18 np0005476733 podman[219222]: 2025-10-08 15:13:18.226939174 +0000 UTC m=+0.054681283 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:13:18 np0005476733 podman[219221]: 2025-10-08 15:13:18.250896532 +0000 UTC m=+0.073425844 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Oct  8 11:13:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:19.460 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:13:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:19.462 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:13:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:19.464 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:13:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:13:20Z|00027|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:13:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:13:20Z|00028|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Oct  8 11:13:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:13:21Z|00029|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:13:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:13:21Z|00030|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Oct  8 11:13:25 np0005476733 podman[219266]: 2025-10-08 15:13:25.231833336 +0000 UTC m=+0.062175812 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  8 11:13:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:13:25Z|00031|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:26.292 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:26.293 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:13:26.293 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:13:28 np0005476733 nova_compute[192580]: 2025-10-08 15:13:28.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.619 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.830 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.832 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14165MB free_disk=113.22135543823242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.832 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.832 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.897 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.898 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.926 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.941 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.943 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:13:29 np0005476733 nova_compute[192580]: 2025-10-08 15:13:29.943 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:13:30 np0005476733 podman[219286]: 2025-10-08 15:13:30.245551184 +0000 UTC m=+0.069725665 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:13:30 np0005476733 nova_compute[192580]: 2025-10-08 15:13:30.943 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:31 np0005476733 nova_compute[192580]: 2025-10-08 15:13:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:31 np0005476733 nova_compute[192580]: 2025-10-08 15:13:31.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:13:31 np0005476733 nova_compute[192580]: 2025-10-08 15:13:31.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:13:31 np0005476733 nova_compute[192580]: 2025-10-08 15:13:31.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:13:31 np0005476733 nova_compute[192580]: 2025-10-08 15:13:31.604 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:32 np0005476733 nova_compute[192580]: 2025-10-08 15:13:32.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:32 np0005476733 nova_compute[192580]: 2025-10-08 15:13:32.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:33 np0005476733 nova_compute[192580]: 2025-10-08 15:13:33.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:33 np0005476733 nova_compute[192580]: 2025-10-08 15:13:33.648 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:33 np0005476733 nova_compute[192580]: 2025-10-08 15:13:33.648 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:13:33 np0005476733 nova_compute[192580]: 2025-10-08 15:13:33.649 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:13:36 np0005476733 podman[219310]: 2025-10-08 15:13:36.245469734 +0000 UTC m=+0.067608868 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 11:13:38 np0005476733 podman[219330]: 2025-10-08 15:13:38.257870729 +0000 UTC m=+0.087856436 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 11:13:39 np0005476733 podman[219350]: 2025-10-08 15:13:39.261167158 +0000 UTC m=+0.094585432 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  8 11:13:43 np0005476733 podman[219376]: 2025-10-08 15:13:43.237097461 +0000 UTC m=+0.068695152 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:13:49 np0005476733 podman[219397]: 2025-10-08 15:13:49.259313775 +0000 UTC m=+0.075882513 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:13:49 np0005476733 podman[219398]: 2025-10-08 15:13:49.264050346 +0000 UTC m=+0.081640337 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:13:56 np0005476733 podman[219439]: 2025-10-08 15:13:56.238173792 +0000 UTC m=+0.068025271 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:14:01 np0005476733 podman[219459]: 2025-10-08 15:14:01.223961485 +0000 UTC m=+0.051160801 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:14:07 np0005476733 podman[219484]: 2025-10-08 15:14:07.216047034 +0000 UTC m=+0.051511582 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:14:09 np0005476733 podman[219503]: 2025-10-08 15:14:09.245552324 +0000 UTC m=+0.071662009 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  8 11:14:10 np0005476733 podman[219523]: 2025-10-08 15:14:10.303219364 +0000 UTC m=+0.121526360 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:14:14 np0005476733 podman[219549]: 2025-10-08 15:14:14.243044972 +0000 UTC m=+0.070558604 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9)
Oct  8 11:14:20 np0005476733 podman[219571]: 2025-10-08 15:14:20.241026992 +0000 UTC m=+0.055006616 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:14:20 np0005476733 podman[219570]: 2025-10-08 15:14:20.256057212 +0000 UTC m=+0.065422809 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:14:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:14:25Z|00032|pinctrl|WARN|Dropped 19 log messages in last 60 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 11:14:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:14:25Z|00033|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:14:26.295 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:14:26.296 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:14:26.296 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:14:27 np0005476733 podman[219615]: 2025-10-08 15:14:27.232772384 +0000 UTC m=+0.060103940 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.615 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.616 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.616 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:14:28 np0005476733 nova_compute[192580]: 2025-10-08 15:14:28.626 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.636 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.637 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.677 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.677 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.677 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.677 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.831 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.833 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14181MB free_disk=113.2213363647461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.833 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.833 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.998 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:14:30 np0005476733 nova_compute[192580]: 2025-10-08 15:14:30.999 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.109 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.269 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.270 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.290 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.322 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.348 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.362 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.365 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:14:31 np0005476733 nova_compute[192580]: 2025-10-08 15:14:31.366 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:14:32 np0005476733 podman[219636]: 2025-10-08 15:14:32.219789045 +0000 UTC m=+0.051642678 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.319 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.319 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.319 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.333 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.334 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.334 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:14:34 np0005476733 nova_compute[192580]: 2025-10-08 15:14:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:14:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:14:38 np0005476733 podman[219661]: 2025-10-08 15:14:38.234301176 +0000 UTC m=+0.065450341 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:14:40 np0005476733 podman[219680]: 2025-10-08 15:14:40.268549578 +0000 UTC m=+0.084276931 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:14:41 np0005476733 podman[219701]: 2025-10-08 15:14:41.249269272 +0000 UTC m=+0.078253399 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:14:45 np0005476733 podman[219727]: 2025-10-08 15:14:45.278194773 +0000 UTC m=+0.095935414 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:14:51 np0005476733 podman[219751]: 2025-10-08 15:14:51.232997176 +0000 UTC m=+0.059943445 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:14:51 np0005476733 podman[219750]: 2025-10-08 15:14:51.273401846 +0000 UTC m=+0.103778544 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:14:58 np0005476733 podman[219795]: 2025-10-08 15:14:58.238116355 +0000 UTC m=+0.063657864 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:15:03 np0005476733 podman[219815]: 2025-10-08 15:15:03.26406135 +0000 UTC m=+0.089830218 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:15:09 np0005476733 podman[219839]: 2025-10-08 15:15:09.22845314 +0000 UTC m=+0.053880071 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:15:11 np0005476733 podman[219859]: 2025-10-08 15:15:11.24446093 +0000 UTC m=+0.066118241 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:15:12 np0005476733 podman[219880]: 2025-10-08 15:15:12.266458182 +0000 UTC m=+0.091675148 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:15:16 np0005476733 podman[219904]: 2025-10-08 15:15:16.230205582 +0000 UTC m=+0.057576517 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:15:22 np0005476733 podman[219924]: 2025-10-08 15:15:22.254023661 +0000 UTC m=+0.065154139 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:15:22 np0005476733 podman[219923]: 2025-10-08 15:15:22.290218875 +0000 UTC m=+0.101985363 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:15:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:15:25Z|00034|pinctrl|WARN|Dropped 15 log messages in last 60 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 11:15:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:15:25Z|00035|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:15:26.297 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:15:26.297 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:15:26.297 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:15:29 np0005476733 podman[219967]: 2025-10-08 15:15:29.226900858 +0000 UTC m=+0.058193088 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:15:29 np0005476733 nova_compute[192580]: 2025-10-08 15:15:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.614 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.765 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.766 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14175MB free_disk=113.2222671508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.766 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.767 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.853 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.854 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.878 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.901 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.902 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:15:30 np0005476733 nova_compute[192580]: 2025-10-08 15:15:30.903 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:15:32 np0005476733 nova_compute[192580]: 2025-10-08 15:15:32.903 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:33 np0005476733 nova_compute[192580]: 2025-10-08 15:15:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:33 np0005476733 nova_compute[192580]: 2025-10-08 15:15:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:34 np0005476733 podman[219987]: 2025-10-08 15:15:34.234203219 +0000 UTC m=+0.056765422 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:15:34 np0005476733 nova_compute[192580]: 2025-10-08 15:15:34.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:35 np0005476733 nova_compute[192580]: 2025-10-08 15:15:35.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:35 np0005476733 nova_compute[192580]: 2025-10-08 15:15:35.601 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:35 np0005476733 nova_compute[192580]: 2025-10-08 15:15:35.602 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:15:35 np0005476733 nova_compute[192580]: 2025-10-08 15:15:35.602 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:15:35 np0005476733 nova_compute[192580]: 2025-10-08 15:15:35.616 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:15:36 np0005476733 nova_compute[192580]: 2025-10-08 15:15:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:36 np0005476733 nova_compute[192580]: 2025-10-08 15:15:36.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:15:36 np0005476733 nova_compute[192580]: 2025-10-08 15:15:36.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:15:40 np0005476733 podman[220009]: 2025-10-08 15:15:40.235919112 +0000 UTC m=+0.059086465 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:15:42 np0005476733 podman[220030]: 2025-10-08 15:15:42.239857222 +0000 UTC m=+0.069109295 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:15:43 np0005476733 podman[220050]: 2025-10-08 15:15:43.284037223 +0000 UTC m=+0.106366464 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 11:15:47 np0005476733 podman[220080]: 2025-10-08 15:15:47.281046024 +0000 UTC m=+0.103807241 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Oct  8 11:15:53 np0005476733 podman[220104]: 2025-10-08 15:15:53.244486868 +0000 UTC m=+0.068102043 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:15:53 np0005476733 podman[220105]: 2025-10-08 15:15:53.266041575 +0000 UTC m=+0.075297552 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:16:00 np0005476733 podman[220155]: 2025-10-08 15:16:00.255944134 +0000 UTC m=+0.077398349 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  8 11:16:05 np0005476733 podman[220178]: 2025-10-08 15:16:05.238916599 +0000 UTC m=+0.064470517 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:16:11 np0005476733 podman[220207]: 2025-10-08 15:16:11.232011029 +0000 UTC m=+0.055354627 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  8 11:16:13 np0005476733 podman[220228]: 2025-10-08 15:16:13.255163611 +0000 UTC m=+0.079793827 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:16:14 np0005476733 podman[220249]: 2025-10-08 15:16:14.273426195 +0000 UTC m=+0.099761843 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:16:18 np0005476733 podman[220276]: 2025-10-08 15:16:18.237779256 +0000 UTC m=+0.067353461 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 11:16:24 np0005476733 podman[220300]: 2025-10-08 15:16:24.258124776 +0000 UTC m=+0.078649728 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  8 11:16:24 np0005476733 podman[220301]: 2025-10-08 15:16:24.258251411 +0000 UTC m=+0.072014237 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:16:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:16:25Z|00036|pinctrl|WARN|Dropped 15 log messages in last 60 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 11:16:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:16:25Z|00037|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:16:26.300 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:16:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:16:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:16:30 np0005476733 nova_compute[192580]: 2025-10-08 15:16:30.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:31 np0005476733 podman[220348]: 2025-10-08 15:16:31.268119501 +0000 UTC m=+0.095473987 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.625 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.784 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.785 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14162MB free_disk=113.2221794128418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.786 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.786 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.846 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.847 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.875 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.898 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.900 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:16:31 np0005476733 nova_compute[192580]: 2025-10-08 15:16:31.900 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:16:32 np0005476733 nova_compute[192580]: 2025-10-08 15:16:32.900 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:34 np0005476733 nova_compute[192580]: 2025-10-08 15:16:34.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:35 np0005476733 nova_compute[192580]: 2025-10-08 15:16:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:16:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:16:36 np0005476733 podman[220370]: 2025-10-08 15:16:36.21213923 +0000 UTC m=+0.043631918 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.618 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.619 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:36 np0005476733 nova_compute[192580]: 2025-10-08 15:16:36.619 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:16:38 np0005476733 nova_compute[192580]: 2025-10-08 15:16:38.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:16:42 np0005476733 podman[220396]: 2025-10-08 15:16:42.232553441 +0000 UTC m=+0.060352262 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:16:44 np0005476733 podman[220416]: 2025-10-08 15:16:44.265667315 +0000 UTC m=+0.090716765 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001)
Oct  8 11:16:45 np0005476733 podman[220436]: 2025-10-08 15:16:45.298687488 +0000 UTC m=+0.121085317 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:16:49 np0005476733 podman[220465]: 2025-10-08 15:16:49.247728253 +0000 UTC m=+0.071198640 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 11:16:55 np0005476733 podman[220493]: 2025-10-08 15:16:55.258420393 +0000 UTC m=+0.074930819 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:16:55 np0005476733 podman[220492]: 2025-10-08 15:16:55.258495496 +0000 UTC m=+0.080877900 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:17:02 np0005476733 podman[220536]: 2025-10-08 15:17:02.241395573 +0000 UTC m=+0.071917012 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:17:07 np0005476733 podman[220560]: 2025-10-08 15:17:07.232280583 +0000 UTC m=+0.056838500 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:17:13 np0005476733 podman[220589]: 2025-10-08 15:17:13.267097337 +0000 UTC m=+0.084457635 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  8 11:17:15 np0005476733 podman[220610]: 2025-10-08 15:17:15.229051212 +0000 UTC m=+0.062549154 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:17:16 np0005476733 podman[220630]: 2025-10-08 15:17:16.247937842 +0000 UTC m=+0.079440104 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:17:20 np0005476733 podman[220660]: 2025-10-08 15:17:20.23018516 +0000 UTC m=+0.061147989 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=edpm)
Oct  8 11:17:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:17:25Z|00038|pinctrl|WARN|Dropped 15 log messages in last 60 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 11:17:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:17:25Z|00039|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:17:26 np0005476733 podman[220684]: 2025-10-08 15:17:26.23086986 +0000 UTC m=+0.061360485 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:17:26 np0005476733 podman[220685]: 2025-10-08 15:17:26.249776495 +0000 UTC m=+0.076054385 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:26.300 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.780 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.781 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14161MB free_disk=113.22217559814453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.781 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.782 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.841 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.842 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.864 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.879 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.881 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:17:31 np0005476733 nova_compute[192580]: 2025-10-08 15:17:31.881 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:17:32 np0005476733 nova_compute[192580]: 2025-10-08 15:17:32.882 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:33 np0005476733 podman[220729]: 2025-10-08 15:17:33.234723767 +0000 UTC m=+0.061717946 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct  8 11:17:33 np0005476733 nova_compute[192580]: 2025-10-08 15:17:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:34 np0005476733 nova_compute[192580]: 2025-10-08 15:17:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:35 np0005476733 nova_compute[192580]: 2025-10-08 15:17:35.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:35 np0005476733 nova_compute[192580]: 2025-10-08 15:17:35.598 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:35.614 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:17:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:35.616 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:17:36 np0005476733 nova_compute[192580]: 2025-10-08 15:17:36.598 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.604 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:37 np0005476733 nova_compute[192580]: 2025-10-08 15:17:37.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:17:38 np0005476733 podman[220752]: 2025-10-08 15:17:38.235564306 +0000 UTC m=+0.069660500 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:17:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:17:39.617 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:17:40 np0005476733 nova_compute[192580]: 2025-10-08 15:17:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:17:44 np0005476733 podman[220780]: 2025-10-08 15:17:44.228426445 +0000 UTC m=+0.057401827 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:17:46 np0005476733 podman[220801]: 2025-10-08 15:17:46.233723447 +0000 UTC m=+0.062306005 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:17:47 np0005476733 podman[220822]: 2025-10-08 15:17:47.270602024 +0000 UTC m=+0.096311434 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:17:51 np0005476733 podman[220849]: 2025-10-08 15:17:51.253027916 +0000 UTC m=+0.078550195 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct  8 11:17:57 np0005476733 podman[220876]: 2025-10-08 15:17:57.241260789 +0000 UTC m=+0.058045769 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:17:57 np0005476733 podman[220875]: 2025-10-08 15:17:57.241416534 +0000 UTC m=+0.066843191 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:18:04 np0005476733 podman[220923]: 2025-10-08 15:18:04.241423729 +0000 UTC m=+0.062726339 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:18:09 np0005476733 podman[220949]: 2025-10-08 15:18:09.245022697 +0000 UTC m=+0.066049165 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:18:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:13.369 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:18:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:13.371 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:18:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:14.374 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:15 np0005476733 podman[220973]: 2025-10-08 15:18:15.246078909 +0000 UTC m=+0.059527506 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:18:17 np0005476733 podman[220993]: 2025-10-08 15:18:17.258287222 +0000 UTC m=+0.084277358 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:18:18 np0005476733 podman[221015]: 2025-10-08 15:18:18.298536588 +0000 UTC m=+0.124612530 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:18:18 np0005476733 systemd-logind[827]: New session 30 of user zuul.
Oct  8 11:18:18 np0005476733 systemd[1]: Started Session 30 of User zuul.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: New session 31 of user zuul.
Oct  8 11:18:18 np0005476733 systemd[1]: Started Session 31 of User zuul.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: New session 32 of user zuul.
Oct  8 11:18:18 np0005476733 systemd[1]: Started Session 32 of User zuul.
Oct  8 11:18:18 np0005476733 systemd[1]: session-30.scope: Deactivated successfully.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Session 30 logged out. Waiting for processes to exit.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Removed session 30.
Oct  8 11:18:18 np0005476733 systemd[1]: session-31.scope: Deactivated successfully.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Session 31 logged out. Waiting for processes to exit.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Removed session 31.
Oct  8 11:18:18 np0005476733 systemd[1]: session-32.scope: Deactivated successfully.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Session 32 logged out. Waiting for processes to exit.
Oct  8 11:18:18 np0005476733 systemd-logind[827]: Removed session 32.
Oct  8 11:18:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:20Z|00040|pinctrl|WARN|Dropped 205 log messages in last 55 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:18:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:20Z|00041|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:18:20 np0005476733 systemd-logind[827]: New session 33 of user zuul.
Oct  8 11:18:20 np0005476733 systemd[1]: Started Session 33 of User zuul.
Oct  8 11:18:20 np0005476733 systemd[1]: session-33.scope: Deactivated successfully.
Oct  8 11:18:20 np0005476733 systemd-logind[827]: Session 33 logged out. Waiting for processes to exit.
Oct  8 11:18:20 np0005476733 systemd-logind[827]: Removed session 33.
Oct  8 11:18:22 np0005476733 podman[221151]: 2025-10-08 15:18:22.252249924 +0000 UTC m=+0.079027810 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:26.303 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:26.303 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:28 np0005476733 podman[221175]: 2025-10-08 15:18:28.229373361 +0000 UTC m=+0.053125402 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:18:28 np0005476733 podman[221176]: 2025-10-08 15:18:28.23059814 +0000 UTC m=+0.052244762 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.774 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.774 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.796 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.892 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.892 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.901 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:18:28 np0005476733 nova_compute[192580]: 2025-10-08 15:18:28.901 2 INFO nova.compute.claims [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.003 2 DEBUG nova.compute.provider_tree [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.018 2 DEBUG nova.scheduler.client.report [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.043 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.044 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.089 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.090 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.113 2 INFO nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.130 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.217 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.218 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.219 2 INFO nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Creating image(s)#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.219 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.219 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.220 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.221 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:29 np0005476733 nova_compute[192580]: 2025-10-08 15:18:29.221 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.015 2 WARNING oslo_policy.policy [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.015 2 WARNING oslo_policy.policy [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.018 2 DEBUG nova.policy [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.815 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.874 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.876 2 DEBUG nova.virt.images [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] ec29a055-bb5f-49c2-94be-8574c5ea97ea was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.879 2 DEBUG nova.privsep.utils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  8 11:18:30 np0005476733 nova_compute[192580]: 2025-10-08 15:18:30.879 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.part /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.139 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.part /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.converted" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.148 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.209 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.212 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.235 2 INFO oslo.privsep.daemon [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpnljrgijo/privsep.sock']#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.794 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.795 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=14098MB free_disk=113.18791580200195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.796 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.796 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.869 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 45e7af69-431d-4066-9b30-3883334340db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.869 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.869 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.905 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.935 2 ERROR nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [req-1258a043-c5e8-4575-b3f1-2c42cab4b7c0] Failed to update inventory to [{'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 94652b61-be28-442d-a9f4-cded63837444.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-1258a043-c5e8-4575-b3f1-2c42cab4b7c0"}]}#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.953 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.975 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:18:31 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.975 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 0, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.001 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.009 2 INFO oslo.privsep.daemon [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.861 52 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.866 52 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.869 52 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:31.869 52 INFO oslo.privsep.daemon [-] privsep daemon running as pid 52#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.041 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.087 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.119 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.138 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updated inventory for provider 94652b61-be28-442d-a9f4-cded63837444 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.139 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating resource provider 94652b61-be28-442d-a9f4-cded63837444 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.140 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.180 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.181 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.182 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.204 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.224 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.225 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.266 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.267 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.309 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
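The two qemu-img invocations traced above have a fixed shape: `qemu-img info` is wrapped in `oslo_concurrency.prlimit` (address-space and CPU caps on the child process), while `qemu-img create` builds a per-instance qcow2 overlay on top of the shared raw base image in `_base/`. A minimal sketch of how the overlay-create argument vector is assembled (illustrative helper name; nova's real builder lives in `nova.virt.libvirt.imagebackend` and oslo.concurrency):

```python
def build_qcow2_overlay_cmd(backing_path: str, target_path: str, size_bytes: int) -> list:
    """Sketch of the overlay-create command visible in the log.

    qemu-img creates a copy-on-write qcow2 whose reads fall through to the
    raw base image; only the instance's writes land in the overlay file.
    """
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={backing_path},backing_fmt=raw",
        target_path, str(size_bytes),
    ]
```

This is why the base image `8e8e2abd…` is probed repeatedly while only the small `disk` overlay is created per instance: many guests can share one backing file.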
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.311 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.312 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.344 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Successfully created port: 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.371 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.373 2 DEBUG nova.virt.disk.api [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Checking if we can resize image /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.373 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.437 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.438 2 DEBUG nova.virt.disk.api [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Cannot resize image /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
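The "Cannot resize image … to a smaller size" debug line is the expected outcome of a simple guard, not an error: nova only resizes a disk when the request would actually grow it, since shrinking an image in place risks truncating guest data. Here the overlay was just created at the flavor's 1 GiB, so the requested size does not exceed the current virtual size and the resize is skipped. The check reduces to a comparison (illustrative sketch, not nova's actual `can_resize_image`):

```python
def can_resize_image(current_virtual_size: int, requested_size: int) -> bool:
    # Growing is safe (the guest just sees more unallocated space);
    # an equal or smaller request is a no-op or unsafe, so it is
    # refused with the debug message seen in the log above.
    if requested_size <= current_virtual_size:
        return False
    return True
```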
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.439 2 DEBUG nova.objects.instance [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 45e7af69-431d-4066-9b30-3883334340db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.474 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.475 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Ensure instance console log exists: /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.476 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.476 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:32 np0005476733 nova_compute[192580]: 2025-10-08 15:18:32.477 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
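The Acquiring/acquired/released triplets threaded through these entries (for `compute_resources`, the base-image hash, and `vgpu_resources`) come from oslo.concurrency's lock wrapper, which reports how long each caller waited for the lock and how long it held it. The pattern can be sketched with the standard library alone (illustrative; the real implementation is `oslo_concurrency.lockutils.synchronized`):

```python
import threading
import time

def run_locked(lock: threading.Lock, name: str, fn):
    """Run fn under lock, reporting wait and hold times like lockutils does."""
    print(f'Acquiring lock "{name}"')
    start = time.monotonic()
    with lock:
        waited = time.monotonic() - start
        print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
        held_start = time.monotonic()
        try:
            result = fn()
        finally:
            held = time.monotonic() - held_start
            print(f'Lock "{name}" "released" :: held {held:.3f}s')
    return result
```

The wait/hold figures in the log (e.g. `held 0.429s` on `compute_resources`) are the main tool for spotting lock contention on a busy compute host.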
Oct  8 11:18:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:34.484 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:6d:ff 192.168.0.2 2001::f816:3eff:fe7a:6dff'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.2/24 2001::f816:3eff:fe7a:6dff/64', 'neutron:device_id': 'ovnmeta-dc54cf57-b70a-4130-a34b-3a77cf17e9ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc54cf57-b70a-4130-a34b-3a77cf17e9ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0d540f2-5250-44b2-82f5-844cb338b1ba, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ddc16289-55c2-47a5-bc47-bb5a910bdb0e) old=Port_Binding(mac=['fa:16:3e:7a:6d:ff 192.168.0.2'], external_ids={'neutron:cidrs': '192.168.0.2/24', 'neutron:device_id': 'ovnmeta-dc54cf57-b70a-4130-a34b-3a77cf17e9ab', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc54cf57-b70a-4130-a34b-3a77cf17e9ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:18:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:34.486 103739 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ddc16289-55c2-47a5-bc47-bb5a910bdb0e in datapath dc54cf57-b70a-4130-a34b-3a77cf17e9ab updated#033[00m
Oct  8 11:18:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:34.490 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc54cf57-b70a-4130-a34b-3a77cf17e9ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:18:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:34.491 103739 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmptjmvamgb/privsep.sock']#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.088 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Successfully updated port: 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.105 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.106 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquired lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.106 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.213 103739 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.215 103739 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmptjmvamgb/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.049 221259 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.056 221259 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.061 221259 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.061 221259 INFO oslo.privsep.daemon [-] privsep daemon running as pid 221259#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.219 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[09d438cd-dbe4-4cb4-a57e-1fe0e636a380]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.225 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.226 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:35 np0005476733 podman[221260]: 2025-10-08 15:18:35.242946131 +0000 UTC m=+0.072895464 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.vendor=CentOS)
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.620 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.734 221259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.734 221259 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.734 221259 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.750 2 DEBUG nova.compute.manager [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-changed-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.751 2 DEBUG nova.compute.manager [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Refreshing instance network info cache due to event network-changed-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:18:35 np0005476733 nova_compute[192580]: 2025-10-08 15:18:35.751 2 DEBUG oslo_concurrency.lockutils [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:18:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:35.837 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[135d7577-7705-4766-8012-71ce22fdc926]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:35 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:35.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:35.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.002 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.004 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:18:36.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
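The long run of "Skip pollster …, no resources found this cycle" lines is ceilometer's compute agent walking its configured meters before any instance is running on this host: a pollster whose discovery step returns no resources is skipped rather than emitting empty samples. Schematically (hypothetical names, not ceilometer's actual API):

```python
def poll_and_notify(pollsters: dict, discovered: dict) -> list:
    """Poll each meter only if its discovery produced resources."""
    samples = []
    for name, poll in pollsters.items():
        resources = discovered.get(name, [])
        if not resources:
            # Mirrors the debug line repeated in the log above.
            print(f"Skip pollster {name}, no resources found this cycle")
            continue
        samples.extend(poll(r) for r in resources)
    return samples
```

Once the instance being built here is active, the same cycle starts emitting `cpu`, `memory.usage`, and the disk/network meters instead of skipping them.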
Oct  8 11:18:36 np0005476733 nova_compute[192580]: 2025-10-08 15:18:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.529 2 DEBUG nova.network.neutron [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updating instance_info_cache with network_info: [{"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
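The `network_info` blob cached here is plain JSON-compatible data; the addresses nova will wire into the guest can be read back by walking each VIF's subnets and IPs (illustrative helper, not part of nova's API):

```python
def fixed_ips(network_info: list) -> list:
    """Collect the fixed IP addresses from a nova network_info structure."""
    ips = []
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip.get("type") == "fixed":
                    ips.append(ip["address"])
    return ips
```

Applied to the cache entry above, this yields the single fixed address `10.100.0.11` on port `38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0`.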
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.555 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Releasing lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.556 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Instance network_info: |[{"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.556 2 DEBUG oslo_concurrency.lockutils [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.556 2 DEBUG nova.network.neutron [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Refreshing network info cache for port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.559 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Start _get_guest_xml network_info=[{"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.563 2 WARNING nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.567 2 DEBUG nova.virt.libvirt.host [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.568 2 DEBUG nova.virt.libvirt.host [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.574 2 DEBUG nova.virt.libvirt.host [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.574 2 DEBUG nova.virt.libvirt.host [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.575 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.575 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.576 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.576 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.576 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.577 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.577 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.577 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.578 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.578 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.578 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.578 2 DEBUG nova.virt.hardware [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.588 2 DEBUG nova.privsep.utils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.590 2 DEBUG nova.virt.libvirt.vif [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-588633435',display_name='tempest-server-test-588633435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-588633435',id=1,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCl+s+yUuTm6Hg1/EtkHPuYj+MxrL1QWLpUkVHCxkkLtxt5vFuy3WR6QNq/sLHyb2USC6/SaVkz2TULgQ3QvZM5JIgzhXTphcmSNBjGGopZ9gCz2VffMGgQSCG26RjA1Jg==',key_name='tempest-keypair-test-1160353961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-yhzdtrwj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virti
o',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:29Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=45e7af69-431d-4066-9b30-3883334340db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.590 2 DEBUG nova.network.os_vif_util [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.591 2 DEBUG nova.network.os_vif_util [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.593 2 DEBUG nova.objects.instance [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45e7af69-431d-4066-9b30-3883334340db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.595 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.595 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.596 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.616 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <uuid>45e7af69-431d-4066-9b30-3883334340db</uuid>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <name>instance-00000001</name>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-588633435</nova:name>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:18:37</nova:creationTime>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:user uuid="71a7f2d2441447b2bbd1b677555d68cc">tempest-NetworkBasicTest-1891752524-project-member</nova:user>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:project uuid="7762962015674dfb9038a135559a61f3">tempest-NetworkBasicTest-1891752524</nova:project>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        <nova:port uuid="38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="serial">45e7af69-431d-4066-9b30-3883334340db</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="uuid">45e7af69-431d-4066-9b30-3883334340db</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.config"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:e1:e4:39"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <target dev="tap38cc08d0-8e"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/console.log" append="off"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:18:37 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:18:37 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:18:37 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:18:37 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.617 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Preparing to wait for external event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.618 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.618 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.618 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.619 2 DEBUG nova.virt.libvirt.vif [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-588633435',display_name='tempest-server-test-588633435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-588633435',id=1,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCl+s+yUuTm6Hg1/EtkHPuYj+MxrL1QWLpUkVHCxkkLtxt5vFuy3WR6QNq/sLHyb2USC6/SaVkz2TULgQ3QvZM5JIgzhXTphcmSNBjGGopZ9gCz2VffMGgQSCG26RjA1Jg==',key_name='tempest-keypair-test-1160353961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-yhzdtrwj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mo
del='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:29Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=45e7af69-431d-4066-9b30-3883334340db,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.619 2 DEBUG nova.network.os_vif_util [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.620 2 DEBUG nova.network.os_vif_util [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.620 2 DEBUG os_vif [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.653 2 DEBUG ovsdbapp.backend.ovs_idl [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.653 2 DEBUG ovsdbapp.backend.ovs_idl [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.653 2 DEBUG ovsdbapp.backend.ovs_idl [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.667 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.667 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:18:37 np0005476733 nova_compute[192580]: 2025-10-08 15:18:37.668 2 INFO oslo.privsep.daemon [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpoajzrl4d/privsep.sock']#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.342 2 INFO oslo.privsep.daemon [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.216 73 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.220 73 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.223 73 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.223 73 INFO oslo.privsep.daemon [-] privsep daemon running as pid 73#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38cc08d0-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap38cc08d0-8e, col_values=(('external_ids', {'iface-id': '38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:e4:39', 'vm-uuid': '45e7af69-431d-4066-9b30-3883334340db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:38 np0005476733 NetworkManager[51699]: <info>  [1759936718.6568] manager: (tap38cc08d0-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.665 2 INFO os_vif [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e')#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.734 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.735 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.735 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No VIF found with MAC fa:16:3e:e1:e4:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:18:38 np0005476733 nova_compute[192580]: 2025-10-08 15:18:38.736 2 INFO nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Using config drive#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.305 2 INFO nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Creating config drive at /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.config#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.314 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppqn_tzc8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.453 2 DEBUG oslo_concurrency.processutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppqn_tzc8" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:39 np0005476733 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  8 11:18:39 np0005476733 kernel: tap38cc08d0-8e: entered promiscuous mode
Oct  8 11:18:39 np0005476733 NetworkManager[51699]: <info>  [1759936719.5626] manager: (tap38cc08d0-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct  8 11:18:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:39Z|00042|binding|INFO|Claiming lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 for this chassis.
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:39Z|00043|binding|INFO|38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0: Claiming fa:16:3e:e1:e4:39 10.100.0.11
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:39.588 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:e4:39 10.100.0.11'], port_security=['fa:16:3e:e1:e4:39 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45e7af69-431d-4066-9b30-3883334340db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ceba18c-c267-491a-9134-5610e71f22a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f11af73-f291-4143-87b4-e949e78304b7, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:18:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:39.589 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 in datapath d70e9270-ce35-4a65-b11d-6eefe64091e8 bound to our chassis#033[00m
Oct  8 11:18:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:39.591 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d70e9270-ce35-4a65-b11d-6eefe64091e8#033[00m
Oct  8 11:18:39 np0005476733 systemd-udevd[221331]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:18:39 np0005476733 NetworkManager[51699]: <info>  [1759936719.6193] device (tap38cc08d0-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:18:39 np0005476733 NetworkManager[51699]: <info>  [1759936719.6204] device (tap38cc08d0-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:18:39 np0005476733 podman[221308]: 2025-10-08 15:18:39.622621659 +0000 UTC m=+0.092628146 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:18:39 np0005476733 systemd-machined[152624]: New machine qemu-1-instance-00000001.
Oct  8 11:18:39 np0005476733 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:39Z|00044|binding|INFO|Setting lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 ovn-installed in OVS
Oct  8 11:18:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:39Z|00045|binding|INFO|Setting lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 up in Southbound
Oct  8 11:18:39 np0005476733 nova_compute[192580]: 2025-10-08 15:18:39.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.144 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c03abd69-3336-4e77-806a-8c7762ff43bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.146 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd70e9270-c1 in ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.150 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd70e9270-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.150 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df78da35-ffe8-4a94-95f1-320488f1cb83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.151 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee083c0-e4e2-4287-a8f5-89bbc72d1cb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.179 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[b62bffff-fe3b-4cad-91c0-2b01ffa74339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.200 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ceeb62c6-9466-496e-b8bc-03c847197440]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.202 103739 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp2aggfmfd/privsep.sock']#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.263 2 DEBUG nova.network.neutron [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updated VIF entry in instance network info cache for port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.263 2 DEBUG nova.network.neutron [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updating instance_info_cache with network_info: [{"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.287 2 DEBUG oslo_concurrency.lockutils [req-7ebea7a8-3f48-49c0-9501-441cc39e364c req-46ca15c4-fd32-41ba-9950-b6b8ddf5800d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.535 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936720.5352037, 45e7af69-431d-4066-9b30-3883334340db => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.536 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] VM Started (Lifecycle Event)#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.570 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.574 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936720.5353312, 45e7af69-431d-4066-9b30-3883334340db => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.574 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.605 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.610 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:18:40 np0005476733 nova_compute[192580]: 2025-10-08 15:18:40.633 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.924 103739 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.924 103739 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2aggfmfd/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.795 221372 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.801 221372 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.803 221372 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.803 221372 INFO oslo.privsep.daemon [-] privsep daemon running as pid 221372#033[00m
Oct  8 11:18:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:40.927 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[47b7242e-8a0e-45b4-be08-0416578ef4b9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:41.424 221372 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:41.424 221372 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:41.424 221372 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:41 np0005476733 nova_compute[192580]: 2025-10-08 15:18:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:18:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:41.976 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d883a6-6efd-4b8c-b4b1-7a7504129937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:41.985 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a98bcdc4-b15a-4188-863e-d0324c4b5f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:41 np0005476733 NetworkManager[51699]: <info>  [1759936721.9869] manager: (tapd70e9270-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct  8 11:18:41 np0005476733 systemd-udevd[221332]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.034 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[245b1dab-883f-41b4-b079-c9095d6da85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.037 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0ded3cfa-7ad6-4a6f-bb6c-b6f0e369c4e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 NetworkManager[51699]: <info>  [1759936722.0700] device (tapd70e9270-c0): carrier: link connected
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.075 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[324f6efe-07e0-4235-8c8e-b32417850340]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.101 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[180cf61c-c332-4dd6-871c-34b624317a2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd70e9270-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:50:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362573, 'reachable_time': 19265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221394, 'error': None, 'target': 'ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.122 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[631efed4-598a-4033-80f0-d46286484533]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:5033'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362573, 'tstamp': 362573}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221395, 'error': None, 'target': 'ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.138 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c259686-f00f-47d2-8105-4eaa4257e91f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd70e9270-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:50:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362573, 'reachable_time': 19265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221396, 'error': None, 'target': 'ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.167 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1e15e46c-0e2b-4d07-8fe3-fbdda206661e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.226 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5fec8c-652d-4c6e-a182-ae1e342e7e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.228 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd70e9270-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.228 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.229 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd70e9270-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:42 np0005476733 NetworkManager[51699]: <info>  [1759936722.2324] manager: (tapd70e9270-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct  8 11:18:42 np0005476733 nova_compute[192580]: 2025-10-08 15:18:42.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:42 np0005476733 kernel: tapd70e9270-c0: entered promiscuous mode
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.235 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd70e9270-c0, col_values=(('external_ids', {'iface-id': 'be57ad0f-c6a0-46f7-9ec4-d7ac934c25d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:18:42Z|00046|binding|INFO|Releasing lport be57ad0f-c6a0-46f7-9ec4-d7ac934c25d0 from this chassis (sb_readonly=0)
Oct  8 11:18:42 np0005476733 nova_compute[192580]: 2025-10-08 15:18:42.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.238 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d70e9270-ce35-4a65-b11d-6eefe64091e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d70e9270-ce35-4a65-b11d-6eefe64091e8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.239 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1be77a0b-1344-4110-b98d-22e026528ee6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.243 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-d70e9270-ce35-4a65-b11d-6eefe64091e8
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/d70e9270-ce35-4a65-b11d-6eefe64091e8.pid.haproxy
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID d70e9270-ce35-4a65-b11d-6eefe64091e8
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:18:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:18:42.244 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'env', 'PROCESS_TAG=haproxy-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d70e9270-ce35-4a65-b11d-6eefe64091e8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:18:42 np0005476733 nova_compute[192580]: 2025-10-08 15:18:42.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:42 np0005476733 podman[221428]: 2025-10-08 15:18:42.588867097 +0000 UTC m=+0.055849818 container create 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 11:18:42 np0005476733 systemd[1]: Started libpod-conmon-1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74.scope.
Oct  8 11:18:42 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:18:42 np0005476733 podman[221428]: 2025-10-08 15:18:42.55520739 +0000 UTC m=+0.022190111 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:18:42 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa40d5ae89d71ec791b21c09b1c6a620dfc5d6a6480596398c022ede3400ed7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:18:42 np0005476733 podman[221428]: 2025-10-08 15:18:42.666876574 +0000 UTC m=+0.133859295 container init 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:18:42 np0005476733 podman[221428]: 2025-10-08 15:18:42.674393655 +0000 UTC m=+0.141376356 container start 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  8 11:18:42 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [NOTICE]   (221447) : New worker (221449) forked
Oct  8 11:18:42 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [NOTICE]   (221447) : Loading success.
Oct  8 11:18:43 np0005476733 nova_compute[192580]: 2025-10-08 15:18:43.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:43 np0005476733 nova_compute[192580]: 2025-10-08 15:18:43.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:46 np0005476733 podman[221458]: 2025-10-08 15:18:46.225372409 +0000 UTC m=+0.052993868 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, tcib_managed=true)
Oct  8 11:18:47 np0005476733 nova_compute[192580]: 2025-10-08 15:18:47.962 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:47 np0005476733 nova_compute[192580]: 2025-10-08 15:18:47.962 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:47 np0005476733 nova_compute[192580]: 2025-10-08 15:18:47.994 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.131 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.131 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.144 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.144 2 INFO nova.compute.claims [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:18:48 np0005476733 podman[221477]: 2025-10-08 15:18:48.240609548 +0000 UTC m=+0.068369328 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.429 2 DEBUG nova.compute.provider_tree [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.458 2 DEBUG nova.scheduler.client.report [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.499 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.501 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.580 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.580 2 DEBUG nova.network.neutron [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.609 2 INFO nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.635 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.771 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.773 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.774 2 INFO nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Creating image(s)#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.775 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.775 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.776 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.800 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.890 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.891 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.892 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.911 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.982 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:48 np0005476733 nova_compute[192580]: 2025-10-08 15:18:48.983 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:49 np0005476733 podman[221507]: 2025-10-08 15:18:49.290501371 +0000 UTC m=+0.116285812 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.339 2 DEBUG nova.policy [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8656576ff27549b38acf69641e38c125', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5248fcbeb2ef4348bd0d0da2a924916a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.557 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk 1073741824" returned: 0 in 0.574s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.558 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.558 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.617 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.618 2 DEBUG nova.virt.disk.api [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Checking if we can resize image /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.618 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.679 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.681 2 DEBUG nova.virt.disk.api [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Cannot resize image /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.681 2 DEBUG nova.objects.instance [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lazy-loading 'migration_context' on Instance uuid f83eb34d-9eaa-4ec0-8632-215e0b1ef541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.719 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.720 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Ensure instance console log exists: /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.721 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.721 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.722 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.735 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.735 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.768 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.871 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.872 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.880 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:18:49 np0005476733 nova_compute[192580]: 2025-10-08 15:18:49.880 2 INFO nova.compute.claims [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.165 2 DEBUG nova.compute.provider_tree [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.193 2 DEBUG nova.scheduler.client.report [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.240 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.241 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.344 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.345 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.418 2 INFO nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.465 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.614 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.616 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.616 2 INFO nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Creating image(s)#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.617 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.617 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.618 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.618 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:50 np0005476733 nova_compute[192580]: 2025-10-08 15:18:50.619 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:51 np0005476733 nova_compute[192580]: 2025-10-08 15:18:51.310 2 DEBUG nova.policy [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:18:52 np0005476733 nova_compute[192580]: 2025-10-08 15:18:52.987 2 DEBUG nova.network.neutron [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Successfully updated port: fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.051 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.052 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquired lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.052 2 DEBUG nova.network.neutron [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.210 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Successfully created port: 1f764678-f4b9-420d-b072-8c0f7c3534a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:18:53 np0005476733 podman[221540]: 2025-10-08 15:18:53.231191339 +0000 UTC m=+0.065186138 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.466 2 DEBUG nova.network.neutron [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:18:53 np0005476733 nova_compute[192580]: 2025-10-08 15:18:53.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:55 np0005476733 nova_compute[192580]: 2025-10-08 15:18:55.878 2 DEBUG nova.compute.manager [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-changed-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:18:55 np0005476733 nova_compute[192580]: 2025-10-08 15:18:55.879 2 DEBUG nova.compute.manager [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Refreshing instance network info cache due to event network-changed-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:18:55 np0005476733 nova_compute[192580]: 2025-10-08 15:18:55.879 2 DEBUG oslo_concurrency.lockutils [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:18:57 np0005476733 nova_compute[192580]: 2025-10-08 15:18:57.952 2 DEBUG nova.network.neutron [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updating instance_info_cache with network_info: [{"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:18:57 np0005476733 nova_compute[192580]: 2025-10-08 15:18:57.998 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Releasing lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:18:57 np0005476733 nova_compute[192580]: 2025-10-08 15:18:57.999 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Instance network_info: |[{"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:57.999 2 DEBUG oslo_concurrency.lockutils [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.000 2 DEBUG nova.network.neutron [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Refreshing network info cache for port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.003 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Start _get_guest_xml network_info=[{"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.015 2 WARNING nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.026 2 DEBUG nova.virt.libvirt.host [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.027 2 DEBUG nova.virt.libvirt.host [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.042 2 DEBUG nova.virt.libvirt.host [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.043 2 DEBUG nova.virt.libvirt.host [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.043 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.043 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.044 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.044 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.044 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.044 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.044 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.045 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.045 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.045 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.045 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.045 2 DEBUG nova.virt.hardware [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.048 2 DEBUG nova.virt.libvirt.vif [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1004768871',display_name='tempest-internal-dns-test-vm-1004768871',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1004768871',id=4,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJaHBXbst+5C6w+FRyQb5vNwPyePbL0LiFtDYUYrLhF4sS1u8JZRY/qVVDozU7kfxyfuGMumy6uD0IKmkpT0lo2o25UHzpCYQtviBx9gLAPFYRpL2c+ycGMhNCI1ZdUmKA==',key_name='tempest-internal-dns-test-shared-keypair-1788771288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5248fcbeb2ef4348bd0d0da2a924916a',ramdisk_id='',reservation_id='r-wml4arjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTestOvn-963558814',owner_user_name='tempest-InternalDNSTestOvn-963558814-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:48Z,user_data=None,user_id='8656576ff27549b38acf69641e38c125',uuid=f83eb34d-9eaa-4ec0-8632-215e0b1ef541,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.049 2 DEBUG nova.network.os_vif_util [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converting VIF {"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.049 2 DEBUG nova.network.os_vif_util [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.050 2 DEBUG nova.objects.instance [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lazy-loading 'pci_devices' on Instance uuid f83eb34d-9eaa-4ec0-8632-215e0b1ef541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.064 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Successfully updated port: 1f764678-f4b9-420d-b072-8c0f7c3534a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.218 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <uuid>f83eb34d-9eaa-4ec0-8632-215e0b1ef541</uuid>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <name>instance-00000004</name>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:name>tempest-internal-dns-test-vm-1004768871</nova:name>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:18:58</nova:creationTime>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:user uuid="8656576ff27549b38acf69641e38c125">tempest-InternalDNSTestOvn-963558814-project-member</nova:user>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:project uuid="5248fcbeb2ef4348bd0d0da2a924916a">tempest-InternalDNSTestOvn-963558814</nova:project>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        <nova:port uuid="fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="serial">f83eb34d-9eaa-4ec0-8632-215e0b1ef541</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="uuid">f83eb34d-9eaa-4ec0-8632-215e0b1ef541</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.config"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:ac:48:4e"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <target dev="tapfe3fc542-fd"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/console.log" append="off"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:18:58 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:18:58 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:18:58 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:18:58 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.219 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Preparing to wait for external event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.219 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.220 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.220 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.220 2 DEBUG nova.virt.libvirt.vif [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1004768871',display_name='tempest-internal-dns-test-vm-1004768871',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1004768871',id=4,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJaHBXbst+5C6w+FRyQb5vNwPyePbL0LiFtDYUYrLhF4sS1u8JZRY/qVVDozU7kfxyfuGMumy6uD0IKmkpT0lo2o25UHzpCYQtviBx9gLAPFYRpL2c+ycGMhNCI1ZdUmKA==',key_name='tempest-internal-dns-test-shared-keypair-1788771288',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5248fcbeb2ef4348bd0d0da2a924916a',ramdisk_id='',reservation_id='r-wml4arjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSTestOvn-963558814',owner_user_name='tempest-InternalDNSTestOvn-963558814-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:48Z,user_data=None,user_id='8656576ff27549b38acf69641e38c125',uuid=f83eb34d-9eaa-4ec0-8632-215e0b1ef541,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.221 2 DEBUG nova.network.os_vif_util [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converting VIF {"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.221 2 DEBUG nova.network.os_vif_util [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.222 2 DEBUG os_vif [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.225 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.249 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.250 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.250 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe3fc542-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe3fc542-fd, col_values=(('external_ids', {'iface-id': 'fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:48:4e', 'vm-uuid': 'f83eb34d-9eaa-4ec0-8632-215e0b1ef541'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:58 np0005476733 NetworkManager[51699]: <info>  [1759936738.2630] manager: (tapfe3fc542-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.270 2 INFO os_vif [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd')#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:18:58 np0005476733 podman[221564]: 2025-10-08 15:18:58.393798085 +0000 UTC m=+0.070789387 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct  8 11:18:58 np0005476733 podman[221565]: 2025-10-08 15:18:58.397044879 +0000 UTC m=+0.065839639 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.402 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.403 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.403 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] No VIF found with MAC fa:16:3e:ac:48:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.404 2 INFO nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Using config drive#033[00m
Oct  8 11:18:58 np0005476733 nova_compute[192580]: 2025-10-08 15:18:58.578 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.706 2 INFO nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Creating config drive at /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.config#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.711 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptadj9t_1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.839 2 DEBUG oslo_concurrency.processutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptadj9t_1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:00 np0005476733 kernel: tapfe3fc542-fd: entered promiscuous mode
Oct  8 11:19:00 np0005476733 NetworkManager[51699]: <info>  [1759936740.8978] manager: (tapfe3fc542-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Oct  8 11:19:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:00Z|00047|binding|INFO|Claiming lport fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae for this chassis.
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:00Z|00048|binding|INFO|fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae: Claiming fa:16:3e:ac:48:4e 10.100.0.9
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:00 np0005476733 systemd-udevd[221623]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.927 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:48:4e 10.100.0.9'], port_security=['fa:16:3e:ac:48:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-826693979', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f83eb34d-9eaa-4ec0-8632-215e0b1ef541', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e409da37-3f48-4214-98c8-11c392b47fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-826693979', 'neutron:project_id': '5248fcbeb2ef4348bd0d0da2a924916a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '6499d1d4-cce3-4174-8313-a409bf2c1bb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca31c240-db5a-4e53-802b-b3d974372970, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.929 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae in datapath e409da37-3f48-4214-98c8-11c392b47fc3 bound to our chassis#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.932 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e409da37-3f48-4214-98c8-11c392b47fc3#033[00m
Oct  8 11:19:00 np0005476733 NetworkManager[51699]: <info>  [1759936740.9439] device (tapfe3fc542-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:19:00 np0005476733 NetworkManager[51699]: <info>  [1759936740.9454] device (tapfe3fc542-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.946 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[733767d8-3a9e-4ad2-844a-ceda3f9810d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.946 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape409da37-31 in ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.948 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape409da37-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.948 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ee90fcca-ac1f-438f-ad25-5a8e5233c029]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.950 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[202e269f-6963-4713-9417-3ae767f17463]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.965 2 DEBUG nova.compute.manager [req-081d918e-ce35-434a-bf5f-5b064d066c47 req-3c215768-c555-4b61-9c4f-28d803397c4f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.965 2 DEBUG oslo_concurrency.lockutils [req-081d918e-ce35-434a-bf5f-5b064d066c47 req-3c215768-c555-4b61-9c4f-28d803397c4f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:00 np0005476733 systemd-machined[152624]: New machine qemu-2-instance-00000004.
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.965 2 DEBUG oslo_concurrency.lockutils [req-081d918e-ce35-434a-bf5f-5b064d066c47 req-3c215768-c555-4b61-9c4f-28d803397c4f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.967 2 DEBUG oslo_concurrency.lockutils [req-081d918e-ce35-434a-bf5f-5b064d066c47 req-3c215768-c555-4b61-9c4f-28d803397c4f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.967 2 DEBUG nova.compute.manager [req-081d918e-ce35-434a-bf5f-5b064d066c47 req-3c215768-c555-4b61-9c4f-28d803397c4f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Processing event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:19:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:00Z|00049|binding|INFO|Setting lport fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae ovn-installed in OVS
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.976 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Instance event wait completed in 20 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:19:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:00.975 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[a91837fb-6e82-4f8f-92d9-fe7c44313d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:00 np0005476733 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Oct  8 11:19:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:00Z|00050|binding|INFO|Setting lport fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae up in Southbound
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.990 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936740.981921, 45e7af69-431d-4066-9b30-3883334340db => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.991 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:19:00 np0005476733 nova_compute[192580]: 2025-10-08 15:19:00.993 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.008 2 INFO nova.virt.libvirt.driver [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] Instance spawned successfully.#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.012 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.013 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0d958e-26c9-4de6-abe8-5a97085b2090]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.043 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb19a77-a2f9-4554-be16-cec4d372acf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.049 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[28ca502f-d4b9-427f-9f00-fd9adace5c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 NetworkManager[51699]: <info>  [1759936741.0519] manager: (tape409da37-30): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.054 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.075 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.079 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4829ff29-05c4-4076-8aa0-baff11d60f88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.086 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.086 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.087 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.085 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[73348f2e-8e2e-40c6-b85b-0daad7158b84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.087 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.087 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.088 2 DEBUG nova.virt.libvirt.driver [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:01 np0005476733 NetworkManager[51699]: <info>  [1759936741.1111] device (tape409da37-30): carrier: link connected
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.117 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f0211513-4a4d-4d34-abec-a62bae128da7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.135 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8d973ffd-fc94-4aed-bd33-03ab94ce6c72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape409da37-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:8e:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364477, 'reachable_time': 27380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221662, 'error': None, 'target': 'ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.150 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cb02a306-03f1-4689-aa00-1ef6b70d05db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:8e53'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364477, 'tstamp': 364477}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221663, 'error': None, 'target': 'ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.165 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[092d1134-9c9a-472a-a921-8836aa59b9eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape409da37-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:8e:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364477, 'reachable_time': 27380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221664, 'error': None, 'target': 'ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.198 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.198 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33e127c3-39e4-4226-b7f9-da097189ba3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.264 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[35cb73a4-bbfc-41cc-95bc-5f4426f0a045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.265 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape409da37-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.266 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.268 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape409da37-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:01 np0005476733 NetworkManager[51699]: <info>  [1759936741.2714] manager: (tape409da37-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct  8 11:19:01 np0005476733 kernel: tape409da37-30: entered promiscuous mode
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.274 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape409da37-30, col_values=(('external_ids', {'iface-id': '17554f0f-dcaf-420e-8155-c4671b89584b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:01Z|00051|binding|INFO|Releasing lport 17554f0f-dcaf-420e-8155-c4671b89584b from this chassis (sb_readonly=0)
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.289 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e409da37-3f48-4214-98c8-11c392b47fc3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e409da37-3f48-4214-98c8-11c392b47fc3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.290 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d741321d-2983-42fc-8019-6cc827fff1ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.291 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-e409da37-3f48-4214-98c8-11c392b47fc3
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/e409da37-3f48-4214-98c8-11c392b47fc3.pid.haproxy
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID e409da37-3f48-4214-98c8-11c392b47fc3
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:01.293 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3', 'env', 'PROCESS_TAG=haproxy-e409da37-3f48-4214-98c8-11c392b47fc3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e409da37-3f48-4214-98c8-11c392b47fc3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.326 2 INFO nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Took 32.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.327 2 DEBUG nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.505 2 INFO nova.compute.manager [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Took 32.64 seconds to build instance.#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.644 2 DEBUG oslo_concurrency.lockutils [None req-a41d13a3-54ec-4239-8925-35121b8bdb72 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 32.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:01 np0005476733 podman[221704]: 2025-10-08 15:19:01.601333746 +0000 UTC m=+0.024208415 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.830 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936741.8299532, f83eb34d-9eaa-4ec0-8632-215e0b1ef541 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:01 np0005476733 nova_compute[192580]: 2025-10-08 15:19:01.831 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] VM Started (Lifecycle Event)#033[00m
Oct  8 11:19:02 np0005476733 podman[221704]: 2025-10-08 15:19:02.179380548 +0000 UTC m=+0.602255177 container create 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:19:02 np0005476733 systemd[1]: Started libpod-conmon-506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf.scope.
Oct  8 11:19:02 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:19:02 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f90d017669949d29f9a7028e849e1ebb06138a1bc4011d39d98af1443e69be0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:19:02 np0005476733 podman[221704]: 2025-10-08 15:19:02.504745571 +0000 UTC m=+0.927620260 container init 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:19:02 np0005476733 podman[221704]: 2025-10-08 15:19:02.511466546 +0000 UTC m=+0.934341175 container start 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:19:02 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [NOTICE]   (221723) : New worker (221725) forked
Oct  8 11:19:02 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [NOTICE]   (221723) : Loading success.
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.642 2 DEBUG nova.network.neutron [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.746 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.756 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936741.8302412, f83eb34d-9eaa-4ec0-8632-215e0b1ef541 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.757 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.909 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.909 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Instance network_info: |[{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.918 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:02 np0005476733 nova_compute[192580]: 2025-10-08 15:19:02.922 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.062 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.825 2 DEBUG nova.network.neutron [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updated VIF entry in instance network info cache for port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.826 2 DEBUG nova.network.neutron [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updating instance_info_cache with network_info: [{"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:03 np0005476733 nova_compute[192580]: 2025-10-08 15:19:03.962 2 DEBUG oslo_concurrency.lockutils [req-56f47f3a-7f2e-4bff-954b-9a52316ef193 req-7babfd56-a3b6-4545-aa21-c8aae28366a3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:05 np0005476733 nova_compute[192580]: 2025-10-08 15:19:05.218 2 DEBUG nova.compute.manager [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-changed-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:05 np0005476733 nova_compute[192580]: 2025-10-08 15:19:05.218 2 DEBUG nova.compute.manager [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Refreshing instance network info cache due to event network-changed-1f764678-f4b9-420d-b072-8c0f7c3534a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:19:05 np0005476733 nova_compute[192580]: 2025-10-08 15:19:05.219 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:19:05 np0005476733 nova_compute[192580]: 2025-10-08 15:19:05.219 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:19:05 np0005476733 nova_compute[192580]: 2025-10-08 15:19:05.219 2 DEBUG nova.network.neutron [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Refreshing network info cache for port 1f764678-f4b9-420d-b072-8c0f7c3534a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:19:06 np0005476733 podman[221734]: 2025-10-08 15:19:06.242805683 +0000 UTC m=+0.067817252 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:19:10 np0005476733 podman[221754]: 2025-10-08 15:19:10.236211108 +0000 UTC m=+0.061095917 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:19:13 np0005476733 ovsdb-server[49913]: ovs|00005|reconnect|ERR|tcp:127.0.0.1:45100: no response to inactivity probe after 5 seconds, disconnecting
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.617 2 INFO nova.compute.manager [None req-4edf5508-a28c-4301-966d-283f4d8956ae 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Get console output#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.619 2 DEBUG nova.compute.manager [req-dca822b8-8bd7-480a-bd03-03325bb26dbb req-a7fd52fd-c701-43c5-8718-4859513e4735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.620 2 DEBUG oslo_concurrency.lockutils [req-dca822b8-8bd7-480a-bd03-03325bb26dbb req-a7fd52fd-c701-43c5-8718-4859513e4735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.620 2 DEBUG oslo_concurrency.lockutils [req-dca822b8-8bd7-480a-bd03-03325bb26dbb req-a7fd52fd-c701-43c5-8718-4859513e4735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.620 2 DEBUG oslo_concurrency.lockutils [req-dca822b8-8bd7-480a-bd03-03325bb26dbb req-a7fd52fd-c701-43c5-8718-4859513e4735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.620 2 DEBUG nova.compute.manager [req-dca822b8-8bd7-480a-bd03-03325bb26dbb req-a7fd52fd-c701-43c5-8718-4859513e4735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Processing event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.622 2 DEBUG nova.compute.manager [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.622 2 DEBUG oslo_concurrency.lockutils [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.622 2 DEBUG oslo_concurrency.lockutils [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.622 2 DEBUG oslo_concurrency.lockutils [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.623 2 DEBUG nova.compute.manager [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] No waiting events found dispatching network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.623 2 WARNING nova.compute.manager [req-af5d954e-762a-4ce6-bb57-e5c0d21e013b req-d3ca23df-3d15-4386-9689-cd4f335716c8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received unexpected event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.626 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Instance event wait completed in 13 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering BACKOFF _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.632 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936755.6321201, f83eb34d-9eaa-4ec0-8632-215e0b1ef541 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.633 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.634 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.641 2 INFO nova.virt.libvirt.driver [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Instance spawned successfully.#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.642 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.718 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.726 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.729 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.730 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.730 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.731 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.731 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.732 2 DEBUG nova.virt.libvirt.driver [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.758 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.899 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.900 2 INFO nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Took 27.13 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:19:15 np0005476733 nova_compute[192580]: 2025-10-08 15:19:15.901 2 DEBUG nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.151 2 INFO nova.compute.manager [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Took 28.07 seconds to build instance.#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.262 2 DEBUG oslo_concurrency.lockutils [None req-60bc9fcf-ebcd-4caf-a49e-9b7ca096134e 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:16 np0005476733 nova_compute[192580]: 2025-10-08 15:19:16.938 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.006 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.part --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.007 2 DEBUG nova.virt.images [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] 11111111-1111-1111-1111-111111111111 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.008 2 DEBUG nova.privsep.utils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.009 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.part /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:17 np0005476733 podman[221802]: 2025-10-08 15:19:17.216998126 +0000 UTC m=+0.048655057 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:19:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:17Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:e4:39 10.100.0.11
Oct  8 11:19:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:17Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:e4:39 10.100.0.11
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.714 2 DEBUG nova.network.neutron [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updated VIF entry in instance network info cache for port 1f764678-f4b9-420d-b072-8c0f7c3534a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.714 2 DEBUG nova.network.neutron [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.749 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.750 2 DEBUG nova.compute.manager [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.751 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.751 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.751 2 DEBUG oslo_concurrency.lockutils [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.751 2 DEBUG nova.compute.manager [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] No waiting events found dispatching network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.752 2 WARNING nova.compute.manager [req-e8aaec63-7aae-474f-a1f5-79f89e6f882f req-6ed66e95-1510-46cf-9e2a-00bde8bf08f5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received unexpected event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.959 2 INFO nova.compute.manager [None req-86924092-88e7-4cc2-b13a-8e03f9884a8d 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Get console output#033[00m
Oct  8 11:19:17 np0005476733 nova_compute[192580]: 2025-10-08 15:19:17.965 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:18 np0005476733 nova_compute[192580]: 2025-10-08 15:19:18.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:19 np0005476733 podman[221821]: 2025-10-08 15:19:19.241906977 +0000 UTC m=+0.071755658 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Oct  8 11:19:19 np0005476733 nova_compute[192580]: 2025-10-08 15:19:19.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:19.347 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:19.350 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:19:20 np0005476733 podman[221841]: 2025-10-08 15:19:20.331962056 +0000 UTC m=+0.155373165 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:19:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:20.352 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:21 np0005476733 nova_compute[192580]: 2025-10-08 15:19:21.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:23 np0005476733 nova_compute[192580]: 2025-10-08 15:19:23.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:23 np0005476733 nova_compute[192580]: 2025-10-08 15:19:23.909 2 INFO nova.compute.manager [None req-6389d521-7abb-49cc-b37b-24780e142ff1 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Get console output#033[00m
Oct  8 11:19:23 np0005476733 nova_compute[192580]: 2025-10-08 15:19:23.913 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:23 np0005476733 nova_compute[192580]: 2025-10-08 15:19:23.974 2 INFO nova.compute.manager [None req-a4a59bdd-38da-4e67-901a-3775e7aac380 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Get console output#033[00m
Oct  8 11:19:23 np0005476733 nova_compute[192580]: 2025-10-08 15:19:23.979 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:24 np0005476733 podman[221866]: 2025-10-08 15:19:24.234099568 +0000 UTC m=+0.062100018 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, container_name=openstack_network_exporter)
Oct  8 11:19:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:25Z|00052|pinctrl|WARN|Dropped 3247 log messages in last 65 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 11:19:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:25Z|00053|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:26.301 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:26.302 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:26.303 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:26 np0005476733 nova_compute[192580]: 2025-10-08 15:19:26.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8175] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Oct  8 11:19:26 np0005476733 nova_compute[192580]: 2025-10-08 15:19:26.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8181] device (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8190] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/31)
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8193] device (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8201] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8206] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8211] device (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 11:19:26 np0005476733 NetworkManager[51699]: <info>  [1759936766.8213] device (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00054|binding|INFO|Releasing lport be57ad0f-c6a0-46f7-9ec4-d7ac934c25d0 from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00055|binding|INFO|Releasing lport 17554f0f-dcaf-420e-8155-c4671b89584b from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00056|binding|INFO|Releasing lport be57ad0f-c6a0-46f7-9ec4-d7ac934c25d0 from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00057|binding|INFO|Releasing lport 17554f0f-dcaf-420e-8155-c4671b89584b from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.432 2 DEBUG nova.compute.manager [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-changed-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.433 2 DEBUG nova.compute.manager [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Refreshing instance network info cache due to event network-changed-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.433 2 DEBUG oslo_concurrency.lockutils [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.433 2 DEBUG oslo_concurrency.lockutils [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.433 2 DEBUG nova.network.neutron [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Refreshing network info cache for port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.594 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.594 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.594 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.595 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.595 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.596 2 INFO nova.compute.manager [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Terminating instance#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.596 2 DEBUG nova.compute.manager [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:19:27 np0005476733 kernel: tap38cc08d0-8e (unregistering): left promiscuous mode
Oct  8 11:19:27 np0005476733 NetworkManager[51699]: <info>  [1759936767.6359] device (tap38cc08d0-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00058|binding|INFO|Releasing lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00059|binding|INFO|Setting lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 down in Southbound
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00060|binding|INFO|Removing iface tap38cc08d0-8e ovn-installed in OVS
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.656 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:e4:39 10.100.0.11'], port_security=['fa:16:3e:e1:e4:39 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45e7af69-431d-4066-9b30-3883334340db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ceba18c-c267-491a-9134-5610e71f22a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f11af73-f291-4143-87b4-e949e78304b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.659 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 in datapath d70e9270-ce35-4a65-b11d-6eefe64091e8 unbound from our chassis#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.661 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d70e9270-ce35-4a65-b11d-6eefe64091e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.663 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[65c6f297-f180-4e49-bc04-1a0b2485d7d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.664 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8 namespace which is not needed anymore#033[00m
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:48:4e 10.100.0.9
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:48:4e 10.100.0.9
Oct  8 11:19:27 np0005476733 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Oct  8 11:19:27 np0005476733 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.988s CPU time.
Oct  8 11:19:27 np0005476733 systemd-machined[152624]: Machine qemu-1-instance-00000001 terminated.
Oct  8 11:19:27 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [NOTICE]   (221447) : haproxy version is 2.8.14-c23fe91
Oct  8 11:19:27 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [NOTICE]   (221447) : path to executable is /usr/sbin/haproxy
Oct  8 11:19:27 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [WARNING]  (221447) : Exiting Master process...
Oct  8 11:19:27 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [ALERT]    (221447) : Current worker (221449) exited with code 143 (Terminated)
Oct  8 11:19:27 np0005476733 neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8[221443]: [WARNING]  (221447) : All workers exited. Exiting... (0)
Oct  8 11:19:27 np0005476733 systemd[1]: libpod-1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74.scope: Deactivated successfully.
Oct  8 11:19:27 np0005476733 kernel: tap38cc08d0-8e: entered promiscuous mode
Oct  8 11:19:27 np0005476733 kernel: tap38cc08d0-8e (unregistering): left promiscuous mode
Oct  8 11:19:27 np0005476733 podman[221930]: 2025-10-08 15:19:27.819620458 +0000 UTC m=+0.061927163 container died 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00061|binding|INFO|Claiming lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 for this chassis.
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00062|binding|INFO|38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0: Claiming fa:16:3e:e1:e4:39 10.100.0.11
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.835 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:e4:39 10.100.0.11'], port_security=['fa:16:3e:e1:e4:39 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45e7af69-431d-4066-9b30-3883334340db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ceba18c-c267-491a-9134-5610e71f22a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f11af73-f291-4143-87b4-e949e78304b7, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:27Z|00063|binding|INFO|Releasing lport 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 from this chassis (sb_readonly=0)
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:27.848 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:e4:39 10.100.0.11'], port_security=['fa:16:3e:e1:e4:39 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45e7af69-431d-4066-9b30-3883334340db', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ceba18c-c267-491a-9134-5610e71f22a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f11af73-f291-4143-87b4-e949e78304b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.867 2 INFO nova.virt.libvirt.driver [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] Instance destroyed successfully.#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.867 2 DEBUG nova.objects.instance [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'resources' on Instance uuid 45e7af69-431d-4066-9b30-3883334340db obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.882 2 DEBUG nova.virt.libvirt.vif [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:18:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-588633435',display_name='tempest-server-test-588633435',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-588633435',id=1,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCl+s+yUuTm6Hg1/EtkHPuYj+MxrL1QWLpUkVHCxkkLtxt5vFuy3WR6QNq/sLHyb2USC6/SaVkz2TULgQ3QvZM5JIgzhXTphcmSNBjGGopZ9gCz2VffMGgQSCG26RjA1Jg==',key_name='tempest-keypair-test-1160353961',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:19:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-yhzdtrwj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_
input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:19:01Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=45e7af69-431d-4066-9b30-3883334340db,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.882 2 DEBUG nova.network.os_vif_util [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.883 2 DEBUG nova.network.os_vif_util [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.884 2 DEBUG os_vif [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38cc08d0-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.891 2 INFO os_vif [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:e4:39,bridge_name='br-int',has_traffic_filtering=True,id=38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0,network=Network(d70e9270-ce35-4a65-b11d-6eefe64091e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38cc08d0-8e')#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.892 2 INFO nova.virt.libvirt.driver [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Deleting instance files /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db_del#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.892 2 INFO nova.virt.libvirt.driver [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Deletion of /var/lib/nova/instances/45e7af69-431d-4066-9b30-3883334340db_del complete#033[00m
Oct  8 11:19:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74-userdata-shm.mount: Deactivated successfully.
Oct  8 11:19:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay-aa40d5ae89d71ec791b21c09b1c6a620dfc5d6a6480596398c022ede3400ed7c-merged.mount: Deactivated successfully.
Oct  8 11:19:27 np0005476733 podman[221930]: 2025-10-08 15:19:27.956407896 +0000 UTC m=+0.198714601 container cleanup 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.960 2 DEBUG nova.virt.libvirt.host [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.961 2 INFO nova.virt.libvirt.host [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] UEFI support detected#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.963 2 INFO nova.compute.manager [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.964 2 DEBUG oslo.service.loopingcall [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.964 2 DEBUG nova.compute.manager [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:19:27 np0005476733 nova_compute[192580]: 2025-10-08 15:19:27.964 2 DEBUG nova.network.neutron [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:19:27 np0005476733 systemd[1]: libpod-conmon-1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74.scope: Deactivated successfully.
Oct  8 11:19:28 np0005476733 podman[221974]: 2025-10-08 15:19:28.073294827 +0000 UTC m=+0.086634144 container remove 1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.079 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1924b7a3-33f8-4947-ba1e-bdbcaa6b4e34]: (4, ('Wed Oct  8 03:19:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8 (1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74)\n1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74\nWed Oct  8 03:19:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8 (1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74)\n1e205769f3fb186b88dd63606f4f5dca49835609f46b69051c95a677b0fd4e74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.083 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d27b1ee9-a4c3-4086-b96d-5298d22e5e25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.084 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd70e9270-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:28 np0005476733 kernel: tapd70e9270-c0: left promiscuous mode
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.090 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[57da44fe-9748-4205-95a0-92c771ae5a00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.127 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[777f894f-8177-4bbb-8e90-6f76c2f7757a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.128 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c39746-decd-45b9-9f1d-0c76814cf682]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c5709dcd-5889-451f-a3df-ae30ba6f1087]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362563, 'reachable_time': 25380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221986, 'error': None, 'target': 'ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 systemd[1]: run-netns-ovnmeta\x2dd70e9270\x2dce35\x2d4a65\x2db11d\x2d6eefe64091e8.mount: Deactivated successfully.
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.160 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d70e9270-ce35-4a65-b11d-6eefe64091e8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.161 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[aac1f983-b7da-40b7-98f5-8ac12d4150d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.162 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 in datapath d70e9270-ce35-4a65-b11d-6eefe64091e8 unbound from our chassis#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.163 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d70e9270-ce35-4a65-b11d-6eefe64091e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.164 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1935914f-6f54-46de-9e63-c54bcae29c7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.164 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 in datapath d70e9270-ce35-4a65-b11d-6eefe64091e8 unbound from our chassis#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.166 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d70e9270-ce35-4a65-b11d-6eefe64091e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:19:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:28.166 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[064e1f25-380e-4db7-9c59-b4abb00e2a6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.658 2 DEBUG nova.compute.manager [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-unplugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.659 2 DEBUG oslo_concurrency.lockutils [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.659 2 DEBUG oslo_concurrency.lockutils [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.660 2 DEBUG oslo_concurrency.lockutils [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.660 2 DEBUG nova.compute.manager [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] No waiting events found dispatching network-vif-unplugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:28 np0005476733 nova_compute[192580]: 2025-10-08 15:19:28.660 2 DEBUG nova.compute.manager [req-53dcfe71-8849-47c3-97c2-9a743b5acdc6 req-611bd9ba-a094-4e71-b87f-2592ec8faecf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-unplugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:19:29 np0005476733 podman[221989]: 2025-10-08 15:19:29.257939433 +0000 UTC m=+0.066411917 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:19:29 np0005476733 podman[221988]: 2025-10-08 15:19:29.289853074 +0000 UTC m=+0.094625449 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:19:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:29Z|00064|memory|INFO|peak resident set size grew 59% in last 1113.9 seconds, from 16256 kB to 25856 kB
Oct  8 11:19:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:29Z|00065|memory|INFO|idl-cells-OVN_Southbound:12046 idl-cells-Open_vSwitch:756 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:418 lflow-cache-entries-cache-matches:312 lflow-cache-size-KB:1815 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:791 ofctrl_installed_flow_usage-KB:578 ofctrl_sb_flow_ref_usage-KB:292
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.590 2 DEBUG nova.network.neutron [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updated VIF entry in instance network info cache for port 38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.591 2 DEBUG nova.network.neutron [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updating instance_info_cache with network_info: [{"id": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "address": "fa:16:3e:e1:e4:39", "network": {"id": "d70e9270-ce35-4a65-b11d-6eefe64091e8", "bridge": "br-int", "label": "tempest-test-network--1317531077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38cc08d0-8e", "ovs_interfaceid": "38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.633 2 DEBUG nova.network.neutron [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.635 2 DEBUG oslo_concurrency.lockutils [req-05781cfc-9cb6-4e41-ac0e-8fc93dd87a9a req-ef57d0f1-0c6e-4f3d-afb0-767b9a9e8713 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-45e7af69-431d-4066-9b30-3883334340db" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.686 2 INFO nova.compute.manager [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] Took 1.72 seconds to deallocate network for instance.#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.691 2 INFO nova.compute.manager [None req-1823ad0b-ef91-4809-aea1-b3e32870e216 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Get console output#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.696 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.768 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.768 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.879 2 DEBUG nova.compute.provider_tree [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 3}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.936 2 ERROR nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [req-8ed9cf7c-d2af-492a-b379-5ca05e0f9205] Failed to update inventory to [{'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 3}}] for resource provider with UUID 94652b61-be28-442d-a9f4-cded63837444.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-8ed9cf7c-d2af-492a-b379-5ca05e0f9205"}]}#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.958 2 DEBUG nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.981 2 DEBUG nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.982 2 DEBUG nova.compute.provider_tree [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 1, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:29 np0005476733 nova_compute[192580]: 2025-10-08 15:19:29.998 2 DEBUG nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.018 2 DEBUG nova.compute.manager [req-1abd932c-024d-4c00-98e1-5146c1c327f9 req-9f575814-bef4-4998-961d-e1f41dbd517f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-deleted-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.118 2 DEBUG nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.192 2 DEBUG nova.compute.provider_tree [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 3}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.330 2 DEBUG nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updated inventory for provider 94652b61-be28-442d-a9f4-cded63837444 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 3}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.331 2 DEBUG nova.compute.provider_tree [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating resource provider 94652b61-be28-442d-a9f4-cded63837444 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.331 2 DEBUG nova.compute.provider_tree [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 3, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.402 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.440 2 INFO nova.scheduler.client.report [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Deleted allocations for instance 45e7af69-431d-4066-9b30-3883334340db#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.541 2 DEBUG oslo_concurrency.lockutils [None req-7b88ba20-65c6-447d-b17f-37b8b2c4cd24 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.950 2 DEBUG nova.compute.manager [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.950 2 DEBUG oslo_concurrency.lockutils [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "45e7af69-431d-4066-9b30-3883334340db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.951 2 DEBUG oslo_concurrency.lockutils [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.951 2 DEBUG oslo_concurrency.lockutils [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "45e7af69-431d-4066-9b30-3883334340db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.951 2 DEBUG nova.compute.manager [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] No waiting events found dispatching network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:30 np0005476733 nova_compute[192580]: 2025-10-08 15:19:30.952 2 WARNING nova.compute.manager [req-d2ef6baf-bfe1-40f0-af71-a6f8b16eba0a req-55a50edb-2650-4805-b8c6-b1f5c7eca133 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 45e7af69-431d-4066-9b30-3883334340db] Received unexpected event network-vif-plugged-38cc08d0-8e7e-43b5-af3b-ea1f14ab95e0 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.611 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.681 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.751 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.752 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.809 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.972 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.973 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13627MB free_disk=110.82746505737305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.974 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:31 np0005476733 nova_compute[192580]: 2025-10-08 15:19:31.974 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.150 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f83eb34d-9eaa-4ec0-8632-215e0b1ef541 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.151 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af660b82-9b3c-4c4d-820a-3d22b73898e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.151 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.152 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1664MB phys_disk=119GB used_disk=11GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.265 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.279 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 3, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.306 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.307 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.734 2 DEBUG nova.compute.manager [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-changed-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.734 2 DEBUG nova.compute.manager [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Refreshing instance network info cache due to event network-changed-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.735 2 DEBUG oslo_concurrency.lockutils [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.735 2 DEBUG oslo_concurrency.lockutils [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.736 2 DEBUG nova.network.neutron [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Refreshing network info cache for port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:19:32 np0005476733 nova_compute[192580]: 2025-10-08 15:19:32.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.851 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.852 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.852 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.852 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.852 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.853 2 INFO nova.compute.manager [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Terminating instance#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.854 2 DEBUG nova.compute.manager [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:19:33 np0005476733 kernel: tapfe3fc542-fd (unregistering): left promiscuous mode
Oct  8 11:19:33 np0005476733 NetworkManager[51699]: <info>  [1759936773.8832] device (tapfe3fc542-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:19:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:33Z|00066|binding|INFO|Releasing lport fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae from this chassis (sb_readonly=0)
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:33Z|00067|binding|INFO|Setting lport fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae down in Southbound
Oct  8 11:19:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:33Z|00068|binding|INFO|Removing iface tapfe3fc542-fd ovn-installed in OVS
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:33.901 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:48:4e 10.100.0.9'], port_security=['fa:16:3e:ac:48:4e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-826693979', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f83eb34d-9eaa-4ec0-8632-215e0b1ef541', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e409da37-3f48-4214-98c8-11c392b47fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-826693979', 'neutron:project_id': '5248fcbeb2ef4348bd0d0da2a924916a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6499d1d4-cce3-4174-8313-a409bf2c1bb3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca31c240-db5a-4e53-802b-b3d974372970, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:33.903 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae in datapath e409da37-3f48-4214-98c8-11c392b47fc3 unbound from our chassis#033[00m
Oct  8 11:19:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:33.906 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e409da37-3f48-4214-98c8-11c392b47fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:19:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:33.907 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e014ac69-3086-4d06-a13b-1cd8b7ee6284]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:33 np0005476733 nova_compute[192580]: 2025-10-08 15:19:33.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:33.909 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3 namespace which is not needed anymore#033[00m
Oct  8 11:19:33 np0005476733 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Oct  8 11:19:33 np0005476733 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 12.647s CPU time.
Oct  8 11:19:33 np0005476733 systemd-machined[152624]: Machine qemu-2-instance-00000004 terminated.
Oct  8 11:19:34 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [NOTICE]   (221723) : haproxy version is 2.8.14-c23fe91
Oct  8 11:19:34 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [NOTICE]   (221723) : path to executable is /usr/sbin/haproxy
Oct  8 11:19:34 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [WARNING]  (221723) : Exiting Master process...
Oct  8 11:19:34 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [ALERT]    (221723) : Current worker (221725) exited with code 143 (Terminated)
Oct  8 11:19:34 np0005476733 neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3[221719]: [WARNING]  (221723) : All workers exited. Exiting... (0)
Oct  8 11:19:34 np0005476733 systemd[1]: libpod-506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf.scope: Deactivated successfully.
Oct  8 11:19:34 np0005476733 podman[222062]: 2025-10-08 15:19:34.11125183 +0000 UTC m=+0.104351351 container died 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.121 2 INFO nova.virt.libvirt.driver [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Instance destroyed successfully.#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.122 2 DEBUG nova.objects.instance [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lazy-loading 'resources' on Instance uuid f83eb34d-9eaa-4ec0-8632-215e0b1ef541 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.138 2 DEBUG nova.virt.libvirt.vif [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:18:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1004768871',display_name='tempest-internal-dns-test-vm-1004768871',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1004768871',id=4,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJaHBXbst+5C6w+FRyQb5vNwPyePbL0LiFtDYUYrLhF4sS1u8JZRY/qVVDozU7kfxyfuGMumy6uD0IKmkpT0lo2o25UHzpCYQtviBx9gLAPFYRpL2c+ycGMhNCI1ZdUmKA==',key_name='tempest-internal-dns-test-shared-keypair-1788771288',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:19:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5248fcbeb2ef4348bd0d0da2a924916a',ramdisk_id='',reservation_id='r-wml4arjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSTestOvn-963558814',owner_user_name='tempest-InternalDNSTestOvn-963558814-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:19:16Z,user_data=None,user_id='8656576ff27549b38acf69641e38c125',uuid=f83eb34d-9eaa-4ec0-8632-215e0b1ef541,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.138 2 DEBUG nova.network.os_vif_util [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converting VIF {"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.139 2 DEBUG nova.network.os_vif_util [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.140 2 DEBUG os_vif [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.146 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe3fc542-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.152 2 INFO os_vif [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:48:4e,bridge_name='br-int',has_traffic_filtering=True,id=fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae,network=Network(e409da37-3f48-4214-98c8-11c392b47fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfe3fc542-fd')#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.153 2 INFO nova.virt.libvirt.driver [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Deleting instance files /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541_del#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.153 2 INFO nova.virt.libvirt.driver [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Deletion of /var/lib/nova/instances/f83eb34d-9eaa-4ec0-8632-215e0b1ef541_del complete#033[00m
Oct  8 11:19:34 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf-userdata-shm.mount: Deactivated successfully.
Oct  8 11:19:34 np0005476733 systemd[1]: var-lib-containers-storage-overlay-f90d017669949d29f9a7028e849e1ebb06138a1bc4011d39d98af1443e69be0a-merged.mount: Deactivated successfully.
Oct  8 11:19:34 np0005476733 podman[222062]: 2025-10-08 15:19:34.197924754 +0000 UTC m=+0.191024275 container cleanup 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 11:19:34 np0005476733 systemd[1]: libpod-conmon-506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf.scope: Deactivated successfully.
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.224 2 INFO nova.compute.manager [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.225 2 DEBUG oslo.service.loopingcall [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.225 2 DEBUG nova.compute.manager [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.226 2 DEBUG nova.network.neutron [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:19:34 np0005476733 podman[222111]: 2025-10-08 15:19:34.310388223 +0000 UTC m=+0.085940462 container remove 506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.316 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cc32965f-ade2-4361-9fc3-be52992ef7b0]: (4, ('Wed Oct  8 03:19:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3 (506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf)\n506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf\nWed Oct  8 03:19:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3 (506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf)\n506a64257b5750ca5328abf6873d16b569716dd91deee708b811e54ccf3101bf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.317 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32247faf-26ac-4ef5-89ea-375a09ec484e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.319 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape409da37-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:34 np0005476733 kernel: tape409da37-30: left promiscuous mode
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.336 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[69f144b6-cac4-425b-b862-47a3568a5913]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.370 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3096fe51-3942-481b-bcd0-b7c64a2638a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.372 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[617130bd-9216-4614-88cd-80a44f9341ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.399 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3658d956-720f-461a-a0e5-3d1970d0b169]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364470, 'reachable_time': 39789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222126, 'error': None, 'target': 'ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 systemd[1]: run-netns-ovnmeta\x2de409da37\x2d3f48\x2d4214\x2d98c8\x2d11c392b47fc3.mount: Deactivated successfully.
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.402 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e409da37-3f48-4214-98c8-11c392b47fc3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:19:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:34.403 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[fe89b6ea-ebbb-4c4b-b373-019dd8e92f4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.620 2 DEBUG nova.compute.manager [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-vif-unplugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.621 2 DEBUG oslo_concurrency.lockutils [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.621 2 DEBUG oslo_concurrency.lockutils [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.621 2 DEBUG oslo_concurrency.lockutils [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.622 2 DEBUG nova.compute.manager [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] No waiting events found dispatching network-vif-unplugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:34 np0005476733 nova_compute[192580]: 2025-10-08 15:19:34.622 2 DEBUG nova.compute.manager [req-50b52348-d7f6-4c1f-bb93-7402769663a9 req-c83864a1-2108-4f48-ae61-ff387b44b9b6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-vif-unplugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.058 2 DEBUG nova.network.neutron [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updated VIF entry in instance network info cache for port fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.058 2 DEBUG nova.network.neutron [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updating instance_info_cache with network_info: [{"id": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "address": "fa:16:3e:ac:48:4e", "network": {"id": "e409da37-3f48-4214-98c8-11c392b47fc3", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-860223840", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5248fcbeb2ef4348bd0d0da2a924916a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3fc542-fd", "ovs_interfaceid": "fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.088 2 DEBUG oslo_concurrency.lockutils [req-da8e1ac6-41a0-4a02-bab2-7d51a1febab0 req-8e7b4378-7237-40f2-8a77-3bd8c5555d1d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f83eb34d-9eaa-4ec0-8632-215e0b1ef541" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.308 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:19:35 np0005476733 nova_compute[192580]: 2025-10-08 15:19:35.604 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.241 2 DEBUG nova.network.neutron [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.272 2 INFO nova.compute.manager [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Took 2.05 seconds to deallocate network for instance.#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.332 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.333 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.397 2 DEBUG nova.compute.provider_tree [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.431 2 DEBUG nova.scheduler.client.report [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 3, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.468 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.529 2 INFO nova.scheduler.client.report [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Deleted allocations for instance f83eb34d-9eaa-4ec0-8632-215e0b1ef541#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.628 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.656 2 DEBUG oslo_concurrency.lockutils [None req-518c6c54-60ea-4329-bb40-bff5b4ccfcee 8656576ff27549b38acf69641e38c125 5248fcbeb2ef4348bd0d0da2a924916a - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.851 2 DEBUG nova.compute.manager [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.852 2 DEBUG oslo_concurrency.lockutils [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.852 2 DEBUG oslo_concurrency.lockutils [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.852 2 DEBUG oslo_concurrency.lockutils [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f83eb34d-9eaa-4ec0-8632-215e0b1ef541-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.852 2 DEBUG nova.compute.manager [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] No waiting events found dispatching network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:36 np0005476733 nova_compute[192580]: 2025-10-08 15:19:36.853 2 WARNING nova.compute.manager [req-3d1f7fc5-c79d-42e2-8b80-41a5141c7159 req-3b173599-8ba5-4bd4-9b42-40907e7a8b97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Received unexpected event network-vif-plugged-fe3fc542-fdc7-4cd7-9622-c5b9e1ebb1ae for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:19:37 np0005476733 podman[222127]: 2025-10-08 15:19:37.256077124 +0000 UTC m=+0.076706235 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:19:37 np0005476733 nova_compute[192580]: 2025-10-08 15:19:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:37 np0005476733 nova_compute[192580]: 2025-10-08 15:19:37.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:38 np0005476733 nova_compute[192580]: 2025-10-08 15:19:38.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:38 np0005476733 nova_compute[192580]: 2025-10-08 15:19:38.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.188 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.part /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.converted" returned: 0 in 22.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.352 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.426 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.428 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 48.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.446 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.511 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.512 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.513 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.528 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.578 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.579 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.603 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.627 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk 10737418240" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.628 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.628 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.698 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.700 2 DEBUG nova.objects.instance [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid af660b82-9b3c-4c4d-820a-3d22b73898e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.718 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.718 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Ensure instance console log exists: /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.719 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.720 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.721 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.725 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Start _get_guest_xml network_info=[{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.732 2 WARNING nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.737 2 DEBUG nova.virt.libvirt.host [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.738 2 DEBUG nova.virt.libvirt.host [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.741 2 DEBUG nova.virt.libvirt.host [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.742 2 DEBUG nova.virt.libvirt.host [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.742 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.743 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.743 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.743 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.744 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.744 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.744 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.744 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.744 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.745 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.745 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.745 2 DEBUG nova.virt.hardware [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.749 2 DEBUG nova.virt.libvirt.vif [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_idle_timeout_with_querier_enabled-2110154127',display_name='tempest-test_idle_timeout_with_querier_enabled-2110154127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-idle-timeout-with-querier-enabled-2110154127',id=5,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-cey4fd9p',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:50Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=af660b82-9b3c-4c4d-820a-3d22b73898e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.749 2 DEBUG nova.network.os_vif_util [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.750 2 DEBUG nova.network.os_vif_util [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.751 2 DEBUG nova.objects.instance [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid af660b82-9b3c-4c4d-820a-3d22b73898e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.766 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <uuid>af660b82-9b3c-4c4d-820a-3d22b73898e5</uuid>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <name>instance-00000005</name>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_idle_timeout_with_querier_enabled-2110154127</nova:name>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:19:39</nova:creationTime>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        <nova:port uuid="1f764678-f4b9-420d-b072-8c0f7c3534a9">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="serial">af660b82-9b3c-4c4d-820a-3d22b73898e5</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="uuid">af660b82-9b3c-4c4d-820a-3d22b73898e5</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.config"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:7e:98:72"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <target dev="tap1f764678-f4"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/console.log" append="off"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:19:39 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:19:39 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:19:39 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:19:39 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.767 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Preparing to wait for external event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.767 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.768 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.768 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.769 2 DEBUG nova.virt.libvirt.vif [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_idle_timeout_with_querier_enabled-2110154127',display_name='tempest-test_idle_timeout_with_querier_enabled-2110154127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-idle-timeout-with-querier-enabled-2110154127',id=5,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-cey4fd9p',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:18:50Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=af660b82-9b3c-4c4d-820a-3d22b73898e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.769 2 DEBUG nova.network.os_vif_util [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.769 2 DEBUG nova.network.os_vif_util [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.770 2 DEBUG os_vif [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.773 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f764678-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.774 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f764678-f4, col_values=(('external_ids', {'iface-id': '1f764678-f4b9-420d-b072-8c0f7c3534a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:98:72', 'vm-uuid': 'af660b82-9b3c-4c4d-820a-3d22b73898e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 NetworkManager[51699]: <info>  [1759936779.7913] manager: (tap1f764678-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.798 2 INFO os_vif [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4')#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.871 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.872 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.872 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:7e:98:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:19:39 np0005476733 nova_compute[192580]: 2025-10-08 15:19:39.873 2 INFO nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Using config drive#033[00m
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.696 2 INFO nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Creating config drive at /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.config#033[00m
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.702 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg12nglyc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.831 2 DEBUG oslo_concurrency.processutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg12nglyc" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:40 np0005476733 kernel: tap1f764678-f4: entered promiscuous mode
Oct  8 11:19:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:40Z|00069|binding|INFO|Claiming lport 1f764678-f4b9-420d-b072-8c0f7c3534a9 for this chassis.
Oct  8 11:19:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:40Z|00070|binding|INFO|1f764678-f4b9-420d-b072-8c0f7c3534a9: Claiming fa:16:3e:7e:98:72 10.100.0.8
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:40 np0005476733 NetworkManager[51699]: <info>  [1759936780.9329] manager: (tap1f764678-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Oct  8 11:19:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:40Z|00071|binding|INFO|Setting lport 1f764678-f4b9-420d-b072-8c0f7c3534a9 ovn-installed in OVS
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:40 np0005476733 nova_compute[192580]: 2025-10-08 15:19:40.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:40Z|00072|binding|INFO|Setting lport 1f764678-f4b9-420d-b072-8c0f7c3534a9 up in Southbound
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.961 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:98:72 10.100.0.8'], port_security=['fa:16:3e:7e:98:72 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1f764678-f4b9-420d-b072-8c0f7c3534a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.963 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1f764678-f4b9-420d-b072-8c0f7c3534a9 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.965 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:19:40 np0005476733 systemd-udevd[222194]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:19:40 np0005476733 systemd-machined[152624]: New machine qemu-3-instance-00000005.
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.979 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5322a2-0467-4b6c-b9a0-82f132011db8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.981 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap30cdfb1e-71 in ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.983 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap30cdfb1e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.983 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bacec521-7b99-43ff-ad81-1eff21efc0dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:40 np0005476733 NetworkManager[51699]: <info>  [1759936780.9843] device (tap1f764678-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.984 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[17801f4e-d616-44e5-8103-5f17f6bad563]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:40 np0005476733 NetworkManager[51699]: <info>  [1759936780.9853] device (tap1f764678-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:19:40 np0005476733 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Oct  8 11:19:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:40.998 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[299a74ce-f6c6-4b15-966e-456cab920d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 podman[222175]: 2025-10-08 15:19:41.005661936 +0000 UTC m=+0.091487759 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.022 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[88f51b9c-42e7-413b-92ef-414edd8001c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.047 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4560afb7-591d-4dfb-8460-5afb440eb29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.052 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc8d8f4-b3bc-452f-89e9-f3ce0d11ae64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 NetworkManager[51699]: <info>  [1759936781.0545] manager: (tap30cdfb1e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/36)
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.084 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[66beb0fe-4ef9-4621-b327-2ff36ccc658a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.087 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0cae1c-9a45-45f0-8980-987d103c8b8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 NetworkManager[51699]: <info>  [1759936781.1118] device (tap30cdfb1e-70): carrier: link connected
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.118 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[61dddbdf-d311-46e3-82e1-84843d63c1d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.137 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3a0c1da4-a7c3-4f96-9119-97b02e29efd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368477, 'reachable_time': 33561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222240, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.154 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8087066c-d4dd-4f18-bf38-a7d5985241e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:3ea4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368477, 'tstamp': 368477}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222241, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.171 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d236c348-69a6-4026-a678-2dbd50afd200]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368477, 'reachable_time': 33561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222242, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.201 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3df2ac-2f6f-4870-82e7-7fd024299d17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.256 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1e75ac76-c572-4938-afc5-8bee733c6dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.258 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.258 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.258 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:41 np0005476733 kernel: tap30cdfb1e-70: entered promiscuous mode
Oct  8 11:19:41 np0005476733 NetworkManager[51699]: <info>  [1759936781.2611] manager: (tap30cdfb1e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.263 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:19:41 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:41Z|00073|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.266 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.266 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0be0f5-9966-433f-99f8-073e7bac9e38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.267 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:19:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:19:41.268 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'env', 'PROCESS_TAG=haproxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.634 2 DEBUG nova.compute.manager [req-e574052e-0fe4-4aae-ba74-faf437b369ba req-8e3bda73-85b0-4071-a0ab-751ec0bbe20a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.635 2 DEBUG oslo_concurrency.lockutils [req-e574052e-0fe4-4aae-ba74-faf437b369ba req-8e3bda73-85b0-4071-a0ab-751ec0bbe20a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.635 2 DEBUG oslo_concurrency.lockutils [req-e574052e-0fe4-4aae-ba74-faf437b369ba req-8e3bda73-85b0-4071-a0ab-751ec0bbe20a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.636 2 DEBUG oslo_concurrency.lockutils [req-e574052e-0fe4-4aae-ba74-faf437b369ba req-8e3bda73-85b0-4071-a0ab-751ec0bbe20a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.636 2 DEBUG nova.compute.manager [req-e574052e-0fe4-4aae-ba74-faf437b369ba req-8e3bda73-85b0-4071-a0ab-751ec0bbe20a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Processing event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:19:41 np0005476733 podman[222280]: 2025-10-08 15:19:41.643895043 +0000 UTC m=+0.051931232 container create ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.652 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:19:41 np0005476733 nova_compute[192580]: 2025-10-08 15:19:41.654 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:19:41 np0005476733 systemd[1]: Started libpod-conmon-ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3.scope.
Oct  8 11:19:41 np0005476733 podman[222280]: 2025-10-08 15:19:41.613973976 +0000 UTC m=+0.022010165 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:19:41 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:19:41 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b3c028b8f73810de4a748fc8846be92bbbadf66f8dafada46f9b4fa92d1d5f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:19:41 np0005476733 podman[222280]: 2025-10-08 15:19:41.725181985 +0000 UTC m=+0.133218154 container init ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:19:41 np0005476733 podman[222280]: 2025-10-08 15:19:41.730131383 +0000 UTC m=+0.138167582 container start ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:19:41 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [NOTICE]   (222299) : New worker (222301) forked
Oct  8 11:19:41 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [NOTICE]   (222299) : Loading success.
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.120 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936782.1196752, af660b82-9b3c-4c4d-820a-3d22b73898e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.121 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] VM Started (Lifecycle Event)#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.123 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.128 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.132 2 INFO nova.virt.libvirt.driver [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Instance spawned successfully.#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.132 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.164 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.168 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.207 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.207 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.207 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.208 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.208 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.208 2 DEBUG nova.virt.libvirt.driver [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.338 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.339 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936782.1199408, af660b82-9b3c-4c4d-820a-3d22b73898e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.339 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.391 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.399 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936782.126843, af660b82-9b3c-4c4d-820a-3d22b73898e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.399 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.497 2 INFO nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Took 51.88 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.498 2 DEBUG nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.535 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.537 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.866 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759936767.864799, 45e7af69-431d-4066-9b30-3883334340db => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.866 2 INFO nova.compute.manager [-] [instance: 45e7af69-431d-4066-9b30-3883334340db] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:19:42 np0005476733 nova_compute[192580]: 2025-10-08 15:19:42.924 2 INFO nova.compute.manager [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Took 53.08 seconds to build instance.#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.065 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.141 2 DEBUG nova.compute.manager [None req-cf8ad361-2a64-446b-aec1-376f4f01531c - - - - - -] [instance: 45e7af69-431d-4066-9b30-3883334340db] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.271 2 DEBUG oslo_concurrency.lockutils [None req-7cd85c35-36f7-49a5-92a8-358b946419ae c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 53.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.956 2 DEBUG nova.compute.manager [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.956 2 DEBUG oslo_concurrency.lockutils [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.956 2 DEBUG oslo_concurrency.lockutils [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.957 2 DEBUG oslo_concurrency.lockutils [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.957 2 DEBUG nova.compute.manager [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] No waiting events found dispatching network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:19:43 np0005476733 nova_compute[192580]: 2025-10-08 15:19:43.957 2 WARNING nova.compute.manager [req-ad4453de-47a4-4237-aa70-b1848bba85b5 req-b87f6145-0a35-4ef4-9899-124dbecdc42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received unexpected event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:19:44 np0005476733 nova_compute[192580]: 2025-10-08 15:19:44.065 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:44 np0005476733 nova_compute[192580]: 2025-10-08 15:19:44.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:44 np0005476733 nova_compute[192580]: 2025-10-08 15:19:44.901 2 INFO nova.compute.manager [None req-1745e0db-0053-4753-a4e4-ec817887d4c7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:19:44 np0005476733 nova_compute[192580]: 2025-10-08 15:19:44.907 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:45 np0005476733 nova_compute[192580]: 2025-10-08 15:19:45.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:46Z|00074|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:19:46 np0005476733 nova_compute[192580]: 2025-10-08 15:19:46.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:19:47Z|00075|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:19:47 np0005476733 nova_compute[192580]: 2025-10-08 15:19:47.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:48 np0005476733 podman[222311]: 2025-10-08 15:19:48.242104978 +0000 UTC m=+0.064470215 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:19:48 np0005476733 nova_compute[192580]: 2025-10-08 15:19:48.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.118 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759936774.1171453, f83eb34d-9eaa-4ec0-8632-215e0b1ef541 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.119 2 INFO nova.compute.manager [-] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.140 2 DEBUG nova.compute.manager [None req-cd84e8a7-2715-4df6-824c-998f581da011 - - - - - -] [instance: f83eb34d-9eaa-4ec0-8632-215e0b1ef541] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.240 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.241 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.263 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.375 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.377 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.386 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.387 2 INFO nova.compute.claims [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.589 2 DEBUG nova.compute.provider_tree [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 2}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.651 2 ERROR nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [req-a1794415-cd10-429e-9f03-a4267e4093a9] Failed to update inventory to [{'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 2}}] for resource provider with UUID 94652b61-be28-442d-a9f4-cded63837444.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a1794415-cd10-429e-9f03-a4267e4093a9"}]}#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.687 2 DEBUG nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.706 2 DEBUG nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 3, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.707 2 DEBUG nova.compute.provider_tree [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 3, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.731 2 DEBUG nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.760 2 DEBUG nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.824 2 DEBUG nova.compute.provider_tree [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 2}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.877 2 DEBUG nova.scheduler.client.report [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updated inventory for provider 94652b61-be28-442d-a9f4-cded63837444 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15731, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 119, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 2}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.877 2 DEBUG nova.compute.provider_tree [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating resource provider 94652b61-be28-442d-a9f4-cded63837444 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.878 2 DEBUG nova.compute.provider_tree [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.910 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.912 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.968 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.968 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:19:49 np0005476733 nova_compute[192580]: 2025-10-08 15:19:49.990 2 INFO nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.010 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.124 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.126 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.127 2 INFO nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Creating image(s)#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.128 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.129 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.130 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.155 2 INFO nova.compute.manager [None req-2740735c-b5cb-4673-b52d-a80fa1969790 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.157 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.181 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.220 2 DEBUG nova.policy [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.224 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.225 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.225 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.237 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:50 np0005476733 podman[222331]: 2025-10-08 15:19:50.257602887 +0000 UTC m=+0.081599283 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.308 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.309 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.359 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.360 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.361 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.425 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.426 2 DEBUG nova.virt.disk.api [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Checking if we can resize image /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.427 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.492 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.493 2 DEBUG nova.virt.disk.api [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Cannot resize image /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.494 2 DEBUG nova.objects.instance [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'migration_context' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.542 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.543 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Ensure instance console log exists: /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.544 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.544 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:50 np0005476733 nova_compute[192580]: 2025-10-08 15:19:50.545 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:51 np0005476733 podman[222366]: 2025-10-08 15:19:51.274235885 +0000 UTC m=+0.094236676 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 11:19:51 np0005476733 nova_compute[192580]: 2025-10-08 15:19:51.563 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Successfully created port: fc0ee6f6-75ad-486c-8761-a75311199fcb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.025 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.183 2 WARNING nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.183 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid af660b82-9b3c-4c4d-820a-3d22b73898e5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.183 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.184 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.184 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.184 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:19:52 np0005476733 nova_compute[192580]: 2025-10-08 15:19:52.394 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:19:53 np0005476733 nova_compute[192580]: 2025-10-08 15:19:53.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:54 np0005476733 nova_compute[192580]: 2025-10-08 15:19:54.425 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Successfully updated port: fc0ee6f6-75ad-486c-8761-a75311199fcb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:19:54 np0005476733 nova_compute[192580]: 2025-10-08 15:19:54.491 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:19:54 np0005476733 nova_compute[192580]: 2025-10-08 15:19:54.491 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquired lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:19:54 np0005476733 nova_compute[192580]: 2025-10-08 15:19:54.492 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:19:54 np0005476733 nova_compute[192580]: 2025-10-08 15:19:54.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:55 np0005476733 podman[222393]: 2025-10-08 15:19:55.237116921 +0000 UTC m=+0.058946616 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Oct  8 11:19:55 np0005476733 nova_compute[192580]: 2025-10-08 15:19:55.313 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:19:55 np0005476733 nova_compute[192580]: 2025-10-08 15:19:55.627 2 DEBUG nova.compute.manager [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-changed-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:19:55 np0005476733 nova_compute[192580]: 2025-10-08 15:19:55.627 2 DEBUG nova.compute.manager [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Refreshing instance network info cache due to event network-changed-fc0ee6f6-75ad-486c-8761-a75311199fcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:19:55 np0005476733 nova_compute[192580]: 2025-10-08 15:19:55.628 2 DEBUG oslo_concurrency.lockutils [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:19:58 np0005476733 nova_compute[192580]: 2025-10-08 15:19:58.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.142 2 INFO nova.compute.manager [None req-0aa1cd5c-a409-441b-a634-49a81baef4f2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.148 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.647 2 DEBUG nova.network.neutron [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.987 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Releasing lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.988 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance network_info: |[{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.990 2 DEBUG oslo_concurrency.lockutils [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.991 2 DEBUG nova.network.neutron [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Refreshing network info cache for port fc0ee6f6-75ad-486c-8761-a75311199fcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:19:59 np0005476733 nova_compute[192580]: 2025-10-08 15:19:59.995 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start _get_guest_xml network_info=[{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.001 2 WARNING nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.008 2 DEBUG nova.virt.libvirt.host [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.009 2 DEBUG nova.virt.libvirt.host [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.015 2 DEBUG nova.virt.libvirt.host [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.017 2 DEBUG nova.virt.libvirt.host [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.018 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.018 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.019 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.019 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.020 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.021 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.021 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.022 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.022 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.023 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.023 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.024 2 DEBUG nova.virt.hardware [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.027 2 DEBUG nova.virt.libvirt.vif [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio
',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:19:50Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.027 2 DEBUG nova.network.os_vif_util [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.028 2 DEBUG nova.network.os_vif_util [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.029 2 DEBUG nova.objects.instance [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.050 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <uuid>c4b45a9c-73a5-4b51-ab96-874507f4c028</uuid>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <name>instance-00000007</name>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-275740212</nova:name>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:20:00</nova:creationTime>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:user uuid="71a7f2d2441447b2bbd1b677555d68cc">tempest-NetworkBasicTest-1891752524-project-member</nova:user>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:project uuid="7762962015674dfb9038a135559a61f3">tempest-NetworkBasicTest-1891752524</nova:project>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        <nova:port uuid="fc0ee6f6-75ad-486c-8761-a75311199fcb">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="serial">c4b45a9c-73a5-4b51-ab96-874507f4c028</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="uuid">c4b45a9c-73a5-4b51-ab96-874507f4c028</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:3f:49:16"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <target dev="tapfc0ee6f6-75"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/console.log" append="off"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:20:00 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:20:00 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:20:00 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:20:00 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.051 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Preparing to wait for external event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.051 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.052 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.052 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.053 2 DEBUG nova.virt.libvirt.vif [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mod
el='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:19:50Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.053 2 DEBUG nova.network.os_vif_util [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.054 2 DEBUG nova.network.os_vif_util [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.054 2 DEBUG os_vif [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.055 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc0ee6f6-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.059 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc0ee6f6-75, col_values=(('external_ids', {'iface-id': 'fc0ee6f6-75ad-486c-8761-a75311199fcb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:49:16', 'vm-uuid': 'c4b45a9c-73a5-4b51-ab96-874507f4c028'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:00 np0005476733 NetworkManager[51699]: <info>  [1759936800.0611] manager: (tapfc0ee6f6-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.071 2 INFO os_vif [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75')#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.130 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.130 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.130 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] No VIF found with MAC fa:16:3e:3f:49:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.131 2 INFO nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Using config drive#033[00m
Oct  8 11:20:00 np0005476733 podman[222423]: 2025-10-08 15:20:00.163948662 +0000 UTC m=+0.050718165 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:20:00 np0005476733 podman[222422]: 2025-10-08 15:20:00.181943487 +0000 UTC m=+0.071489208 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.784 2 INFO nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Creating config drive at /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.795 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdr5ag89u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:00 np0005476733 nova_compute[192580]: 2025-10-08 15:20:00.928 2 DEBUG oslo_concurrency.processutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdr5ag89u" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:01 np0005476733 kernel: tapfc0ee6f6-75: entered promiscuous mode
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.0012] manager: (tapfc0ee6f6-75): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Oct  8 11:20:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:01Z|00076|binding|INFO|Claiming lport fc0ee6f6-75ad-486c-8761-a75311199fcb for this chassis.
Oct  8 11:20:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:01Z|00077|binding|INFO|fc0ee6f6-75ad-486c-8761-a75311199fcb: Claiming fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.059 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.062 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec bound to our chassis#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.065 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec#033[00m
Oct  8 11:20:01 np0005476733 systemd-machined[152624]: New machine qemu-4-instance-00000007.
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.082 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[83cc0969-cdd9-48d6-86c8-13a78c1a4ef6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.083 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6683d4f6-e1 in ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.085 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6683d4f6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.085 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[82a93fc6-b449-4704-ac4f-2e2c5d60c3f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.086 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4458a77f-60fd-4cad-a4f0-fdec200f8f22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.106 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c23bff-a117-4ef9-89c9-477a43c28be5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:01Z|00078|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb ovn-installed in OVS
Oct  8 11:20:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:01Z|00079|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb up in Southbound
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.131 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[de106dea-2b5f-428c-bc3d-183ff6233e20]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 systemd-udevd[222484]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.1519] device (tapfc0ee6f6-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.1530] device (tapfc0ee6f6-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.166 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[cd40a7ab-6669-41a8-a1d7-a210d3ed0473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.175 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[07b5a3d4-08e3-4175-a1f1-8fe63f911c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.1773] manager: (tap6683d4f6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Oct  8 11:20:01 np0005476733 systemd-udevd[222491]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.216 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[222bedcf-0416-42ac-aace-1a9bafc845d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.218 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f8fa809a-ccf1-4c0f-8387-01c4fc48cb89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.2404] device (tap6683d4f6-e0): carrier: link connected
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.245 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[591504c2-3c86-416c-acf7-314e3f29bd21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.261 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[75060e54-cc9a-42df-835c-ac083276f9ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370490, 'reachable_time': 17147, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222514, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.275 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[67b3e389-8813-46af-b4bf-96762ebfc241]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fdad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 370490, 'tstamp': 370490}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222515, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.291 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb194ea-c9fa-45ee-b3c1-986e654657ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370490, 'reachable_time': 17147, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222516, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.323 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4088aa09-a742-4cc1-b237-26ff799dfb05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.380 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4b85a689-b3c4-4064-bd38-eb863f80521f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.381 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.381 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.382 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6683d4f6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 NetworkManager[51699]: <info>  [1759936801.3846] manager: (tap6683d4f6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct  8 11:20:01 np0005476733 kernel: tap6683d4f6-e0: entered promiscuous mode
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.387 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6683d4f6-e0, col_values=(('external_ids', {'iface-id': '02c54a14-9b9f-4195-ba94-66a72c7333c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:01Z|00080|binding|INFO|Releasing lport 02c54a14-9b9f-4195-ba94-66a72c7333c9 from this chassis (sb_readonly=0)
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.406 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.407 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b429ab35-ccf7-406a-9167-c44a7d9d07bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.408 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:20:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:01.408 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'env', 'PROCESS_TAG=haproxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:20:01 np0005476733 podman[222554]: 2025-10-08 15:20:01.815200562 +0000 UTC m=+0.063583106 container create 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 11:20:01 np0005476733 systemd[1]: Started libpod-conmon-1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad.scope.
Oct  8 11:20:01 np0005476733 podman[222554]: 2025-10-08 15:20:01.78169716 +0000 UTC m=+0.030079714 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:20:01 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:20:01 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6963633d9a2ebea002beb8b19a7afc222d85664d738b514f15df167d0b99cd19/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:20:01 np0005476733 podman[222554]: 2025-10-08 15:20:01.906739831 +0000 UTC m=+0.155122395 container init 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:20:01 np0005476733 podman[222554]: 2025-10-08 15:20:01.911602237 +0000 UTC m=+0.159984771 container start 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:20:01 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [NOTICE]   (222574) : New worker (222576) forked
Oct  8 11:20:01 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [NOTICE]   (222574) : Loading success.
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.943 2 DEBUG nova.compute.manager [req-cbc8bd91-9f16-420f-8097-27bc8c3fa909 req-1ee8ba02-22e3-4f5d-a803-a8a9a9948864 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.943 2 DEBUG oslo_concurrency.lockutils [req-cbc8bd91-9f16-420f-8097-27bc8c3fa909 req-1ee8ba02-22e3-4f5d-a803-a8a9a9948864 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.943 2 DEBUG oslo_concurrency.lockutils [req-cbc8bd91-9f16-420f-8097-27bc8c3fa909 req-1ee8ba02-22e3-4f5d-a803-a8a9a9948864 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.944 2 DEBUG oslo_concurrency.lockutils [req-cbc8bd91-9f16-420f-8097-27bc8c3fa909 req-1ee8ba02-22e3-4f5d-a803-a8a9a9948864 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.944 2 DEBUG nova.compute.manager [req-cbc8bd91-9f16-420f-8097-27bc8c3fa909 req-1ee8ba02-22e3-4f5d-a803-a8a9a9948864 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Processing event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.977 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.978 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936801.975644, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.978 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Started (Lifecycle Event)#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.981 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.984 2 INFO nova.virt.libvirt.driver [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance spawned successfully.#033[00m
Oct  8 11:20:01 np0005476733 nova_compute[192580]: 2025-10-08 15:20:01.984 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.014 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.020 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.024 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.025 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.025 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.026 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.026 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.027 2 DEBUG nova.virt.libvirt.driver [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.064 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.065 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936801.9776902, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.066 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.098 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.102 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936801.9806552, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.103 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.110 2 INFO nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Took 11.99 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.110 2 DEBUG nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.121 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.124 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.162 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.218 2 INFO nova.compute.manager [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Took 12.88 seconds to build instance.#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.237 2 DEBUG oslo_concurrency.lockutils [None req-10953de2-cc47-4797-bdaf-993f809f004a 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.238 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.238 2 INFO nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.239 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.241 2 DEBUG nova.network.neutron [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updated VIF entry in instance network info cache for port fc0ee6f6-75ad-486c-8761-a75311199fcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.241 2 DEBUG nova.network.neutron [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:02 np0005476733 nova_compute[192580]: 2025-10-08 15:20:02.263 2 DEBUG oslo_concurrency.lockutils [req-58fab194-1a2e-43bb-b750-6db4bae21412 req-d96a66f3-8d81-4f54-902a-c27122349dbf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:03 np0005476733 nova_compute[192580]: 2025-10-08 15:20:03.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:03.388 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:03 np0005476733 nova_compute[192580]: 2025-10-08 15:20:03.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:03.390 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:20:03 np0005476733 nova_compute[192580]: 2025-10-08 15:20:03.763 2 INFO nova.compute.manager [None req-ec688d47-3714-4e44-a0ba-0fb7d09b19b6 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Get console output#033[00m
Oct  8 11:20:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:04Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7e:98:72 10.100.0.8
Oct  8 11:20:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:04Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7e:98:72 10.100.0.8
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.628 2 DEBUG nova.compute.manager [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.629 2 DEBUG oslo_concurrency.lockutils [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.629 2 DEBUG oslo_concurrency.lockutils [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.629 2 DEBUG oslo_concurrency.lockutils [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.630 2 DEBUG nova.compute.manager [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.630 2 WARNING nova.compute.manager [req-fc14efc5-2243-49e2-afdc-39e5a112a8e4 req-7fac87ad-3d25-4158-a244-7a91e503c50f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.704 2 INFO nova.compute.manager [None req-edea5085-17d0-4f03-8c11-d09b44515675 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:20:04 np0005476733 nova_compute[192580]: 2025-10-08 15:20:04.710 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:05 np0005476733 nova_compute[192580]: 2025-10-08 15:20:05.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:08 np0005476733 podman[222585]: 2025-10-08 15:20:08.280214014 +0000 UTC m=+0.089482525 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 11:20:08 np0005476733 nova_compute[192580]: 2025-10-08 15:20:08.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:08.393 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:08 np0005476733 nova_compute[192580]: 2025-10-08 15:20:08.937 2 INFO nova.compute.manager [None req-1e556ffb-96a4-4a49-8f81-301124a3e381 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Get console output#033[00m
Oct  8 11:20:08 np0005476733 nova_compute[192580]: 2025-10-08 15:20:08.944 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:09 np0005476733 systemd-logind[827]: New session 34 of user zuul.
Oct  8 11:20:09 np0005476733 systemd[1]: Started Session 34 of User zuul.
Oct  8 11:20:09 np0005476733 systemd[1]: session-34.scope: Deactivated successfully.
Oct  8 11:20:09 np0005476733 systemd-logind[827]: Session 34 logged out. Waiting for processes to exit.
Oct  8 11:20:09 np0005476733 systemd-logind[827]: Removed session 34.
Oct  8 11:20:10 np0005476733 nova_compute[192580]: 2025-10-08 15:20:10.000 2 INFO nova.compute.manager [None req-8b12be66-d1ee-499f-93e1-a6483e9366fd c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:20:10 np0005476733 nova_compute[192580]: 2025-10-08 15:20:10.009 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:10 np0005476733 nova_compute[192580]: 2025-10-08 15:20:10.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:11 np0005476733 podman[222632]: 2025-10-08 15:20:11.241807304 +0000 UTC m=+0.066239712 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:20:13 np0005476733 nova_compute[192580]: 2025-10-08 15:20:13.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:14Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:14Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:14 np0005476733 nova_compute[192580]: 2025-10-08 15:20:14.102 2 INFO nova.compute.manager [None req-adfb795e-c586-4f25-9e6b-b7b9ccda1932 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Get console output#033[00m
Oct  8 11:20:14 np0005476733 nova_compute[192580]: 2025-10-08 15:20:14.107 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:15 np0005476733 nova_compute[192580]: 2025-10-08 15:20:15.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:15 np0005476733 nova_compute[192580]: 2025-10-08 15:20:15.744 2 INFO nova.compute.manager [None req-19100381-0987-4378-8fa0-e020b5690ce9 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Get console output#033[00m
Oct  8 11:20:15 np0005476733 nova_compute[192580]: 2025-10-08 15:20:15.750 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:15 np0005476733 nova_compute[192580]: 2025-10-08 15:20:15.754 2 INFO nova.virt.libvirt.driver [None req-19100381-0987-4378-8fa0-e020b5690ce9 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Truncated console log returned, 3362 bytes ignored#033[00m
Oct  8 11:20:18 np0005476733 nova_compute[192580]: 2025-10-08 15:20:18.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:19 np0005476733 podman[222695]: 2025-10-08 15:20:19.244145659 +0000 UTC m=+0.068415910 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:20:19 np0005476733 nova_compute[192580]: 2025-10-08 15:20:19.268 2 INFO nova.compute.manager [None req-13438cb3-0fef-4e9e-9488-1364aa8faaa7 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Get console output#033[00m
Oct  8 11:20:19 np0005476733 nova_compute[192580]: 2025-10-08 15:20:19.273 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:19 np0005476733 nova_compute[192580]: 2025-10-08 15:20:19.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:19 np0005476733 NetworkManager[51699]: <info>  [1759936819.8454] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct  8 11:20:19 np0005476733 NetworkManager[51699]: <info>  [1759936819.8467] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct  8 11:20:20 np0005476733 nova_compute[192580]: 2025-10-08 15:20:20.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:20 np0005476733 nova_compute[192580]: 2025-10-08 15:20:20.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:20Z|00081|binding|INFO|Releasing lport 02c54a14-9b9f-4195-ba94-66a72c7333c9 from this chassis (sb_readonly=0)
Oct  8 11:20:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:20Z|00082|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:20:20 np0005476733 nova_compute[192580]: 2025-10-08 15:20:20.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:21Z|00083|pinctrl|WARN|Dropped 3311 log messages in last 55 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:20:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:21Z|00084|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:20:21 np0005476733 podman[222715]: 2025-10-08 15:20:21.237535021 +0000 UTC m=+0.070988333 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.848 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.848 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.849 2 INFO nova.compute.manager [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Rebooting instance#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.869 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.870 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquired lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:21 np0005476733 nova_compute[192580]: 2025-10-08 15:20:21.870 2 DEBUG nova.network.neutron [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:20:22 np0005476733 podman[222735]: 2025-10-08 15:20:22.277735524 +0000 UTC m=+0.107889245 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Oct  8 11:20:23 np0005476733 nova_compute[192580]: 2025-10-08 15:20:23.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.262 2 DEBUG nova.compute.manager [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-changed-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.263 2 DEBUG nova.compute.manager [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Refreshing instance network info cache due to event network-changed-fc0ee6f6-75ad-486c-8761-a75311199fcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.264 2 DEBUG oslo_concurrency.lockutils [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.367 2 DEBUG nova.compute.manager [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-changed-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.367 2 DEBUG nova.compute.manager [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Refreshing instance network info cache due to event network-changed-1f764678-f4b9-420d-b072-8c0f7c3534a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.367 2 DEBUG oslo_concurrency.lockutils [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.368 2 DEBUG oslo_concurrency.lockutils [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:24 np0005476733 nova_compute[192580]: 2025-10-08 15:20:24.368 2 DEBUG nova.network.neutron [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Refreshing network info cache for port 1f764678-f4b9-420d-b072-8c0f7c3534a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.034 2 DEBUG nova.network.neutron [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.124 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Releasing lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.126 2 DEBUG nova.compute.manager [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.127 2 DEBUG oslo_concurrency.lockutils [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:25 np0005476733 nova_compute[192580]: 2025-10-08 15:20:25.128 2 DEBUG nova.network.neutron [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Refreshing network info cache for port fc0ee6f6-75ad-486c-8761-a75311199fcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:20:26 np0005476733 podman[222762]: 2025-10-08 15:20:26.234881378 +0000 UTC m=+0.055911221 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, 
io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  8 11:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:26.302 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:26.303 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:26.304 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:26 np0005476733 nova_compute[192580]: 2025-10-08 15:20:26.348 2 DEBUG nova.network.neutron [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updated VIF entry in instance network info cache for port 1f764678-f4b9-420d-b072-8c0f7c3534a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:20:26 np0005476733 nova_compute[192580]: 2025-10-08 15:20:26.349 2 DEBUG nova.network.neutron [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:26 np0005476733 nova_compute[192580]: 2025-10-08 15:20:26.591 2 DEBUG oslo_concurrency.lockutils [req-4f48e58b-f788-40c3-a8e9-546e5383fd3b req-e88233c7-dcd9-4bd9-afca-fbb87c5293b0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.292 2 DEBUG nova.network.neutron [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updated VIF entry in instance network info cache for port fc0ee6f6-75ad-486c-8761-a75311199fcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.293 2 DEBUG nova.network.neutron [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.321 2 DEBUG oslo_concurrency.lockutils [req-6ae75d86-a46f-45e2-a94b-a3196419c244 req-b72a6dca-6fc1-4ed8-80d1-8f21cfd6dc89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:27 np0005476733 kernel: tapfc0ee6f6-75 (unregistering): left promiscuous mode
Oct  8 11:20:27 np0005476733 NetworkManager[51699]: <info>  [1759936827.4402] device (tapfc0ee6f6-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00085|binding|INFO|Releasing lport fc0ee6f6-75ad-486c-8761-a75311199fcb from this chassis (sb_readonly=0)
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00086|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb down in Southbound
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00087|binding|INFO|Removing iface tapfc0ee6f6-75 ovn-installed in OVS
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.459 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.460 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec unbound from our chassis#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.464 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.465 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0af43b-dc7e-4996-bd8b-8ab874501225]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.465 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace which is not needed anymore#033[00m
Oct  8 11:20:27 np0005476733 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct  8 11:20:27 np0005476733 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 12.929s CPU time.
Oct  8 11:20:27 np0005476733 systemd-machined[152624]: Machine qemu-4-instance-00000007 terminated.
Oct  8 11:20:27 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [NOTICE]   (222574) : haproxy version is 2.8.14-c23fe91
Oct  8 11:20:27 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [NOTICE]   (222574) : path to executable is /usr/sbin/haproxy
Oct  8 11:20:27 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [WARNING]  (222574) : Exiting Master process...
Oct  8 11:20:27 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [ALERT]    (222574) : Current worker (222576) exited with code 143 (Terminated)
Oct  8 11:20:27 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222570]: [WARNING]  (222574) : All workers exited. Exiting... (0)
Oct  8 11:20:27 np0005476733 systemd[1]: libpod-1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad.scope: Deactivated successfully.
Oct  8 11:20:27 np0005476733 podman[222804]: 2025-10-08 15:20:27.587993855 +0000 UTC m=+0.046104507 container died 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 11:20:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad-userdata-shm.mount: Deactivated successfully.
Oct  8 11:20:27 np0005476733 systemd[1]: var-lib-containers-storage-overlay-6963633d9a2ebea002beb8b19a7afc222d85664d738b514f15df167d0b99cd19-merged.mount: Deactivated successfully.
Oct  8 11:20:27 np0005476733 podman[222804]: 2025-10-08 15:20:27.620226527 +0000 UTC m=+0.078337179 container cleanup 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 11:20:27 np0005476733 systemd[1]: libpod-conmon-1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad.scope: Deactivated successfully.
Oct  8 11:20:27 np0005476733 podman[222833]: 2025-10-08 15:20:27.682414327 +0000 UTC m=+0.041521089 container remove 1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:20:27 np0005476733 kernel: tapfc0ee6f6-75: entered promiscuous mode
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.689 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3b2b38-dc03-40f5-973e-8e11ef2bf1ec]: (4, ('Wed Oct  8 03:20:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad)\n1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad\nWed Oct  8 03:20:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad)\n1aca3e504afe745cc1410170f2bf1331a72c516d93c79bc42c3779408e8660ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 NetworkManager[51699]: <info>  [1759936827.6905] manager: (tapfc0ee6f6-75): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00088|binding|INFO|Claiming lport fc0ee6f6-75ad-486c-8761-a75311199fcb for this chassis.
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00089|binding|INFO|fc0ee6f6-75ad-486c-8761-a75311199fcb: Claiming fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:27 np0005476733 kernel: tapfc0ee6f6-75 (unregistering): left promiscuous mode
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.692 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8a72a124-dfb4-4d9f-b3f5-1bacf5a24983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.694 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.707 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00090|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb ovn-installed in OVS
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00091|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb up in Southbound
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00092|binding|INFO|Releasing lport fc0ee6f6-75ad-486c-8761-a75311199fcb from this chassis (sb_readonly=1)
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00093|if_status|INFO|Not setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb down as sb is readonly
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00094|binding|INFO|Removing iface tapfc0ee6f6-75 ovn-installed in OVS
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 kernel: tap6683d4f6-e0: left promiscuous mode
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00095|binding|INFO|Releasing lport fc0ee6f6-75ad-486c-8761-a75311199fcb from this chassis (sb_readonly=0)
Oct  8 11:20:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:27Z|00096|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb down in Southbound
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.733 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 nova_compute[192580]: 2025-10-08 15:20:27.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.744 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[efa16038-1836-4e23-8e29-42a393fb1c25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.777 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7bea9510-7a98-4489-bb59-2117d766755b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.778 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[74306b4b-149a-420a-a7bb-a70599a02270]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.798 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3086c0fd-5570-407f-b25c-2566db1c0ea8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370482, 'reachable_time': 39660, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222863, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.801 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.801 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[8dca5083-40b2-4a5c-b2c7-a904ff043ca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.802 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec unbound from our chassis#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.803 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:20:27 np0005476733 systemd[1]: run-netns-ovnmeta\x2d6683d4f6\x2de609\x2d48e8\x2dbf45\x2df31b3fa1d7ec.mount: Deactivated successfully.
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.804 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8712ae-9d82-42bd-8b40-c6a25b64db24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.804 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec unbound from our chassis#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.806 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:20:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:27.806 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5ada7777-6e74-43b8-bb5b-1ef156635dc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.307 2 INFO nova.virt.libvirt.driver [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance shutdown successfully.#033[00m
Oct  8 11:20:28 np0005476733 kernel: tapfc0ee6f6-75: entered promiscuous mode
Oct  8 11:20:28 np0005476733 systemd-udevd[222787]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.4254] manager: (tapfc0ee6f6-75): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Oct  8 11:20:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:28Z|00097|binding|INFO|Claiming lport fc0ee6f6-75ad-486c-8761-a75311199fcb for this chassis.
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:28Z|00098|binding|INFO|fc0ee6f6-75ad-486c-8761-a75311199fcb: Claiming fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.4283] device (tapfc0ee6f6-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.4290] device (tapfc0ee6f6-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.434 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.435 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec bound to our chassis#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.437 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec#033[00m
Oct  8 11:20:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:28Z|00099|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb ovn-installed in OVS
Oct  8 11:20:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:28Z|00100|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb up in Southbound
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.449 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5729d0-bc9b-4de2-b466-fbc9c95ddeb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.450 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6683d4f6-e1 in ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.451 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6683d4f6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.451 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[53ebb20f-71f1-470c-a6d9-c57b5640ae39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.452 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc58d92-7856-4bc9-b30a-2bfc2cc763d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 systemd-machined[152624]: New machine qemu-5-instance-00000007.
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.464 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f14a7782-3faa-4b8e-98d1-8d94a7d5101a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.491 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[561403d2-401c-4fd3-9eee-a50f45b968c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.523 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a99a4037-0593-4e1a-83ea-8c21114e590c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.529 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c795eb1b-0a4b-4417-9914-2330f6cc7afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.5329] manager: (tap6683d4f6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.567 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f52147-a6e0-4d25-a4c6-388aecd394b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.570 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8e31b6a5-2895-4cc7-86de-f8baa5db6339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.5956] device (tap6683d4f6-e0): carrier: link connected
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.602 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bb72d4eb-eb3e-4057-a40f-441d14a86564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.618 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f127277d-8938-4134-9368-eb89c8adf91a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373225, 'reachable_time': 41437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222910, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.635 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[78b522d3-c22e-4d0a-9cfe-116891247c0d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fdad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373225, 'tstamp': 373225}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222912, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.656 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[71ff1144-b3b2-4869-b389-830f01846754]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373225, 'reachable_time': 41437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222915, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.693 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[61a55d82-0630-496f-876c-4676a6b7e3b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.755 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d0370c62-62a0-47c2-a14d-335ab52d4601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.756 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.757 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.758 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6683d4f6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 NetworkManager[51699]: <info>  [1759936828.7616] manager: (tap6683d4f6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct  8 11:20:28 np0005476733 kernel: tap6683d4f6-e0: entered promiscuous mode
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.764 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6683d4f6-e0, col_values=(('external_ids', {'iface-id': '02c54a14-9b9f-4195-ba94-66a72c7333c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:28Z|00101|binding|INFO|Releasing lport 02c54a14-9b9f-4195-ba94-66a72c7333c9 from this chassis (sb_readonly=0)
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 nova_compute[192580]: 2025-10-08 15:20:28.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.793 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.795 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ee6c8b-a066-4249-bb2b-83cf48f9ff5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.796 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  8 11:20:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:28.796 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'env', 'PROCESS_TAG=haproxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.176 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Removed pending event for c4b45a9c-73a5-4b51-ab96-874507f4c028 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.177 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936829.1755302, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.177 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Resumed (Lifecycle Event)
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.182 2 INFO nova.virt.libvirt.driver [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance running successfully.
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.182 2 INFO nova.virt.libvirt.driver [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance soft rebooted successfully.
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.183 2 DEBUG nova.compute.manager [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:20:29 np0005476733 podman[222951]: 2025-10-08 15:20:29.183225833 +0000 UTC m=+0.056293233 container create 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:20:29 np0005476733 systemd[1]: Started libpod-conmon-9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9.scope.
Oct  8 11:20:29 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:20:29 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c41eb8b4c2d83699168fa1690999f87d065cd8d0d3ba98003a80c587d8a6c90a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:20:29 np0005476733 podman[222951]: 2025-10-08 15:20:29.152972105 +0000 UTC m=+0.026039525 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:20:29 np0005476733 podman[222951]: 2025-10-08 15:20:29.252968315 +0000 UTC m=+0.126035715 container init 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:20:29 np0005476733 podman[222951]: 2025-10-08 15:20:29.258828463 +0000 UTC m=+0.131895853 container start 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:20:29 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [NOTICE]   (222970) : New worker (222972) forked
Oct  8 11:20:29 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [NOTICE]   (222970) : Loading success.
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.419 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.423 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.485 2 DEBUG oslo_concurrency.lockutils [None req-68a10be9-5acc-490b-b3bf-c00a969cd019 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.502 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936829.1766121, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.503 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Started (Lifecycle Event)
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.594 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:20:29 np0005476733 nova_compute[192580]: 2025-10-08 15:20:29.598 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 11:20:30 np0005476733 nova_compute[192580]: 2025-10-08 15:20:30.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:20:31 np0005476733 podman[223017]: 2025-10-08 15:20:31.249128345 +0000 UTC m=+0.061289452 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:20:31 np0005476733 podman[223016]: 2025-10-08 15:20:31.25331808 +0000 UTC m=+0.063154003 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.415 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.416 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.440 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.534 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.535 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.543 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.544 2 INFO nova.compute.claims [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Claim successful on node compute-1.ctlplane.example.com
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.650 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.753 2 DEBUG nova.compute.provider_tree [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.805 2 DEBUG nova.scheduler.client.report [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.923 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.924 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.926 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.927 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:32 np0005476733 nova_compute[192580]: 2025-10-08 15:20:32.927 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.072 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.073 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.097 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.129 2 INFO nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.168 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.169 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.197 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.232 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.240 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.300 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.301 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.328 2 DEBUG nova.policy [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.373 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.376 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.377 2 INFO nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Creating image(s)
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.379 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.379 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.380 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.407 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.425 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.457 2 DEBUG nova.compute.manager [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.458 2 DEBUG oslo_concurrency.lockutils [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.458 2 DEBUG oslo_concurrency.lockutils [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.458 2 DEBUG oslo_concurrency.lockutils [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.459 2 DEBUG nova.compute.manager [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.459 2 WARNING nova.compute.manager [req-41046327-c889-4126-8a62-f1f82b583201 req-019dc40b-e8dc-4921-b2f8-9244070a7009 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.480 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.481 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.481 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.491 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.548 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.549 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.581 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk 10737418240" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.582 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.583 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.645 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.646 2 DEBUG nova.objects.instance [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'migration_context' on Instance uuid a71ee5d2-21b8-4455-8870-f20bed682909 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.669 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.669 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Ensure instance console log exists: /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.669 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.670 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.670 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.679 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.680 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12844MB free_disk=111.17750930786133GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.681 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.681 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.771 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af660b82-9b3c-4c4d-820a-3d22b73898e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.771 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance c4b45a9c-73a5-4b51-ab96-874507f4c028 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.772 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a71ee5d2-21b8-4455-8870-f20bed682909 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.772 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.772 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2688MB phys_disk=119GB used_disk=21GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.865 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.884 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.929 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:20:33 np0005476733 nova_compute[192580]: 2025-10-08 15:20:33.929 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:34 np0005476733 nova_compute[192580]: 2025-10-08 15:20:34.323 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Successfully created port: f66c148b-4cbb-4cdd-8196-6513d7c5ff78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.263 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Successfully updated port: f66c148b-4cbb-4cdd-8196-6513d7c5ff78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.287 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.288 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquired lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.288 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.376 2 DEBUG nova.compute.manager [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-changed-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.377 2 DEBUG nova.compute.manager [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Refreshing instance network info cache due to event network-changed-f66c148b-4cbb-4cdd-8196-6513d7c5ff78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.378 2 DEBUG oslo_concurrency.lockutils [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.433 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.556 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.556 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.557 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.557 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.557 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.557 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.558 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.558 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.558 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.559 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.559 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.559 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.559 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.560 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.560 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.560 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.561 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.561 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.561 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.561 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.561 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.562 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.562 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.562 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.563 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.563 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.563 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.563 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.564 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.564 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.564 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.564 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.565 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.565 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.565 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.566 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.566 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.566 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.566 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.567 2 DEBUG oslo_concurrency.lockutils [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.567 2 DEBUG nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:35 np0005476733 nova_compute[192580]: 2025-10-08 15:20:35.567 2 WARNING nova.compute.manager [req-2fe81315-846d-4120-9245-c1b20a3db425 req-4e38eb2a-d3e9-40e2-b293-45e82ea10775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.235 2 DEBUG nova.network.neutron [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.308 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Releasing lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.309 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Instance network_info: |[{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.310 2 DEBUG oslo_concurrency.lockutils [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.310 2 DEBUG nova.network.neutron [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Refreshing network info cache for port f66c148b-4cbb-4cdd-8196-6513d7c5ff78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.313 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Start _get_guest_xml network_info=[{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.317 2 WARNING nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.323 2 DEBUG nova.virt.libvirt.host [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.324 2 DEBUG nova.virt.libvirt.host [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.327 2 DEBUG nova.virt.libvirt.host [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.327 2 DEBUG nova.virt.libvirt.host [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.328 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.328 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.328 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.329 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.329 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.329 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.330 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.330 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.330 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.331 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.331 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.331 2 DEBUG nova.virt.hardware [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.335 2 DEBUG nova.virt.libvirt.vif [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-542526277',display_name='tempest-test_flooding_when_special_groups-542526277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-542526277',id=9,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-z4n1a4rs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:20:33Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=a71ee5d2-21b8-4455-8870-f20bed682909,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.335 2 DEBUG nova.network.os_vif_util [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.337 2 DEBUG nova.network.os_vif_util [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.338 2 DEBUG nova.objects.instance [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a71ee5d2-21b8-4455-8870-f20bed682909 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.379 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <uuid>a71ee5d2-21b8-4455-8870-f20bed682909</uuid>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <name>instance-00000009</name>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_flooding_when_special_groups-542526277</nova:name>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:20:36</nova:creationTime>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:user uuid="f03335a379bd4afdbbd7b9cc7cae27e0">tempest-MulticastTestIPv4Common-178854047-project-member</nova:user>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:project uuid="7f2acdb26a5a4269a4b1e407da7722c3">tempest-MulticastTestIPv4Common-178854047</nova:project>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        <nova:port uuid="f66c148b-4cbb-4cdd-8196-6513d7c5ff78">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="serial">a71ee5d2-21b8-4455-8870-f20bed682909</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="uuid">a71ee5d2-21b8-4455-8870-f20bed682909</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.379 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb37b4088f4e830c4c7174815b977c814c8dd3669b8fc5006cfda63dcd5ae1c1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.config"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:77:9d:93"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <target dev="tapf66c148b-4c"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/console.log" append="off"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:20:36 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:20:36 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:20:36 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:20:36 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.386 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Preparing to wait for external event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.386 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.386 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.387 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.387 2 DEBUG nova.virt.libvirt.vif [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-542526277',display_name='tempest-test_flooding_when_special_groups-542526277',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-542526277',id=9,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-z4n1a4rs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:20:33Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=a71ee5d2-21b8-4455-8870-f20bed682909,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.388 2 DEBUG nova.network.os_vif_util [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.388 2 DEBUG nova.network.os_vif_util [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.388 2 DEBUG os_vif [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.389 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.395 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf66c148b-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.395 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf66c148b-4c, col_values=(('external_ids', {'iface-id': 'f66c148b-4cbb-4cdd-8196-6513d7c5ff78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:9d:93', 'vm-uuid': 'a71ee5d2-21b8-4455-8870-f20bed682909'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:36 np0005476733 NetworkManager[51699]: <info>  [1759936836.3982] manager: (tapf66c148b-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.406 2 INFO os_vif [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c')#033[00m
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.465 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 972 Content-Type: application/json Date: Wed, 08 Oct 2025 15:20:36 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a6874a07-1dee-4808-aa82-c06103499c77 x-openstack-request-id: req-a6874a07-1dee-4808-aa82-c06103499c77 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.465 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "22222222-2222-2222-2222-222222222222", "name": "custom_neutron_guest", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/22222222-2222-2222-2222-222222222222"}]}, {"id": "320a7bb4-cef4-4b24-b163-c19d971d4760", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/320a7bb4-cef4-4b24-b163-c19d971d4760"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/320a7bb4-cef4-4b24-b163-c19d971d4760"}]}, {"id": "987b2db7-1d21-4b59-831a-1e8ace40589b", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.465 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a6874a07-1dee-4808-aa82-c06103499c77 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.467 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb37b4088f4e830c4c7174815b977c814c8dd3669b8fc5006cfda63dcd5ae1c1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.468 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.469 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.469 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No VIF found with MAC fa:16:3e:77:9d:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:20:36 np0005476733 nova_compute[192580]: 2025-10-08 15:20:36.470 2 INFO nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Using config drive#033[00m
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.537 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 510 Content-Type: application/json Date: Wed, 08 Oct 2025 15:20:36 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d5fba96d-5b18-48ad-b77e-af73ef594e04 x-openstack-request-id: req-d5fba96d-5b18-48ad-b77e-af73ef594e04 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.538 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "22222222-2222-2222-2222-222222222222", "name": "custom_neutron_guest", "ram": 1024, "disk": 10, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/22222222-2222-2222-2222-222222222222"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.538 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222 used request id req-d5fba96d-5b18-48ad-b77e-af73ef594e04 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.539 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.541 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb37b4088f4e830c4c7174815b977c814c8dd3669b8fc5006cfda63dcd5ae1c1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.612 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 972 Content-Type: application/json Date: Wed, 08 Oct 2025 15:20:36 GMT Keep-Alive: timeout=5, max=98 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-bcab6508-4581-4088-8bbc-2bb6735762d3 x-openstack-request-id: req-bcab6508-4581-4088-8bbc-2bb6735762d3 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.613 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "22222222-2222-2222-2222-222222222222", "name": "custom_neutron_guest", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/22222222-2222-2222-2222-222222222222"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/22222222-2222-2222-2222-222222222222"}]}, {"id": "320a7bb4-cef4-4b24-b163-c19d971d4760", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/320a7bb4-cef4-4b24-b163-c19d971d4760"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/320a7bb4-cef4-4b24-b163-c19d971d4760"}]}, {"id": "987b2db7-1d21-4b59-831a-1e8ace40589b", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.613 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-bcab6508-4581-4088-8bbc-2bb6735762d3 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.614 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb37b4088f4e830c4c7174815b977c814c8dd3669b8fc5006cfda63dcd5ae1c1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.678 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 08 Oct 2025 15:20:36 GMT Keep-Alive: timeout=5, max=97 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-99817261-032b-435f-b058-43c43d8f94ec x-openstack-request-id: req-99817261-032b-435f-b058-43c43d8f94ec _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.678 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "987b2db7-1d21-4b59-831a-1e8ace40589b", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.678 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/987b2db7-1d21-4b59-831a-1e8ace40589b used request id req-99817261-032b-435f-b058-43c43d8f94ec request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.679 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'name': 'tempest-server-test-275740212', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7762962015674dfb9038a135559a61f3', 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'hostId': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>]
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.685 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for af660b82-9b3c-4c4d-820a-3d22b73898e5 / tap1f764678-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.685 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.688 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c4b45a9c-73a5-4b51-ab96-874507f4c028 / tapfc0ee6f6-75 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.689 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3431398-ff10-4caa-9520-65224275cb85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.680956', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '57486064-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '9e97eb95069acea5e8b9168b4b400d40aa012b4aebe1508aa2cc3682af999876'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.680956', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '5748e43a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '005ab01cb17a58165552455d1dc6107563ff3e932bdd181fd4185b92a0d84372'}]}, 'timestamp': '2025-10-08 15:20:36.689400', '_unique_id': 'fddeed58bc3d420a8e1e01b1e8a6733f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.697 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.700 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.727 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/cpu volume: 40150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.749 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/cpu volume: 7200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2bb3925-7925-4d7a-ace3-4837e0fd8852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40150000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'timestamp': '2025-10-08T15:20:36.700338', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '574ecbca-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.45017227, 'message_signature': '495b298974b2228ee76f61f8d7d0028a944438fbf4cb1744cce5fa0eb8abdf5c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7200000000, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'timestamp': '2025-10-08T15:20:36.700338', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '575220ea-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.472256407, 'message_signature': '888450df9f65d34fdc47c1e07273cf3bb17650e747dd05d98130bf433b560f15'}]}, 'timestamp': '2025-10-08 15:20:36.749954', '_unique_id': 'a9ffb654b5864ea8bf47ac82f9e7159d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.751 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.786 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.requests volume: 11521 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.786 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.818 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.818 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84c902f9-7791-4dcb-81de-439a5fe11eca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11521, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.752272', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5757c2a2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '37e50a3c1bac15225252872cc2637533f435b209556de59d164c348b64e746d8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.752272', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5757d472-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': 'a4e9f383f4fc0813c465c38d02c0c69e7be494cf5194bdc96ae5bf8bee18ff2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.752272', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '575c9e30-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '4fd1b868d31d3d86fb5d33d440e89b8a82052d4152f2853536a617e12c15b587'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.752272', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '575ca966-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': 'aaa01dbbdec82820611440a963a47da1a6be10eb72e79db8069a5c1d95ef9071'}]}, 'timestamp': '2025-10-08 15:20:36.818889', '_unique_id': '6aa0e458139f4b8b89dc32391054c0be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.820 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.821 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.821 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '623f0279-c171-4cda-a5ef-dd6d6133fee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.821007', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '575d088e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '658c6087c7bc9bb75a90704a99aba022ede746917dd864b61b7d19469017c049'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.821007', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '575d14a0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '156e4ffafeacb02fbfbcc73d92abf44008c77fc7eedc2703790eb2099c6dd579'}]}, 'timestamp': '2025-10-08 15:20:36.821664', '_unique_id': 'f62576e80e41442cbb234776ff5c0538'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.823 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.823 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.823 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '793bd2bc-e679-4924-a8e3-4d66db0f177a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.823191', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '575d5c3a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '7fd24d02bdf72cb7dbc7b75334ea83e7cfdbc7df54881186647e23795abdbf8f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.823191', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '575d6770-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '40609578dc057d318757171b7b6f9b6e044e248f39fa5212f4e04675d179852d'}]}, 'timestamp': '2025-10-08 15:20:36.823782', '_unique_id': '83d49c17431848ab96e0df619205fd40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.825 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.825 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7ba1ad-2198-49c9-bea2-77d69e72f7e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.825207', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '575daab4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '15a58d181af174664d4d1a0e31fc0981c7083720393225bf3ef5b2033a84aeb6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.825207', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '575db630-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': 'fc8d5a5964376d9d71ca0b5fb208df045c1b416b4fed8364b6479631ff7fe0f1'}]}, 'timestamp': '2025-10-08 15:20:36.825792', '_unique_id': '34527dbd3f9047bb9a6d6c9bee1c5d8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.826 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.829 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.830 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98a4e6d5-d94d-4c2d-a5a5-80d89cc6c35d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.828953', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '575e61fc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '6ff653ae55f373f017c351b7713d916421e2794c3f92a6dbac3b4c61658374d4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.828953', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '575e720a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': 'a5c9c82abcfe892f700fe640a5d8ade47f21596709f7b3a5dac194e23b958bd3'}]}, 'timestamp': '2025-10-08 15:20:36.830611', '_unique_id': 'd5cfd90974754ddfa5bf713e781f349d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.832 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.832 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09977232-aa12-4984-8904-afb545876e79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.832173', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '575ebaee-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '599f4d4eb1c1dfcda3f92980dec415aed78ebb4192da2b79f5522c882ec9de7f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.832173', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '575ec6ba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '2c6a9a281e2cc0c18a6dbc873e8e0f197074f65fe19a50a08b816164a0e3ec4b'}]}, 'timestamp': '2025-10-08 15:20:36.832776', '_unique_id': '1dee6c50ea274641b778037089bae681'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.837 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.837 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>]
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.850 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.usage volume: 152436736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.851 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.868 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.868 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afadffef-1c5a-4431-b312-1220ccfd934a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152436736, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.838033', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '57619034-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': '620b387c0242908ca54225dffd6c5806671b192420576e16fba9164cb44a37ce'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.838033', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '57619f48-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': 'ef922147132528899fddca49cfcdc5aea6629d1fad4ddf6310ca4f3251266360'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.838033', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '576447de-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': '5dd20fda25b7f2a5537161788907d4c73ab42b255721e6cdbb77713680d858a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.838033', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '576452f6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': '1edeb27d48b0e10fdc3eb39b1f338c058c3f422b8f9c0f3241dbb7ea7e0d02e3'}]}, 'timestamp': '2025-10-08 15:20:36.869126', '_unique_id': '5df9bf337e624199a6d6fd7fe02515d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.870 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.871 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.871 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.871 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84d0638a-38ea-44a1-8631-652f3dfc9b39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.870891', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5764a3dc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': 'b2d50e3ed6a4a359543b808921edeb03377d28b9533dcd9748ec2304a2a4838f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.870891', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5764ace2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': 'ca2f26fc5eaa28c73954bc6ec7dc11a2a6ee59fc16947bd7897814565625cd3d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.870891', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5764b516-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': '5bf914ebfcb24fd46d798e6316abc4dbba79cec2c7649589fc7a5b286234b6a6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.870891', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5764bd4a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': 'dfb58674919b3d239d8a5732b3186529cfdee6630445f0ce7303c2d036aca9b1'}]}, 'timestamp': '2025-10-08 15:20:36.871816', '_unique_id': '36c6dc38013c4f0684961fdd8a858f21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f86c8b92-9582-43ac-9bbd-1250ac448e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.873056', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '5764f774-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '6305c274c096df7920fa2e4a09dfd4707df6bcc22d31a8e218baf6cb19412998'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.873056', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '5764ffa8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '37fc1f67b0e53f9e3b9044c12907e79b792e3049d9df555cc32d4ccbe128ef9f'}]}, 'timestamp': '2025-10-08 15:20:36.873511', '_unique_id': 'bc7e4403f7eb4e3f96aaddac8def56c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.873 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.874 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.874 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8973f862-73a1-409c-a25e-5519590df67e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.874604', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '57653338-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': 'dec6cbbfd0927edda22cecf164b2f5f5069f4de92cc29966d7c1235c34032c02'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.874604', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '57653c98-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.561158703, 'message_signature': 'f124f9381c62b977676263ace3bd663253a1451d568831a82bb09c5b0b160992'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.874604', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5765459e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': 'ad2fc72c086ac734514273da0ada412e471cd7b7a608e7c75a661c6111fed227'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.874604', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '57654d14-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.574387506, 'message_signature': 'fc1d5d4ddbfc3f93f4da6f3dab3ca3329b603d2693a5f49d9eae497112290fed'}]}, 'timestamp': '2025-10-08 15:20:36.875485', '_unique_id': 'f60a9c95d5fd45f6ac667a4661946551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.875 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.876 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.876 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '406a73c3-dc69-4da8-a6db-9faeaf167bdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.876625', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '57658266-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': 'e57a9347d16bd1f0cb74fbefb0c6e84a53c1d81b0e9e6c06672021ed1e5cefa9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.876625', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '57658a90-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '8719606aed5b292f7af0613347687834bd1667ad4ddebaa030ee76d6ac48630a'}]}, 'timestamp': '2025-10-08 15:20:36.877069', '_unique_id': 'e7b593bd1ed44f3d8aa26513b5321816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.877 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.878 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.requests volume: 713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.878 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.878 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.878 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa5e19e4-5d00-4da3-882e-309ab0a35987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 713, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.878210', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5765bfba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '09e7a29e35dd64403a8d52d3ec59288c401c853b7d06828d1d8aa302f7572bbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.878210', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5765c744-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': 'a24271424dfdc8ca6c19defcfecd6500587e903541fc7188a614c2440744d9c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.878210', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5765ce7e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '2f43ec27649fbd5fdc11ff2c2a5423ba67b037234afbc4bceadba1db1fb05260'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.878210', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5765d5d6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '40a24bc345411e66cc8ab69b3354dd5079b8c068cd4a5cc11832e3b29efc7358'}]}, 'timestamp': '2025-10-08 15:20:36.879012', '_unique_id': 'aad14c3261b94f1284998abf20db0297'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.879 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/memory.usage volume: 225.71484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance c4b45a9c-73a5-4b51-ab96-874507f4c028: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c8daac7-0259-4f09-9cc0-c0a8fea1f8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 225.71484375, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'timestamp': '2025-10-08T15:20:36.880154', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '57660bb4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.45017227, 'message_signature': 'c994a6f1d61ca948cdd49881b09ea40f3634b370cf3e66bf52c077872630e53e'}]}, 'timestamp': '2025-10-08 15:20:36.880534', '_unique_id': '3ac43a301ad541c3a9290597a03d46ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.880 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.881 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.bytes volume: 135508992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.881 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b6820f6-5c0f-4445-8b8d-300c392fa2e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135508992, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.881631', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '57664656-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '47cfedc5b62ee828364f041cde486190c99a72a3f436f30861b5ad1b9c3a9143'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.881631', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '57664e08-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '14c5dc194ee8ae8b99f2247ee9728869aca7d720d8ee8ed53865aea9d11bb2b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.881631', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5766566e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': 'ffbf6d910699c7984fb1b7084dead11cd175a2ec7292ff41999248d2425dba2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.881631', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '57665db2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '7da4ca9bc7dbc3df8891b326b2178277d0044fced1253cf8ff2f7fce3b7cde29'}]}, 'timestamp': '2025-10-08 15:20:36.882460', '_unique_id': '5c2944268b3c42eea148588a8c3f7370'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.882 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.883 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.latency volume: 6431172391 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.883 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.latency volume: 124795616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.latency volume: 523899470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.latency volume: 583959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cf70504-97a5-414b-b7fa-781f9258d97b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6431172391, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.883629', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '57669336-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '0c66df2a48f820a3f3c82f164340579acbacbc5e3b9a24f1ed4cf7d27925bf73'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 124795616, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.883629', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '57669b4c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '4ad1d5671bd4aa4864a6e04cf4ced251e7ffa2961851da43d161948ae78a8215'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 523899470, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.883629', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5766a4e8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': 'de333141e0a8ded5d43b57f1769d66ac0557d778c4f03eef6422d0406c90a2ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 583959, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.883629', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5766ac68-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': 'c8cfc942ffe9a04e7d7f7a09e2ef8bd406325302844b9846498d4b3d66e948d4'}]}, 'timestamp': '2025-10-08 15:20:36.884479', '_unique_id': '3ba6a96b008a43e6be4b8a16562b81cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.884 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>]
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.885 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_idle_timeout_with_querier_enabled-2110154127>, <NovaLikeServer: tempest-server-test-275740212>]
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.886 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.886 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.incoming.bytes volume: 2294 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.886 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6853bf3-f058-4665-9127-111a56f811f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2294, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.886170', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '5766f682-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': '2dee60e2f1b6d6183b008bdc1e457c4a5e4f662d646e7a51d45ac3323e9ef883'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 110, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.886170', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '5766fed4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': '7a22fdccb036ac8b9e475a9574812227e14bafa7ec7ae96eb328707c64fc8849'}]}, 'timestamp': '2025-10-08 15:20:36.886597', '_unique_id': '47c5e741a7084d74b4931d06de34655e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/network.outgoing.bytes volume: 3722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.887 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a77bd1b9-709b-4ec5-ba79-ba82348ee2c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3722, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000005-af660b82-9b3c-4c4d-820a-3d22b73898e5-tap1f764678-f4', 'timestamp': '2025-10-08T15:20:36.887715', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'tap1f764678-f4', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:7e:98:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1f764678-f4'}, 'message_id': '57673444-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.404061335, 'message_signature': 'ba39ae8d4ee6f36ca866a41e0f1de97ba957004a0cff49637207bb23cd1eff7e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'instance-00000007-c4b45a9c-73a5-4b51-ab96-874507f4c028-tapfc0ee6f6-75', 'timestamp': '2025-10-08T15:20:36.887715', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'tapfc0ee6f6-75', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:49:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc0ee6f6-75'}, 'message_id': '57673c5a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.409068315, 'message_signature': 'dc6c651c382dbe8d987d51cff0075403f77b61aa563a703bb7e24b7941dd598e'}]}, 'timestamp': '2025-10-08 15:20:36.888192', '_unique_id': '804df7909eb1405ea2b42b9b0d7a81dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.888 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.889 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.latency volume: 14099088746 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.889 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.889 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.889 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ed4a9ef-ab11-4907-b8e7-7dde703fd5e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14099088746, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.889359', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '576772ec-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '02d03d1843bea3c309c7fe46e678320b8106c0356a16fd3845e924deaa7a5aec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.889359', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '57677a6c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': 'a0ddb8bee721978ec74f0986a06f6010b77365e2ca971b2fad7bcccc3998be03'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.889359', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '57678192-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '58ebcd045afe1b136757034e2a5f8bd1f9738f497225770f90e44c8f0284dce9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.889359', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '576788ae-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '1314030568c3fbceeac2c87b24bc60e6d0eab51e2abf188cebd91ac4acc72c39'}]}, 'timestamp': '2025-10-08 15:20:36.890133', '_unique_id': '1da6242ac6bd45db82327dbeac6c77e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.890 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.891 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.891 12 DEBUG ceilometer.compute.pollsters [-] af660b82-9b3c-4c4d-820a-3d22b73898e5/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.891 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.891 12 DEBUG ceilometer.compute.pollsters [-] c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4afcb2d-fa69-429a-bab3-d4392b0a480c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-vda', 'timestamp': '2025-10-08T15:20:36.891239', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5767bc5c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '35a32d536f21de2128c10d1ffdccc7c0943bceb5691784495289a4bfa0be1498'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5-sda', 'timestamp': '2025-10-08T15:20:36.891239', 'resource_metadata': {'display_name': 'tempest-test_idle_timeout_with_querier_enabled-2110154127', 'name': 'instance-00000005', 'instance_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5767c3aa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.475355016, 'message_signature': '0c4889a7992901197667259d9c0ec77fcb1167616befa9cc464484ad7dde87ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-vda', 'timestamp': '2025-10-08T15:20:36.891239', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5767cb20-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3740.510202612, 'message_signature': '05af2c301df4bf6cd780e2879ac8641d9518766fc63126b8e48adf765ef91a14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '71a7f2d2441447b2bbd1b677555d68cc', 'user_name': None, 'project_id': '7762962015674dfb9038a135559a61f3', 'project_name': None, 'resource_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028-sda', 'timestamp': '2025-10-08T15:20:36.891239', 'resource_metadata': {'display_name': 'tempest-server-test-275740212', 'name': 'instance-00000007', 'instance_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'instance_type': 'm1.nano', 'host': '8e907dbbee9444b9a5f5d00baf5e97662a85497c2f505cb688d46ae5', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '576788ae-a45a-11f0-9274-fa163ef67048'.replace if needed
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:20:36.892 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:20:37 np0005476733 nova_compute[192580]: 2025-10-08 15:20:37.930 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:37 np0005476733 nova_compute[192580]: 2025-10-08 15:20:37.932 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:37 np0005476733 nova_compute[192580]: 2025-10-08 15:20:37.932 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:38 np0005476733 nova_compute[192580]: 2025-10-08 15:20:38.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:39 np0005476733 podman[223091]: 2025-10-08 15:20:39.243279119 +0000 UTC m=+0.067498671 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.583 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.679 2 INFO nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Creating config drive at /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.config#033[00m
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.683 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6oa1grss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.809 2 DEBUG oslo_concurrency.processutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6oa1grss" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:39 np0005476733 kernel: tapf66c148b-4c: entered promiscuous mode
Oct  8 11:20:39 np0005476733 NetworkManager[51699]: <info>  [1759936839.8823] manager: (tapf66c148b-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:39Z|00102|binding|INFO|Claiming lport f66c148b-4cbb-4cdd-8196-6513d7c5ff78 for this chassis.
Oct  8 11:20:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:39Z|00103|binding|INFO|f66c148b-4cbb-4cdd-8196-6513d7c5ff78: Claiming fa:16:3e:77:9d:93 10.100.0.9
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.896 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:9d:93 10.100.0.9'], port_security=['fa:16:3e:77:9d:93 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f66c148b-4cbb-4cdd-8196-6513d7c5ff78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.897 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f66c148b-4cbb-4cdd-8196-6513d7c5ff78 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 bound to our chassis#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.899 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567#033[00m
Oct  8 11:20:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:39Z|00104|binding|INFO|Setting lport f66c148b-4cbb-4cdd-8196-6513d7c5ff78 ovn-installed in OVS
Oct  8 11:20:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:39Z|00105|binding|INFO|Setting lport f66c148b-4cbb-4cdd-8196-6513d7c5ff78 up in Southbound
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:39 np0005476733 systemd-udevd[223141]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:39 np0005476733 nova_compute[192580]: 2025-10-08 15:20:39.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.920 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c194e565-705e-471e-ab8c-48c446388bbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.921 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ec2e14e-51 in ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.922 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ec2e14e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.922 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[27bc3c46-c92a-416e-952a-c390733db08d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.924 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1c4af2-f80b-4605-8215-2a6c729d28d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 systemd-machined[152624]: New machine qemu-6-instance-00000009.
Oct  8 11:20:39 np0005476733 NetworkManager[51699]: <info>  [1759936839.9331] device (tapf66c148b-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:20:39 np0005476733 NetworkManager[51699]: <info>  [1759936839.9340] device (tapf66c148b-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.935 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3c9c73-b371-4b9c-b33a-461824ad133c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.961 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf94775-8819-4a8b-8f39-c965b839dc18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.985 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d92b08c1-3d37-423d-94e0-1770fd739fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:39 np0005476733 systemd-udevd[223145]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:39 np0005476733 NetworkManager[51699]: <info>  [1759936839.9914] manager: (tap3ec2e14e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Oct  8 11:20:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:39.990 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[58bfb7fa-46e6-43b0-8e64-b23db56cdde9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.018 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c2b373-4117-4a6d-adbb-80c94b26e93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.021 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[37f4bec7-5dc8-4e01-af43-5b31d7ade5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 NetworkManager[51699]: <info>  [1759936840.0520] device (tap3ec2e14e-50): carrier: link connected
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.057 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9b30d34a-bbd3-4bb2-b08e-8576153415a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.076 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e91dc3-b298-4304-8ac8-c115409a1d38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374371, 'reachable_time': 16032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223174, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.090 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8772c99c-8eb1-494e-9301-c9229fc38bab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:9d70'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374371, 'tstamp': 374371}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223175, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.105 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[48919402-1cb5-491f-948f-391956d3e8e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374371, 'reachable_time': 16032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223176, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.139 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1809d26a-cef7-40cb-a2f5-98f8025caf91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.202 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e67770f2-58ff-4e1c-be96-4d30a7a2042a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.204 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.204 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.205 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec2e14e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:40 np0005476733 NetworkManager[51699]: <info>  [1759936840.2077] manager: (tap3ec2e14e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Oct  8 11:20:40 np0005476733 kernel: tap3ec2e14e-50: entered promiscuous mode
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.212 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ec2e14e-50, col_values=(('external_ids', {'iface-id': '1e0c4d29-d963-4fdf-8ca6-0153967de16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:40Z|00106|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.214 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.218 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[886c3cad-273d-4d3c-8b08-ee1d120287f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.219 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:20:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:40.219 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'env', 'PROCESS_TAG=haproxy-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:40 np0005476733 podman[223215]: 2025-10-08 15:20:40.592272816 +0000 UTC m=+0.059686242 container create 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:20:40 np0005476733 systemd[1]: Started libpod-conmon-78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1.scope.
Oct  8 11:20:40 np0005476733 podman[223215]: 2025-10-08 15:20:40.557648477 +0000 UTC m=+0.025061953 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:20:40 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:20:40 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/146f4e32264e73a4b7fb2fff8a107e6e31ae9f295f81776878a0e8b0e8e2788f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.664 2 DEBUG nova.network.neutron [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updated VIF entry in instance network info cache for port f66c148b-4cbb-4cdd-8196-6513d7c5ff78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.665 2 DEBUG nova.network.neutron [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:40 np0005476733 podman[223215]: 2025-10-08 15:20:40.678692741 +0000 UTC m=+0.146106197 container init 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.682 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936840.6817296, a71ee5d2-21b8-4455-8870-f20bed682909 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.682 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] VM Started (Lifecycle Event)#033[00m
Oct  8 11:20:40 np0005476733 podman[223215]: 2025-10-08 15:20:40.684200348 +0000 UTC m=+0.151613774 container start 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.684 2 DEBUG oslo_concurrency.lockutils [req-95e46191-60cb-420e-9ad4-9d58db2a5784 req-158bff1d-42b3-47da-b533-dc25f74667ff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.700 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:40 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [NOTICE]   (223234) : New worker (223236) forked
Oct  8 11:20:40 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [NOTICE]   (223234) : Loading success.
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.705 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936840.6818078, a71ee5d2-21b8-4455-8870-f20bed682909 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.706 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.728 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.733 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.758 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.941 2 DEBUG nova.compute.manager [req-7b574952-5a45-4136-a34c-39c5cc9f92ea req-f514c419-4295-4714-a379-d15ffaf85c85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.941 2 DEBUG oslo_concurrency.lockutils [req-7b574952-5a45-4136-a34c-39c5cc9f92ea req-f514c419-4295-4714-a379-d15ffaf85c85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.942 2 DEBUG oslo_concurrency.lockutils [req-7b574952-5a45-4136-a34c-39c5cc9f92ea req-f514c419-4295-4714-a379-d15ffaf85c85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.943 2 DEBUG oslo_concurrency.lockutils [req-7b574952-5a45-4136-a34c-39c5cc9f92ea req-f514c419-4295-4714-a379-d15ffaf85c85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.943 2 DEBUG nova.compute.manager [req-7b574952-5a45-4136-a34c-39c5cc9f92ea req-f514c419-4295-4714-a379-d15ffaf85c85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Processing event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.945 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.949 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936840.949191, a71ee5d2-21b8-4455-8870-f20bed682909 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.950 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.953 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.957 2 INFO nova.virt.libvirt.driver [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Instance spawned successfully.#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.957 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.977 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.984 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.987 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.987 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.988 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.988 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.989 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:40 np0005476733 nova_compute[192580]: 2025-10-08 15:20:40.989 2 DEBUG nova.virt.libvirt.driver [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.015 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.046 2 INFO nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Took 7.67 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.047 2 DEBUG nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.113 2 INFO nova.compute.manager [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Took 8.60 seconds to build instance.#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.141 2 DEBUG oslo_concurrency.lockutils [None req-07f3fa1d-0427-4bf7-bddf-0de6b4791323 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:20:41 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.736 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.737 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.737 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:20:41 np0005476733 nova_compute[192580]: 2025-10-08 15:20:41.737 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid af660b82-9b3c-4c4d-820a-3d22b73898e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:42 np0005476733 podman[223245]: 2025-10-08 15:20:42.241974546 +0000 UTC m=+0.065017211 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.049 2 DEBUG nova.compute.manager [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.050 2 DEBUG oslo_concurrency.lockutils [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.050 2 DEBUG oslo_concurrency.lockutils [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.050 2 DEBUG oslo_concurrency.lockutils [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.050 2 DEBUG nova.compute.manager [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] No waiting events found dispatching network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.051 2 WARNING nova.compute.manager [req-145bbe9c-e3a4-47a6-aab0-6be05e9752b4 req-d5ebbb41-c06a-421f-9ed4-68c8876bb7d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received unexpected event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:43 np0005476733 nova_compute[192580]: 2025-10-08 15:20:43.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.442 2 INFO nova.compute.manager [None req-2ccad884-a207-4be1-9b12-9255e9549c94 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.451 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.719 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
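The `network_info` payload in the record above is plain JSON, so the addresses it carries can be pulled out with the stdlib. A sketch using a copy of that structure trimmed to the fields inspected (only keys present in the record are reproduced):

```python
import json

# Trimmed copy of the network_info structure from the log record above.
network_info = json.loads("""
[{"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9",
  "address": "fa:16:3e:7e:98:72",
  "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9",
    "bridge": "br-int",
    "subnets": [{"cidr": "10.100.0.0/28",
      "ips": [{"address": "10.100.0.8", "type": "fixed",
               "floating_ips": [{"address": "192.168.122.202",
                                 "type": "floating"}]}]}]},
  "devname": "tap1f764678-f4", "active": true}]
""")

vif = network_info[0]
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"]]
floating_ips = [fip["address"]
                for subnet in vif["network"]["subnets"]
                for ip in subnet["ips"]
                for fip in ip.get("floating_ips", [])]
```

This recovers the fixed IP 10.100.0.8 and its floating IP 192.168.122.202 exactly as logged.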
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.736 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-af660b82-9b3c-4c4d-820a-3d22b73898e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.736 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.737 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:44 np0005476733 nova_compute[192580]: 2025-10-08 15:20:44.737 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:20:45 np0005476733 nova_compute[192580]: 2025-10-08 15:20:45.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:20:46 np0005476733 nova_compute[192580]: 2025-10-08 15:20:46.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.415 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.416 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.417 2 INFO nova.compute.manager [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Rebooting instance#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.434 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.435 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquired lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:20:47 np0005476733 nova_compute[192580]: 2025-10-08 15:20:47.435 2 DEBUG nova.network.neutron [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.480 2 DEBUG nova.network.neutron [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.497 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Releasing lock "refresh_cache-c4b45a9c-73a5-4b51-ab96-874507f4c028" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.498 2 DEBUG nova.compute.manager [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 kernel: tapfc0ee6f6-75 (unregistering): left promiscuous mode
Oct  8 11:20:48 np0005476733 NetworkManager[51699]: <info>  [1759936848.6179] device (tapfc0ee6f6-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:48Z|00107|binding|INFO|Releasing lport fc0ee6f6-75ad-486c-8761-a75311199fcb from this chassis (sb_readonly=0)
Oct  8 11:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:48Z|00108|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb down in Southbound
Oct  8 11:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:48Z|00109|binding|INFO|Removing iface tapfc0ee6f6-75 ovn-installed in OVS
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.636 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.638 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec unbound from our chassis#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.641 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.642 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[61042254-95f0-40a9-a4ed-5d57c902b343]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.643 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace which is not needed anymore#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct  8 11:20:48 np0005476733 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 11.825s CPU time.
Oct  8 11:20:48 np0005476733 systemd-machined[152624]: Machine qemu-5-instance-00000007 terminated.
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [NOTICE]   (222970) : haproxy version is 2.8.14-c23fe91
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [NOTICE]   (222970) : path to executable is /usr/sbin/haproxy
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [WARNING]  (222970) : Exiting Master process...
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [WARNING]  (222970) : Exiting Master process...
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [ALERT]    (222970) : Current worker (222972) exited with code 143 (Terminated)
Oct  8 11:20:48 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[222966]: [WARNING]  (222970) : All workers exited. Exiting... (0)
Oct  8 11:20:48 np0005476733 systemd[1]: libpod-9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9.scope: Deactivated successfully.
Oct  8 11:20:48 np0005476733 podman[223291]: 2025-10-08 15:20:48.838479156 +0000 UTC m=+0.063785652 container died 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.855 2 INFO nova.virt.libvirt.driver [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance destroyed successfully.#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.856 2 DEBUG nova.objects.instance [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'resources' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9-userdata-shm.mount: Deactivated successfully.
Oct  8 11:20:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-c41eb8b4c2d83699168fa1690999f87d065cd8d0d3ba98003a80c587d8a6c90a-merged.mount: Deactivated successfully.
Oct  8 11:20:48 np0005476733 podman[223291]: 2025-10-08 15:20:48.875333196 +0000 UTC m=+0.100639662 container cleanup 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.877 2 DEBUG nova.virt.libvirt.vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:20:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:20:48Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.879 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.882 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.882 2 DEBUG os_vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:20:48 np0005476733 systemd[1]: libpod-conmon-9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9.scope: Deactivated successfully.
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc0ee6f6-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
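The `DelPortCommand(... if_exists=True)` in the transaction above deletes the tap port idempotently: the commit still succeeds if the port is already gone. A toy model of that semantic — a sketch only, not ovsdbapp's real code; `BridgeModel` is hypothetical:

```python
# Illustrative model of an idempotent port delete with if_exists=True,
# mirroring the DelPortCommand semantics in the record above.
class BridgeModel:
    def __init__(self, ports):
        self.ports = set(ports)

    def del_port(self, port, if_exists=True):
        if port not in self.ports:
            if if_exists:
                return False            # no-op; the transaction still succeeds
            raise KeyError(f"port {port} does not exist")
        self.ports.remove(port)
        return True

br_int = BridgeModel({"tapfc0ee6f6-75", "tap1f764678-f4"})
removed = br_int.del_port("tapfc0ee6f6-75")        # port present: removed
removed_again = br_int.del_port("tapfc0ee6f6-75")  # already gone: no error
```

The `if_exists` flag matters here because OVN and os-vif may race on tap-device teardown during a hard reboot, as the surrounding records show.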
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.901 2 INFO os_vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75')#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.921 2 DEBUG nova.virt.libvirt.driver [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start _get_guest_xml network_info=[{"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.927 2 WARNING nova.virt.libvirt.driver [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.933 2 DEBUG nova.virt.libvirt.host [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.934 2 DEBUG nova.virt.libvirt.host [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:20:48 np0005476733 podman[223335]: 2025-10-08 15:20:48.936252495 +0000 UTC m=+0.041763447 container remove 9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.938 2 DEBUG nova.virt.libvirt.host [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.939 2 DEBUG nova.virt.libvirt.host [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.940 2 DEBUG nova.virt.libvirt.driver [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.941 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eb497f45-7c87-4072-a537-d57f1ab67446]: (4, ('Wed Oct  8 03:20:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9)\n9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9\nWed Oct  8 03:20:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9)\n9655c2ee3d8a29dc99fbb152c82c13e2e6d6df630d88189e58e506879525cbe9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.941 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.942 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.942 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.942 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3f34712a-a3ff-48fe-a472-8b6eb4d0d327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.943 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.943 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.943 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.943 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.944 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.944 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.945 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.945 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.946 2 DEBUG nova.virt.hardware [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.946 2 DEBUG nova.objects.instance [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:48 np0005476733 kernel: tap6683d4f6-e0: left promiscuous mode
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.961 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1d891169-b509-4a71-be8b-77ad2ed69400]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:48 np0005476733 nova_compute[192580]: 2025-10-08 15:20:48.975 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.991 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9b853c79-fa5d-4eca-94c8-1daf7007d37b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:48.992 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff541725-7bba-405b-9a96-800f637232a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.008 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf91d23-ba2b-4e65-b4ab-369031e783eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373218, 'reachable_time': 25128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223351, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.011 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.011 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[93af3cea-db41-4fae-8efe-e09774709bd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 systemd[1]: run-netns-ovnmeta\x2d6683d4f6\x2de609\x2d48e8\x2dbf45\x2df31b3fa1d7ec.mount: Deactivated successfully.
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.041 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.043 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.043 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.044 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.045 2 DEBUG nova.virt.libvirt.vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:20:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_i
nput_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:20:48Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.045 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.046 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.047 2 DEBUG nova.objects.instance [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.075 2 DEBUG nova.virt.libvirt.driver [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <uuid>c4b45a9c-73a5-4b51-ab96-874507f4c028</uuid>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <name>instance-00000007</name>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-275740212</nova:name>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:20:48</nova:creationTime>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:user uuid="71a7f2d2441447b2bbd1b677555d68cc">tempest-NetworkBasicTest-1891752524-project-member</nova:user>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:project uuid="7762962015674dfb9038a135559a61f3">tempest-NetworkBasicTest-1891752524</nova:project>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        <nova:port uuid="fc0ee6f6-75ad-486c-8761-a75311199fcb">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="serial">c4b45a9c-73a5-4b51-ab96-874507f4c028</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="uuid">c4b45a9c-73a5-4b51-ab96-874507f4c028</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk.config"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:3f:49:16"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <target dev="tapfc0ee6f6-75"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/console.log" append="off"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <input type="keyboard" bus="usb"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:20:49 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:20:49 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:20:49 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:20:49 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.077 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.135 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.136 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.190 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.191 2 DEBUG nova.objects.instance [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.207 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.259 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.260 2 DEBUG nova.virt.disk.api [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Checking if we can resize image /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.261 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.319 2 DEBUG oslo_concurrency.processutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.320 2 DEBUG nova.virt.disk.api [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Cannot resize image /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.320 2 DEBUG nova.objects.instance [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'migration_context' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.335 2 DEBUG nova.virt.libvirt.vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:20:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio'
,image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:20:48Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.336 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.336 2 DEBUG nova.network.os_vif_util [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.337 2 DEBUG os_vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc0ee6f6-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc0ee6f6-75, col_values=(('external_ids', {'iface-id': 'fc0ee6f6-75ad-486c-8761-a75311199fcb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:49:16', 'vm-uuid': 'c4b45a9c-73a5-4b51-ab96-874507f4c028'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.3547] manager: (tapfc0ee6f6-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.360 2 INFO os_vif [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75')#033[00m
Oct  8 11:20:49 np0005476733 kernel: tapfc0ee6f6-75: entered promiscuous mode
Oct  8 11:20:49 np0005476733 systemd-udevd[223271]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.4403] manager: (tapfc0ee6f6-75): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Oct  8 11:20:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:49Z|00110|binding|INFO|Claiming lport fc0ee6f6-75ad-486c-8761-a75311199fcb for this chassis.
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:49Z|00111|binding|INFO|fc0ee6f6-75ad-486c-8761-a75311199fcb: Claiming fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.4609] device (tapfc0ee6f6-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.460 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.4615] device (tapfc0ee6f6-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:20:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:49Z|00112|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb ovn-installed in OVS
Oct  8 11:20:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:49Z|00113|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb up in Southbound
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.463 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec bound to our chassis#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.466 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec#033[00m
Oct  8 11:20:49 np0005476733 systemd-machined[152624]: New machine qemu-7-instance-00000007.
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.479 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f2670666-cfaa-4a29-b215-62609d7691cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.483 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6683d4f6-e1 in ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.484 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6683d4f6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.484 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[65f25098-d777-43f9-ad13-e30c8865d31c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.485 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e841c276-0a40-4d95-b94a-6a56dddd66be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 podman[223368]: 2025-10-08 15:20:49.492770968 +0000 UTC m=+0.094510007 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:20:49 np0005476733 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.500 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f376663b-d8e0-429d-8ad1-c98da2a51a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.525 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[abbc01c2-1e33-4444-9e97-bf20f4254d96]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.551 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f90c065d-26f6-4278-9e2c-9f72124b8381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.557 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0a410c0e-a6f3-4ba6-817d-494927d3b970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.5583] manager: (tap6683d4f6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.593 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a78874f5-477e-44cb-a526-085530fb9ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.596 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fd85bfff-604a-4566-a151-a24f534e4b4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.614 2 INFO nova.compute.manager [None req-7652bee7-9131-43b9-8acd-9d65f59173b8 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.621 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.6237] device (tap6683d4f6-e0): carrier: link connected
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.632 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccfe2e9-a56d-4c57-bc64-2e031d5328a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.652 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[21202c79-e55c-4138-935d-45ced6aa2df0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375328, 'reachable_time': 23059, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223432, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.672 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[48e62711-1f3e-4052-8ecd-a8c9943f538e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fdad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375328, 'tstamp': 375328}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223433, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.695 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfdf956-ae54-4b88-b0a8-2b05bb1d8a69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6683d4f6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fd:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375328, 'reachable_time': 23059, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223434, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.733 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bfbf3e-ea89-4c63-b2f8-5a32a472389e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.802 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[31386913-9a66-4e0a-ae41-858dbee05dec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.805 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.805 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.806 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6683d4f6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 NetworkManager[51699]: <info>  [1759936849.8085] manager: (tap6683d4f6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct  8 11:20:49 np0005476733 kernel: tap6683d4f6-e0: entered promiscuous mode
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.810 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6683d4f6-e0, col_values=(('external_ids', {'iface-id': '02c54a14-9b9f-4195-ba94-66a72c7333c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:20:49Z|00114|binding|INFO|Releasing lport 02c54a14-9b9f-4195-ba94-66a72c7333c9 from this chassis (sb_readonly=0)
Oct  8 11:20:49 np0005476733 nova_compute[192580]: 2025-10-08 15:20:49.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.824 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.825 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[896e60e1-d3a1-4bad-9260-0efb156b3d9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.826 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.pid.haproxy
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:49.827 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'env', 'PROCESS_TAG=haproxy-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6683d4f6-e609-48e8-bf45-f31b3fa1d7ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:20:50 np0005476733 podman[223466]: 2025-10-08 15:20:50.235188719 +0000 UTC m=+0.063882325 container create 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:20:50 np0005476733 systemd[1]: Started libpod-conmon-15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522.scope.
Oct  8 11:20:50 np0005476733 podman[223466]: 2025-10-08 15:20:50.206474851 +0000 UTC m=+0.035168487 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:20:50 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:20:50 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98314696b0d14ad55a224e2e56f6276310ad465d2cdc6b9bd1585d89c09abee4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:20:50 np0005476733 podman[223466]: 2025-10-08 15:20:50.327798724 +0000 UTC m=+0.156492340 container init 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 11:20:50 np0005476733 podman[223466]: 2025-10-08 15:20:50.334080675 +0000 UTC m=+0.162774291 container start 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:20:50 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [NOTICE]   (223492) : New worker (223494) forked
Oct  8 11:20:50 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [NOTICE]   (223492) : Loading success.
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.774 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Removed pending event for c4b45a9c-73a5-4b51-ab96-874507f4c028 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.774 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936850.77391, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.774 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.776 2 DEBUG nova.compute.manager [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.779 2 INFO nova.virt.libvirt.driver [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance rebooted successfully.#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.780 2 DEBUG nova.compute.manager [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.814 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.818 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.850 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.851 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936850.7759442, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.851 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Started (Lifecycle Event)#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.860 2 DEBUG oslo_concurrency.lockutils [None req-f1c74d5e-a4df-45c6-8123-1c916bc38b9f 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.869 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:20:50 np0005476733 nova_compute[192580]: 2025-10-08 15:20:50.872 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:20:52 np0005476733 podman[223504]: 2025-10-08 15:20:52.24884829 +0000 UTC m=+0.067813692 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, 
org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.742 2 DEBUG nova.compute.manager [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.743 2 DEBUG oslo_concurrency.lockutils [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.744 2 DEBUG oslo_concurrency.lockutils [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.744 2 DEBUG oslo_concurrency.lockutils [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.744 2 DEBUG nova.compute.manager [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:20:52 np0005476733 nova_compute[192580]: 2025-10-08 15:20:52.745 2 WARNING nova.compute.manager [req-7e3c90ad-ee06-4964-94d5-6deb4740c103 req-a6044450-e459-485a-a574-03c502cbf7db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-unplugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.#033[00m
Oct  8 11:20:53 np0005476733 podman[223524]: 2025-10-08 15:20:53.311566143 +0000 UTC m=+0.120377194 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct  8 11:20:53 np0005476733 nova_compute[192580]: 2025-10-08 15:20:53.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.770 2 INFO nova.compute.manager [None req-5770961f-c650-4dd1-a00a-3b63e747e075 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.777 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.838 2 DEBUG nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.839 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.839 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.840 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.840 2 DEBUG nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.840 2 WARNING nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.840 2 DEBUG nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.841 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.841 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.841 2 DEBUG oslo_concurrency.lockutils [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.842 2 DEBUG nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:20:54 np0005476733 nova_compute[192580]: 2025-10-08 15:20:54.842 2 WARNING nova.compute.manager [req-e46516e7-d6a6-4094-b897-be7a952aa185 req-debe2b82-d0b6-4a48-95a3-2ad8e970bfbe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.950 2 DEBUG nova.compute.manager [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.951 2 DEBUG oslo_concurrency.lockutils [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.951 2 DEBUG oslo_concurrency.lockutils [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.951 2 DEBUG oslo_concurrency.lockutils [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.952 2 DEBUG nova.compute.manager [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] No waiting events found dispatching network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:20:56 np0005476733 nova_compute[192580]: 2025-10-08 15:20:56.952 2 WARNING nova.compute.manager [req-1cd056b6-07b4-445e-9bd7-9c8999a57ce8 req-106fc815-495f-46f5-a835-aa25088ff871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received unexpected event network-vif-plugged-fc0ee6f6-75ad-486c-8761-a75311199fcb for instance with vm_state active and task_state None.
Oct  8 11:20:57 np0005476733 podman[223560]: 2025-10-08 15:20:57.294908126 +0000 UTC m=+0.106091287 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:20:57 np0005476733 nova_compute[192580]: 2025-10-08 15:20:57.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:20:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:57.678 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 11:20:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:20:57.681 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  8 11:20:58 np0005476733 nova_compute[192580]: 2025-10-08 15:20:58.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:20:59 np0005476733 nova_compute[192580]: 2025-10-08 15:20:59.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:00 np0005476733 nova_compute[192580]: 2025-10-08 15:21:00.070 2 INFO nova.compute.manager [None req-60244b53-9dad-49d1-b124-4a249bdc42ba f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output
Oct  8 11:21:00 np0005476733 nova_compute[192580]: 2025-10-08 15:21:00.077 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:21:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:00.684 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:21:02 np0005476733 podman[223585]: 2025-10-08 15:21:02.25719138 +0000 UTC m=+0.072270164 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:21:02 np0005476733 podman[223586]: 2025-10-08 15:21:02.281129387 +0000 UTC m=+0.098530315 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:21:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:02Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:49:16 10.100.0.26
Oct  8 11:21:03 np0005476733 nova_compute[192580]: 2025-10-08 15:21:03.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:03Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:9d:93 10.100.0.9
Oct  8 11:21:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:03Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:9d:93 10.100.0.9
Oct  8 11:21:04 np0005476733 nova_compute[192580]: 2025-10-08 15:21:04.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.232 2 INFO nova.compute.manager [None req-ab50ae55-ec64-49da-9e98-097da9c3bc24 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.237 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.414 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.414 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.436 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.534 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.535 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.543 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.544 2 INFO nova.compute.claims [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.835 2 DEBUG nova.compute.provider_tree [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.854 2 DEBUG nova.scheduler.client.report [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.876 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.877 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.897 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.898 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.905 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.967 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:21:05 np0005476733 nova_compute[192580]: 2025-10-08 15:21:05.969 2 DEBUG nova.network.neutron [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.012 2 INFO nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.018 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.019 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.032 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.033 2 INFO nova.compute.claims [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.062 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.206 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.208 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.209 2 INFO nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Creating image(s)#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.210 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.210 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.212 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.236 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.295 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.296 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.297 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.308 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.333 2 DEBUG nova.compute.provider_tree [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.353 2 DEBUG nova.scheduler.client.report [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.367 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.367 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.391 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.392 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.409 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk 10737418240" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.410 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.410 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.472 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.476 2 DEBUG nova.objects.instance [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'migration_context' on Instance uuid 64549fc7-989f-473a-99bb-78947d8d7536 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.499 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.500 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.510 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.511 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Ensure instance console log exists: /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.512 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.512 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.512 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.556 2 INFO nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.584 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.723 2 DEBUG nova.policy [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.728 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.730 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.730 2 INFO nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Creating image(s)
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.731 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.731 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.732 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.745 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.805 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.806 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.807 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.823 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.848 2 DEBUG nova.policy [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.885 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.885 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.942 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk 10737418240" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.943 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:06 np0005476733 nova_compute[192580]: 2025-10-08 15:21:06.943 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.003 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.006 2 DEBUG nova.objects.instance [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lazy-loading 'migration_context' on Instance uuid 341c177f-c391-41dd-bf3c-14c2076057eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.024 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.024 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Ensure instance console log exists: /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.024 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.025 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.025 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:07 np0005476733 nova_compute[192580]: 2025-10-08 15:21:07.946 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Successfully created port: 046dc8a5-fad3-4f9f-bd10-3894704fe7ed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.542 2 DEBUG nova.network.neutron [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Successfully updated port: 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.558 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.558 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquired lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.558 2 DEBUG nova.network.neutron [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.702 2 DEBUG nova.compute.manager [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-changed-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.703 2 DEBUG nova.compute.manager [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Refreshing instance network info cache due to event network-changed-4689d9d8-d635-4a1c-9495-cff4ea7e6a95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.703 2 DEBUG oslo_concurrency.lockutils [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.767 2 DEBUG nova.network.neutron [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  8 11:21:08 np0005476733 nova_compute[192580]: 2025-10-08 15:21:08.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:09 np0005476733 nova_compute[192580]: 2025-10-08 15:21:09.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.037 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.037 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.038 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.038 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.038 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.041 2 INFO nova.compute.manager [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Terminating instance
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.042 2 DEBUG nova.compute.manager [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  8 11:21:10 np0005476733 kernel: tapfc0ee6f6-75 (unregistering): left promiscuous mode
Oct  8 11:21:10 np0005476733 NetworkManager[51699]: <info>  [1759936870.0778] device (tapfc0ee6f6-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:21:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:10Z|00115|binding|INFO|Releasing lport fc0ee6f6-75ad-486c-8761-a75311199fcb from this chassis (sb_readonly=0)
Oct  8 11:21:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:10Z|00116|binding|INFO|Setting lport fc0ee6f6-75ad-486c-8761-a75311199fcb down in Southbound
Oct  8 11:21:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:10Z|00117|binding|INFO|Removing iface tapfc0ee6f6-75 ovn-installed in OVS
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.105 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:49:16 10.100.0.26'], port_security=['fa:16:3e:3f:49:16 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'c4b45a9c-73a5-4b51-ab96-874507f4c028', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7762962015674dfb9038a135559a61f3', 'neutron:revision_number': '10', 'neutron:security_group_ids': '47f96e90-866e-45e9-bccf-367f966b96ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.246'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bd9065d-dade-45b5-8223-a8753cff9447, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc0ee6f6-75ad-486c-8761-a75311199fcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.106 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc0ee6f6-75ad-486c-8761-a75311199fcb in datapath 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec unbound from our chassis
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.109 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.113 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[69d557e0-ac7d-4d5d-ab42-7949a4e5a872]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.114 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec namespace which is not needed anymore
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.132 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Successfully updated port: 046dc8a5-fad3-4f9f-bd10-3894704fe7ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  8 11:21:10 np0005476733 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct  8 11:21:10 np0005476733 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 13.283s CPU time.
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.150 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.151 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquired lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.151 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:21:10 np0005476733 systemd-machined[152624]: Machine qemu-7-instance-00000007 terminated.
Oct  8 11:21:10 np0005476733 podman[223652]: 2025-10-08 15:21:10.191014042 +0000 UTC m=+0.090667972 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:21:10 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [NOTICE]   (223492) : haproxy version is 2.8.14-c23fe91
Oct  8 11:21:10 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [NOTICE]   (223492) : path to executable is /usr/sbin/haproxy
Oct  8 11:21:10 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [WARNING]  (223492) : Exiting Master process...
Oct  8 11:21:10 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [ALERT]    (223492) : Current worker (223494) exited with code 143 (Terminated)
Oct  8 11:21:10 np0005476733 neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec[223481]: [WARNING]  (223492) : All workers exited. Exiting... (0)
Oct  8 11:21:10 np0005476733 systemd[1]: libpod-15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522.scope: Deactivated successfully.
Oct  8 11:21:10 np0005476733 podman[223695]: 2025-10-08 15:21:10.296998174 +0000 UTC m=+0.070144206 container died 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.310 2 INFO nova.virt.libvirt.driver [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Instance destroyed successfully.#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.311 2 DEBUG nova.objects.instance [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lazy-loading 'resources' on Instance uuid c4b45a9c-73a5-4b51-ab96-874507f4c028 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.328 2 DEBUG nova.virt.libvirt.vif [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:19:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-275740212',display_name='tempest-server-test-275740212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-275740212',id=7,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBInikYfls17wB1mwR3S9Ulgdg+DNmQrpoJDQvNS30i0mIRN2rBHXyw6+Ph5Eh6gBM3mwOmnBp3bKiolQD/a4fLXgU3ywHOPwgHPAGHPd9nWtpL3ZVtnCf+c+8SPVxqk1WQ==',key_name='tempest-keypair-test-226591404',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:20:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7762962015674dfb9038a135559a61f3',ramdisk_id='',reservation_id='r-lnlgqtkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkBasicTest-1891752524',owner_user_name='tempest-NetworkBasicTest-1891752524-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:20:50Z,user_data=None,user_id='71a7f2d2441447b2bbd1b677555d68cc',uuid=c4b45a9c-73a5-4b51-ab96-874507f4c028,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.329 2 DEBUG nova.network.os_vif_util [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converting VIF {"id": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "address": "fa:16:3e:3f:49:16", "network": {"id": "6683d4f6-e609-48e8-bf45-f31b3fa1d7ec", "bridge": "br-int", "label": "tempest-test-network--999097329", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7762962015674dfb9038a135559a61f3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc0ee6f6-75", "ovs_interfaceid": "fc0ee6f6-75ad-486c-8761-a75311199fcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.330 2 DEBUG nova.network.os_vif_util [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.330 2 DEBUG os_vif [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.331 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc0ee6f6-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.337 2 INFO os_vif [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:49:16,bridge_name='br-int',has_traffic_filtering=True,id=fc0ee6f6-75ad-486c-8761-a75311199fcb,network=Network(6683d4f6-e609-48e8-bf45-f31b3fa1d7ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc0ee6f6-75')#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.337 2 INFO nova.virt.libvirt.driver [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Deleting instance files /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028_del#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.338 2 INFO nova.virt.libvirt.driver [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Deletion of /var/lib/nova/instances/c4b45a9c-73a5-4b51-ab96-874507f4c028_del complete#033[00m
Oct  8 11:21:10 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522-userdata-shm.mount: Deactivated successfully.
Oct  8 11:21:10 np0005476733 systemd[1]: var-lib-containers-storage-overlay-98314696b0d14ad55a224e2e56f6276310ad465d2cdc6b9bd1585d89c09abee4-merged.mount: Deactivated successfully.
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.370 2 DEBUG nova.network.neutron [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updating instance_info_cache with network_info: [{"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:10 np0005476733 podman[223695]: 2025-10-08 15:21:10.375751655 +0000 UTC m=+0.148897677 container cleanup 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:21:10 np0005476733 systemd[1]: libpod-conmon-15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522.scope: Deactivated successfully.
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.411 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Releasing lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.412 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Instance network_info: |[{"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.412 2 DEBUG oslo_concurrency.lockutils [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.412 2 DEBUG nova.network.neutron [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Refreshing network info cache for port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.416 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Start _get_guest_xml network_info=[{"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.417 2 INFO nova.compute.manager [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.418 2 DEBUG oslo.service.loopingcall [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.418 2 DEBUG nova.compute.manager [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.418 2 DEBUG nova.network.neutron [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.424 2 WARNING nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.432 2 DEBUG nova.virt.libvirt.host [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.432 2 DEBUG nova.virt.libvirt.host [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.439 2 DEBUG nova.virt.libvirt.host [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.440 2 DEBUG nova.virt.libvirt.host [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.440 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.440 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.441 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.441 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.441 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.441 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.441 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.442 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.442 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.442 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.442 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.442 2 DEBUG nova.virt.hardware [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.445 2 DEBUG nova.virt.libvirt.vif [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-1532029749',display_name='tempest-multicast-server-vlan-transparent-1532029749',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-1532029749',id=10,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-7ly4s0go',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:21:06Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=64549fc7-989f-473a-99bb-78947d8d7536,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.445 2 DEBUG nova.network.os_vif_util [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.446 2 DEBUG nova.network.os_vif_util [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.447 2 DEBUG nova.objects.instance [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'pci_devices' on Instance uuid 64549fc7-989f-473a-99bb-78947d8d7536 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.451 2 INFO nova.compute.manager [None req-36aaa3ad-376c-4acc-a7ce-66fb158f5126 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.456 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.459 2 INFO nova.virt.libvirt.driver [None req-36aaa3ad-376c-4acc-a7ce-66fb158f5126 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Truncated console log returned, 2880 bytes ignored#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.464 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <uuid>64549fc7-989f-473a-99bb-78947d8d7536</uuid>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <name>instance-0000000a</name>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:name>tempest-multicast-server-vlan-transparent-1532029749</nova:name>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:21:10</nova:creationTime>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:user uuid="8f9ed00bd5cc488a9d2a77380f12a503">tempest-MulticastTestVlanTransparency-435229999-project-member</nova:user>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:project uuid="27fe52d14e2143a887b0445eb5cfca72">tempest-MulticastTestVlanTransparency-435229999</nova:project>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        <nova:port uuid="4689d9d8-d635-4a1c-9495-cff4ea7e6a95">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="serial">64549fc7-989f-473a-99bb-78947d8d7536</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="uuid">64549fc7-989f-473a-99bb-78947d8d7536</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.config"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:bc:4a:7f"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <target dev="tap4689d9d8-d6"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/console.log" append="off"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:21:10 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:21:10 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:21:10 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:21:10 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.464 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Preparing to wait for external event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.465 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.465 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.465 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.466 2 DEBUG nova.virt.libvirt.vif [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-1532029749',display_name='tempest-multicast-server-vlan-transparent-1532029749',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-1532029749',id=10,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-7ly4s0go',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:21:06Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=64549fc7-989f-473a-99bb-78947d8d7536,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.466 2 DEBUG nova.network.os_vif_util [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.467 2 DEBUG nova.network.os_vif_util [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.467 2 DEBUG os_vif [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.468 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.468 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4689d9d8-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.472 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4689d9d8-d6, col_values=(('external_ids', {'iface-id': '4689d9d8-d635-4a1c-9495-cff4ea7e6a95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:4a:7f', 'vm-uuid': '64549fc7-989f-473a-99bb-78947d8d7536'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 NetworkManager[51699]: <info>  [1759936870.4750] manager: (tap4689d9d8-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.479 2 INFO os_vif [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6')#033[00m
Oct  8 11:21:10 np0005476733 podman[223744]: 2025-10-08 15:21:10.507534683 +0000 UTC m=+0.103123851 container remove 15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.513 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0106d68e-99ff-4037-8937-28a0741a1e22]: (4, ('Wed Oct  8 03:21:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522)\n15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522\nWed Oct  8 03:21:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec (15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522)\n15c9d2805e4eb850aa22dadbb95f6ebb7d0d55c846d5b8a88862405fa792b522\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.516 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85666060-1be3-4a75-972f-de6e2dbb00e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.517 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6683d4f6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 kernel: tap6683d4f6-e0: left promiscuous mode
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.539 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bb08f899-80bb-456a-81f3-87a5d835e5b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.563 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.563 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.564 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No VIF found with MAC fa:16:3e:bc:4a:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.564 2 INFO nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Using config drive#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.567 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1b5d05-ca7e-43b0-9812-2287f8410a90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.568 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf3b5420-587b-44cb-8b9c-38785ab7b693]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.582 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94895cf5-98f1-461d-9092-e0fa6e2c866b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375321, 'reachable_time': 43073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223765, 'error': None, 'target': 'ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 systemd[1]: run-netns-ovnmeta\x2d6683d4f6\x2de609\x2d48e8\x2dbf45\x2df31b3fa1d7ec.mount: Deactivated successfully.
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.585 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6683d4f6-e609-48e8-bf45-f31b3fa1d7ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:21:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:10.586 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[b29d3378-f7e2-452d-9762-738f5993c5d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.649 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.860 2 DEBUG nova.compute.manager [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-changed-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.861 2 DEBUG nova.compute.manager [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Refreshing instance network info cache due to event network-changed-046dc8a5-fad3-4f9f-bd10-3894704fe7ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:21:10 np0005476733 nova_compute[192580]: 2025-10-08 15:21:10.861 2 DEBUG oslo_concurrency.lockutils [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.265 2 INFO nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Creating config drive at /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.config#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.271 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt6alcao7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.402 2 DEBUG oslo_concurrency.processutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt6alcao7" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:11 np0005476733 kernel: tap4689d9d8-d6: entered promiscuous mode
Oct  8 11:21:11 np0005476733 systemd-udevd[223667]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.4809] manager: (tap4689d9d8-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Oct  8 11:21:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:11Z|00118|binding|INFO|Claiming lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for this chassis.
Oct  8 11:21:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:11Z|00119|binding|INFO|4689d9d8-d635-4a1c-9495-cff4ea7e6a95: Claiming fa:16:3e:bc:4a:7f 10.100.0.14
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.4959] device (tap4689d9d8-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.4968] device (tap4689d9d8-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.498 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:4a:7f 10.100.0.14'], port_security=['fa:16:3e:bc:4a:7f 10.100.0.14 192.168.123.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4689d9d8-d635-4a1c-9495-cff4ea7e6a95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.500 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 in datapath ca646cb6-3329-453a-a072-04814e4638f0 bound to our chassis#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.503 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca646cb6-3329-453a-a072-04814e4638f0#033[00m
Oct  8 11:21:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:11Z|00120|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 ovn-installed in OVS
Oct  8 11:21:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:11Z|00121|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 up in Southbound
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.522 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[96848fa1-adde-4ba1-bd87-af7bdbaa7de8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.523 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca646cb6-31 in ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.526 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca646cb6-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.527 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1d729f-2dc8-4a46-8941-d29907b747da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.529 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[15fd2a58-d592-4358-8191-dc616b2439bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 systemd-machined[152624]: New machine qemu-8-instance-0000000a.
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.541 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9d67e1-7689-435c-add5-59125eba1a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.559 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1d33a5-22c9-4515-9e10-9ba0748408bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.597 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[da9e002e-da20-4fe8-af58-f0febf378906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.603 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3073f751-4d11-44f8-84e2-8fcf1b29ec33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.6098] manager: (tapca646cb6-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.651 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[36504f82-e51a-4de0-af2c-5b345a2863ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.655 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[957e4155-e7a5-42f0-a193-0b8c80313526]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.6869] device (tapca646cb6-30): carrier: link connected
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.692 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[aa831558-c43b-492b-a5ee-3008071408a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.714 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[34783486-3543-45dc-a277-3944138ab277]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223814, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.731 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[42df2d31-6892-4b66-96ad-7c73b58983cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:473d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377535, 'tstamp': 377535}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223815, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.762 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[05283120-db93-4523-a447-e73f9704f1e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223816, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.798 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[72442c8b-dc60-42f1-8862-447a1aa1cc8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.801 2 DEBUG nova.network.neutron [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.823 2 INFO nova.compute.manager [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Took 1.40 seconds to deallocate network for instance.#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.876 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.877 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.889 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[20323033-08d8-4d04-88ae-a3511543f721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.891 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.892 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.893 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca646cb6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:11 np0005476733 NetworkManager[51699]: <info>  [1759936871.8957] manager: (tapca646cb6-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Oct  8 11:21:11 np0005476733 kernel: tapca646cb6-30: entered promiscuous mode
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.899 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca646cb6-30, col_values=(('external_ids', {'iface-id': '01b0a658-f0ed-4cb7-aee4-981992c348f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:11Z|00122|binding|INFO|Releasing lport 01b0a658-f0ed-4cb7-aee4-981992c348f0 from this chassis (sb_readonly=0)
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.915 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca646cb6-3329-453a-a072-04814e4638f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca646cb6-3329-453a-a072-04814e4638f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:21:11 np0005476733 nova_compute[192580]: 2025-10-08 15:21:11.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.916 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4a68b429-048f-4469-bf14-8a8989b019d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.917 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-ca646cb6-3329-453a-a072-04814e4638f0
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/ca646cb6-3329-453a-a072-04814e4638f0.pid.haproxy
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID ca646cb6-3329-453a-a072-04814e4638f0
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:21:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:11.917 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'env', 'PROCESS_TAG=haproxy-ca646cb6-3329-453a-a072-04814e4638f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca646cb6-3329-453a-a072-04814e4638f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.065 2 DEBUG nova.compute.provider_tree [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.094 2 DEBUG nova.scheduler.client.report [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.118 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.147 2 INFO nova.scheduler.client.report [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Deleted allocations for instance c4b45a9c-73a5-4b51-ab96-874507f4c028#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.211 2 DEBUG oslo_concurrency.lockutils [None req-a0f9bb18-230d-459e-9c2f-98881223a7ed 71a7f2d2441447b2bbd1b677555d68cc 7762962015674dfb9038a135559a61f3 - - default default] Lock "c4b45a9c-73a5-4b51-ab96-874507f4c028" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.230 2 DEBUG nova.network.neutron [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updating instance_info_cache with network_info: [{"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.255 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Releasing lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.255 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Instance network_info: |[{"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.256 2 DEBUG oslo_concurrency.lockutils [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.256 2 DEBUG nova.network.neutron [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Refreshing network info cache for port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.259 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Start _get_guest_xml network_info=[{"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.263 2 WARNING nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.269 2 DEBUG nova.virt.libvirt.host [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.270 2 DEBUG nova.virt.libvirt.host [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.278 2 DEBUG nova.virt.libvirt.host [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.279 2 DEBUG nova.virt.libvirt.host [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.280 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.280 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.280 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.281 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.281 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.281 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.281 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.282 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.282 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.282 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.282 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.283 2 DEBUG nova.virt.hardware [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.287 2 DEBUG nova.virt.libvirt.vif [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-1467126576',display_name='tempest-broadcast-receiver-1467126576',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-1467126576',id=11,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHIGFmYxNtsO/KAD5DCJyEruKYGY8Jg3/mdP8DUSaU8q1j7RXeluCkcQClNdJmlHgOPM4zotnGNIaBo+klUL18feTKjHoE9KXkR0MO/pt0x3rWsYQBHe3V4p4r+dlzz5fw==',key_name='tempest-keypair-test-627190324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d027d9bf53149dd9246b01ebf09eb48',ramdisk_id='',reservation_id='r-82psguot',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestIPv4Common-1303208658',owner_user_name='tempest-BroadcastTestIPv4Common-1303208658-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:21:06Z,user_data=None,user_id='625a85fb4a424c84b99b84adcf899810',uuid=341c177f-c391-41dd-bf3c-14c2076057eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.287 2 DEBUG nova.network.os_vif_util [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converting VIF {"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.288 2 DEBUG nova.network.os_vif_util [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.289 2 DEBUG nova.objects.instance [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lazy-loading 'pci_devices' on Instance uuid 341c177f-c391-41dd-bf3c-14c2076057eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.305 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <uuid>341c177f-c391-41dd-bf3c-14c2076057eb</uuid>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <name>instance-0000000b</name>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:name>tempest-broadcast-receiver-1467126576</nova:name>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:21:12</nova:creationTime>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:user uuid="625a85fb4a424c84b99b84adcf899810">tempest-BroadcastTestIPv4Common-1303208658-project-member</nova:user>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:project uuid="0d027d9bf53149dd9246b01ebf09eb48">tempest-BroadcastTestIPv4Common-1303208658</nova:project>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        <nova:port uuid="046dc8a5-fad3-4f9f-bd10-3894704fe7ed">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="serial">341c177f-c391-41dd-bf3c-14c2076057eb</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="uuid">341c177f-c391-41dd-bf3c-14c2076057eb</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.config"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:0f:3d:28"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <target dev="tap046dc8a5-fa"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/console.log" append="off"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:21:12 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:21:12 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:21:12 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:21:12 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.306 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Preparing to wait for external event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.306 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.306 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.306 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.307 2 DEBUG nova.virt.libvirt.vif [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-1467126576',display_name='tempest-broadcast-receiver-1467126576',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-1467126576',id=11,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHIGFmYxNtsO/KAD5DCJyEruKYGY8Jg3/mdP8DUSaU8q1j7RXeluCkcQClNdJmlHgOPM4zotnGNIaBo+klUL18feTKjHoE9KXkR0MO/pt0x3rWsYQBHe3V4p4r+dlzz5fw==',key_name='tempest-keypair-test-627190324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d027d9bf53149dd9246b01ebf09eb48',ramdisk_id='',reservation_id='r-82psguot',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestIPv4Common-1303208658',owner_user_name='tempest-BroadcastTestIPv4Common-1303208658-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:21:06Z,user_data=None,user_id='625a85fb4a424c84b99b84adcf899810',uuid=341c177f-c391-41dd-bf3c-14c2076057eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:21:12 np0005476733 NetworkManager[51699]: <info>  [1759936872.3170] manager: (tap046dc8a5-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.307 2 DEBUG nova.network.os_vif_util [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converting VIF {"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.308 2 DEBUG nova.network.os_vif_util [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.308 2 DEBUG os_vif [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.313 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap046dc8a5-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.314 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap046dc8a5-fa, col_values=(('external_ids', {'iface-id': '046dc8a5-fad3-4f9f-bd10-3894704fe7ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:3d:28', 'vm-uuid': '341c177f-c391-41dd-bf3c-14c2076057eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.325 2 INFO os_vif [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa')#033[00m
Oct  8 11:21:12 np0005476733 podman[223855]: 2025-10-08 15:21:12.337924377 +0000 UTC m=+0.086238691 container create 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.371 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936872.3711982, 64549fc7-989f-473a-99bb-78947d8d7536 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.372 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] VM Started (Lifecycle Event)#033[00m
Oct  8 11:21:12 np0005476733 podman[223855]: 2025-10-08 15:21:12.288431963 +0000 UTC m=+0.036746307 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.379 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.379 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.379 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] No VIF found with MAC fa:16:3e:0f:3d:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.380 2 INFO nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Using config drive#033[00m
Oct  8 11:21:12 np0005476733 systemd[1]: Started libpod-conmon-5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5.scope.
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.393 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.411 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936872.3712704, 64549fc7-989f-473a-99bb-78947d8d7536 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.412 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:21:12 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.428 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:12 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbfdb0f8ecabbb4f562bf8bb53bfb2869b9a74e26f59e5d32823c632bceae699/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.431 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:21:12 np0005476733 podman[223855]: 2025-10-08 15:21:12.446148532 +0000 UTC m=+0.194462906 container init 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.452 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:21:12 np0005476733 podman[223855]: 2025-10-08 15:21:12.454466178 +0000 UTC m=+0.202780482 container start 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:21:12 np0005476733 podman[223869]: 2025-10-08 15:21:12.467601368 +0000 UTC m=+0.107920365 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.475 2 DEBUG nova.compute.manager [req-3d8a0218-77c7-4faa-b77c-494087935e60 req-687fef67-7a2f-4eea-82fd-151f8825d3d6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.476 2 DEBUG oslo_concurrency.lockutils [req-3d8a0218-77c7-4faa-b77c-494087935e60 req-687fef67-7a2f-4eea-82fd-151f8825d3d6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.476 2 DEBUG oslo_concurrency.lockutils [req-3d8a0218-77c7-4faa-b77c-494087935e60 req-687fef67-7a2f-4eea-82fd-151f8825d3d6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.476 2 DEBUG oslo_concurrency.lockutils [req-3d8a0218-77c7-4faa-b77c-494087935e60 req-687fef67-7a2f-4eea-82fd-151f8825d3d6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.476 2 DEBUG nova.compute.manager [req-3d8a0218-77c7-4faa-b77c-494087935e60 req-687fef67-7a2f-4eea-82fd-151f8825d3d6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Processing event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.477 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.485 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.485 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936872.485586, 64549fc7-989f-473a-99bb-78947d8d7536 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.486 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:21:12 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [NOTICE]   (223898) : New worker (223900) forked
Oct  8 11:21:12 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [NOTICE]   (223898) : Loading success.
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.497 2 INFO nova.virt.libvirt.driver [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Instance spawned successfully.#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.497 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.512 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.516 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.520 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.520 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.521 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.521 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.521 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.522 2 DEBUG nova.virt.libvirt.driver [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.552 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.586 2 INFO nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Took 6.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.586 2 DEBUG nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.649 2 INFO nova.compute.manager [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Took 7.15 seconds to build instance.#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.672 2 DEBUG oslo_concurrency.lockutils [None req-b6d94f5d-a003-4c9f-9ea6-f94ce3bb1f1b 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.791 2 DEBUG nova.network.neutron [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updated VIF entry in instance network info cache for port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.792 2 DEBUG nova.network.neutron [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updating instance_info_cache with network_info: [{"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.812 2 DEBUG oslo_concurrency.lockutils [req-819b05f0-13ed-44b0-9098-5b0b0f5238b9 req-129bae05-5ef0-4aa3-99d5-939984b50d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.965 2 DEBUG nova.compute.manager [req-70d4560d-3958-4310-b99c-936bc1babac3 req-f4188b5e-2bb4-47a6-8953-c128396ba13b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Received event network-vif-deleted-fc0ee6f6-75ad-486c-8761-a75311199fcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.991 2 INFO nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Creating config drive at /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.config#033[00m
Oct  8 11:21:12 np0005476733 nova_compute[192580]: 2025-10-08 15:21:12.995 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpioiz789d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.132 2 DEBUG oslo_concurrency.processutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpioiz789d" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:13 np0005476733 kernel: tap046dc8a5-fa: entered promiscuous mode
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.1918] manager: (tap046dc8a5-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Oct  8 11:21:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:13Z|00123|binding|INFO|Claiming lport 046dc8a5-fad3-4f9f-bd10-3894704fe7ed for this chassis.
Oct  8 11:21:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:13Z|00124|binding|INFO|046dc8a5-fad3-4f9f-bd10-3894704fe7ed: Claiming fa:16:3e:0f:3d:28 10.100.0.10
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.203 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:3d:28 10.100.0.10'], port_security=['fa:16:3e:0f:3d:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0738c3ab-dc94-44ab-bcd4-94a57812b815', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31be164c-7a5c-418b-b5ca-ef6f173770bf, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=046dc8a5-fad3-4f9f-bd10-3894704fe7ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.208 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed in datapath 7e5261bf-648d-4475-96cb-fe9ba80fd1d8 bound to our chassis#033[00m
Oct  8 11:21:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:13Z|00125|binding|INFO|Setting lport 046dc8a5-fad3-4f9f-bd10-3894704fe7ed ovn-installed in OVS
Oct  8 11:21:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:13Z|00126|binding|INFO|Setting lport 046dc8a5-fad3-4f9f-bd10-3894704fe7ed up in Southbound
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.219 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e5261bf-648d-4475-96cb-fe9ba80fd1d8#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.236 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8daea506-2c71-43b9-9fc4-d3174e627166]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.237 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e5261bf-61 in ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.239 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e5261bf-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.239 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c331b39a-847c-4369-94ff-111f88082f6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.242 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e99f85ff-1359-47c2-8a84-c4d28814618c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 systemd-machined[152624]: New machine qemu-9-instance-0000000b.
Oct  8 11:21:13 np0005476733 systemd[1]: Started Virtual Machine qemu-9-instance-0000000b.
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.256 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[99a43819-657c-4658-985a-1848718a986d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 systemd-udevd[223930]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.284 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db90f795-f104-42c3-97d1-35f4b3881fa9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.2967] device (tap046dc8a5-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.2975] device (tap046dc8a5-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.318 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[766e5b16-3330-4be5-812e-785ad599c8a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.3256] manager: (tap7e5261bf-60): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Oct  8 11:21:13 np0005476733 systemd-udevd[223934]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.325 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fc11dae5-3607-4336-8f90-65faedcdc19d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.366 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c2558101-c969-4546-ae32-3b4c63eb7c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.369 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9e837b07-2d92-4bf2-90d6-1908bce52c12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.3936] device (tap7e5261bf-60): carrier: link connected
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.399 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6878249d-9a07-44bb-8b4c-6e94cf610d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.417 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ef93dd47-81c9-43ed-89c7-1fdf11beb8b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e5261bf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:0a:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377705, 'reachable_time': 30590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223960, 'error': None, 'target': 'ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e32184-2fb4-4a7b-8899-fded7ed3580f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:aa4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377705, 'tstamp': 377705}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223961, 'error': None, 'target': 'ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.449 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a1b5d106-678e-4c8d-a31b-c88b758688e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e5261bf-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:0a:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377705, 'reachable_time': 30590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223962, 'error': None, 'target': 'ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.476 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[faa7368f-bc8b-425d-8821-c1501eb3b0f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.537 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[729673e3-3bd3-4ac5-9990-018237141fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.539 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e5261bf-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.539 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.540 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e5261bf-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 NetworkManager[51699]: <info>  [1759936873.5423] manager: (tap7e5261bf-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Oct  8 11:21:13 np0005476733 kernel: tap7e5261bf-60: entered promiscuous mode
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.545 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e5261bf-60, col_values=(('external_ids', {'iface-id': '5090be74-4172-482c-a3e2-d50108f61f90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:13Z|00127|binding|INFO|Releasing lport 5090be74-4172-482c-a3e2-d50108f61f90 from this chassis (sb_readonly=0)
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.584 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e5261bf-648d-4475-96cb-fe9ba80fd1d8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e5261bf-648d-4475-96cb-fe9ba80fd1d8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.586 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bb483ec6-f97b-4c8a-99b3-103e931b16c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.587 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-7e5261bf-648d-4475-96cb-fe9ba80fd1d8
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/7e5261bf-648d-4475-96cb-fe9ba80fd1d8.pid.haproxy
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 7e5261bf-648d-4475-96cb-fe9ba80fd1d8
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:21:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:13.589 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'env', 'PROCESS_TAG=haproxy-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e5261bf-648d-4475-96cb-fe9ba80fd1d8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.671 2 INFO nova.compute.manager [None req-5c92b71b-c9ff-410b-890f-1368bc578704 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:13 np0005476733 nova_compute[192580]: 2025-10-08 15:21:13.678 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:13 np0005476733 podman[223997]: 2025-10-08 15:21:13.992306088 +0000 UTC m=+0.072533782 container create b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:21:14 np0005476733 systemd[1]: Started libpod-conmon-b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c.scope.
Oct  8 11:21:14 np0005476733 podman[223997]: 2025-10-08 15:21:13.966046848 +0000 UTC m=+0.046274562 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:21:14 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:21:14 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d9ebc36ad5ff3c965b41e44c1fd6dd4d222bebaa51b09058129247db2c952b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:21:14 np0005476733 podman[223997]: 2025-10-08 15:21:14.106948637 +0000 UTC m=+0.187176381 container init b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 11:21:14 np0005476733 podman[223997]: 2025-10-08 15:21:14.11388817 +0000 UTC m=+0.194115904 container start b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 11:21:14 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [NOTICE]   (224018) : New worker (224020) forked
Oct  8 11:21:14 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [NOTICE]   (224018) : Loading success.
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.412 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936874.4125547, 341c177f-c391-41dd-bf3c-14c2076057eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.413 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] VM Started (Lifecycle Event)#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.438 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.442 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936874.41268, 341c177f-c391-41dd-bf3c-14c2076057eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.442 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.457 2 DEBUG nova.network.neutron [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updated VIF entry in instance network info cache for port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.457 2 DEBUG nova.network.neutron [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updating instance_info_cache with network_info: [{"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.479 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.480 2 DEBUG oslo_concurrency.lockutils [req-d58c623b-475c-4599-a9e2-455f272afe95 req-4b9d0917-b415-499e-a515-4cb1c3038d7c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.483 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.518 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.579 2 DEBUG nova.compute.manager [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.580 2 DEBUG oslo_concurrency.lockutils [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.580 2 DEBUG oslo_concurrency.lockutils [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.581 2 DEBUG oslo_concurrency.lockutils [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.582 2 DEBUG nova.compute.manager [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:21:14 np0005476733 nova_compute[192580]: 2025-10-08 15:21:14.582 2 WARNING nova.compute.manager [req-497defa1-6d1c-4a64-998d-ab72f714729d req-27a6a831-03c8-4f4b-83bf-3470f89bdfe6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received unexpected event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:21:15 np0005476733 nova_compute[192580]: 2025-10-08 15:21:15.814 2 INFO nova.compute.manager [None req-d149b798-427e-421f-bd74-154e4807af3e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Get console output#033[00m
Oct  8 11:21:15 np0005476733 nova_compute[192580]: 2025-10-08 15:21:15.820 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:15 np0005476733 nova_compute[192580]: 2025-10-08 15:21:15.823 2 INFO nova.virt.libvirt.driver [None req-d149b798-427e-421f-bd74-154e4807af3e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Truncated console log returned, 3244 bytes ignored#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.820 2 DEBUG nova.compute.manager [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-changed-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.821 2 DEBUG nova.compute.manager [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Refreshing instance network info cache due to event network-changed-f66c148b-4cbb-4cdd-8196-6513d7c5ff78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.822 2 DEBUG oslo_concurrency.lockutils [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.822 2 DEBUG oslo_concurrency.lockutils [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:21:17 np0005476733 nova_compute[192580]: 2025-10-08 15:21:17.822 2 DEBUG nova.network.neutron [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Refreshing network info cache for port f66c148b-4cbb-4cdd-8196-6513d7c5ff78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:21:18 np0005476733 nova_compute[192580]: 2025-10-08 15:21:18.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:19 np0005476733 nova_compute[192580]: 2025-10-08 15:21:19.621 2 INFO nova.compute.manager [None req-9b276310-419d-43e8-8a90-e97519500bb7 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:19 np0005476733 nova_compute[192580]: 2025-10-08 15:21:19.628 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:19 np0005476733 nova_compute[192580]: 2025-10-08 15:21:19.749 2 DEBUG nova.network.neutron [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updated VIF entry in instance network info cache for port f66c148b-4cbb-4cdd-8196-6513d7c5ff78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:21:19 np0005476733 nova_compute[192580]: 2025-10-08 15:21:19.750 2 DEBUG nova.network.neutron [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:19 np0005476733 nova_compute[192580]: 2025-10-08 15:21:19.792 2 DEBUG oslo_concurrency.lockutils [req-deb1ede3-e8c8-420d-83cb-622676d536e9 req-86927c2a-60fd-41e7-9f4f-c09286d4ffe9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:20 np0005476733 podman[224054]: 2025-10-08 15:21:20.214880681 +0000 UTC m=+0.047055908 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.259 2 DEBUG nova.compute.manager [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.259 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.259 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.260 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.260 2 DEBUG nova.compute.manager [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Processing event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.260 2 DEBUG nova.compute.manager [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.261 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.261 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.261 2 DEBUG oslo_concurrency.lockutils [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.261 2 DEBUG nova.compute.manager [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] No waiting events found dispatching network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.261 2 WARNING nova.compute.manager [req-3ca91f7b-cc0c-4449-834e-2f8cd23d55bc req-6cce65d3-6ee3-47d4-9f87-1fc5752734ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received unexpected event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.262 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.266 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936880.265746, 341c177f-c391-41dd-bf3c-14c2076057eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.266 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.267 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.270 2 INFO nova.virt.libvirt.driver [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Instance spawned successfully.#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.270 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.294 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.297 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.303 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.303 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.304 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.304 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.305 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.305 2 DEBUG nova.virt.libvirt.driver [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.332 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.375 2 INFO nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Took 13.65 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.376 2 DEBUG nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00128|binding|INFO|Releasing lport 5090be74-4172-482c-a3e2-d50108f61f90 from this chassis (sb_readonly=0)
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00129|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00130|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00131|binding|INFO|Releasing lport 01b0a658-f0ed-4cb7-aee4-981992c348f0 from this chassis (sb_readonly=0)
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00132|pinctrl|WARN|Dropped 5241 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:21:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:20Z|00133|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.484 2 INFO nova.compute.manager [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Took 14.49 seconds to build instance.#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:20 np0005476733 nova_compute[192580]: 2025-10-08 15:21:20.576 2 DEBUG oslo_concurrency.lockutils [None req-6d927f1d-cc04-4f8f-b3ad-1e9055f251c9 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:21 np0005476733 nova_compute[192580]: 2025-10-08 15:21:21.447 2 INFO nova.compute.manager [None req-1d3a5922-e9a2-4b8b-a45f-387404eff005 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:21 np0005476733 nova_compute[192580]: 2025-10-08 15:21:21.454 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:22 np0005476733 nova_compute[192580]: 2025-10-08 15:21:22.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:23 np0005476733 podman[224075]: 2025-10-08 15:21:23.248124383 +0000 UTC m=+0.074603408 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:21:23 np0005476733 nova_compute[192580]: 2025-10-08 15:21:23.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:24 np0005476733 podman[224093]: 2025-10-08 15:21:24.267040375 +0000 UTC m=+0.095417265 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 11:21:24 np0005476733 nova_compute[192580]: 2025-10-08 15:21:24.824 2 INFO nova.compute.manager [None req-2e18863d-c757-44eb-8679-ceb9098ba3c5 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:24 np0005476733 nova_compute[192580]: 2025-10-08 15:21:24.829 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:25 np0005476733 nova_compute[192580]: 2025-10-08 15:21:25.309 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759936870.307622, c4b45a9c-73a5-4b51-ab96-874507f4c028 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:21:25 np0005476733 nova_compute[192580]: 2025-10-08 15:21:25.310 2 INFO nova.compute.manager [-] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:21:25 np0005476733 nova_compute[192580]: 2025-10-08 15:21:25.374 2 DEBUG nova.compute.manager [None req-22d1d781-a294-407e-8bc5-968df07b62ed - - - - - -] [instance: c4b45a9c-73a5-4b51-ab96-874507f4c028] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:26.303 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:26.304 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:26.305 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:27 np0005476733 nova_compute[192580]: 2025-10-08 15:21:27.033 2 INFO nova.compute.manager [None req-7b9d416f-2ebd-48b6-b90c-51869880cfaf 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:27 np0005476733 nova_compute[192580]: 2025-10-08 15:21:27.037 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:27 np0005476733 nova_compute[192580]: 2025-10-08 15:21:27.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:28 np0005476733 podman[224119]: 2025-10-08 15:21:28.258916678 +0000 UTC m=+0.085236758 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct  8 11:21:28 np0005476733 nova_compute[192580]: 2025-10-08 15:21:28.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:30 np0005476733 nova_compute[192580]: 2025-10-08 15:21:30.103 2 INFO nova.compute.manager [None req-18270ad5-67ee-432a-a6a2-cccb17b95f2f 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:30 np0005476733 nova_compute[192580]: 2025-10-08 15:21:30.106 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:30Z|00134|binding|INFO|Releasing lport 5090be74-4172-482c-a3e2-d50108f61f90 from this chassis (sb_readonly=0)
Oct  8 11:21:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:30Z|00135|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:21:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:30Z|00136|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:21:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:30Z|00137|binding|INFO|Releasing lport 01b0a658-f0ed-4cb7-aee4-981992c348f0 from this chassis (sb_readonly=0)
Oct  8 11:21:30 np0005476733 nova_compute[192580]: 2025-10-08 15:21:30.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.713 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:32 np0005476733 podman[224182]: 2025-10-08 15:21:32.741737747 +0000 UTC m=+0.070406835 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:21:32 np0005476733 podman[224184]: 2025-10-08 15:21:32.769874468 +0000 UTC m=+0.098015819 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.787 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.788 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.842 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.850 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.909 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.910 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.937 2 INFO nova.compute.manager [None req-18a08c49-f62f-4fea-8719-299b1246c41a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.945 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.971 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:32 np0005476733 nova_compute[192580]: 2025-10-08 15:21:32.977 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.034 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.035 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.091 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.099 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.169 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.170 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.250 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.437 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.439 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11485MB free_disk=111.05718994140625GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.439 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.440 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.664 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af660b82-9b3c-4c4d-820a-3d22b73898e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.665 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a71ee5d2-21b8-4455-8870-f20bed682909 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.666 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 64549fc7-989f-473a-99bb-78947d8d7536 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.666 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 341c177f-c391-41dd-bf3c-14c2076057eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.666 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.667 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=4608MB phys_disk=119GB used_disk=40GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.816 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.853 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.909 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:21:33 np0005476733 nova_compute[192580]: 2025-10-08 15:21:33.909 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:21:35 np0005476733 nova_compute[192580]: 2025-10-08 15:21:35.453 2 INFO nova.compute.manager [None req-062ccf8f-057e-4508-9630-4bd2a9b0b4b5 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:35 np0005476733 nova_compute[192580]: 2025-10-08 15:21:35.458 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:37 np0005476733 nova_compute[192580]: 2025-10-08 15:21:37.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.179 2 INFO nova.compute.manager [None req-032353e0-3447-4e4d-aa48-1e6d7e66ff3e 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.185 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:38 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:38Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:4a:7f 10.100.0.14
Oct  8 11:21:38 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:38Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:4a:7f 10.100.0.14
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.910 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.910 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:38 np0005476733 nova_compute[192580]: 2025-10-08 15:21:38.911 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:40 np0005476733 nova_compute[192580]: 2025-10-08 15:21:40.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:40 np0005476733 nova_compute[192580]: 2025-10-08 15:21:40.662 2 INFO nova.compute.manager [None req-3dba2b15-47db-4890-bb00-33bb0d7aba33 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:40 np0005476733 nova_compute[192580]: 2025-10-08 15:21:40.667 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:41 np0005476733 podman[224258]: 2025-10-08 15:21:41.22013797 +0000 UTC m=+0.050993873 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 11:21:41 np0005476733 nova_compute[192580]: 2025-10-08 15:21:41.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:42 np0005476733 nova_compute[192580]: 2025-10-08 15:21:42.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:42 np0005476733 nova_compute[192580]: 2025-10-08 15:21:42.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:42 np0005476733 nova_compute[192580]: 2025-10-08 15:21:42.610 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:42 np0005476733 nova_compute[192580]: 2025-10-08 15:21:42.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:21:43 np0005476733 podman[224278]: 2025-10-08 15:21:43.217834649 +0000 UTC m=+0.051206790 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:21:43 np0005476733 nova_compute[192580]: 2025-10-08 15:21:43.458 2 INFO nova.compute.manager [None req-f8dadfc6-d08a-437f-aec2-70192ed72d22 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:43 np0005476733 nova_compute[192580]: 2025-10-08 15:21:43.464 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:43 np0005476733 nova_compute[192580]: 2025-10-08 15:21:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:43 np0005476733 nova_compute[192580]: 2025-10-08 15:21:43.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:21:43 np0005476733 nova_compute[192580]: 2025-10-08 15:21:43.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:44.064 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:21:44 np0005476733 nova_compute[192580]: 2025-10-08 15:21:44.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:44.067 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:21:44 np0005476733 nova_compute[192580]: 2025-10-08 15:21:44.087 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:21:44 np0005476733 nova_compute[192580]: 2025-10-08 15:21:44.087 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:21:44 np0005476733 nova_compute[192580]: 2025-10-08 15:21:44.088 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.533 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.548 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.549 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.843 2 INFO nova.compute.manager [None req-a294e7bd-07a4-41e9-9bc5-46e2948e555a 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.848 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:45 np0005476733 nova_compute[192580]: 2025-10-08 15:21:45.850 2 INFO nova.virt.libvirt.driver [None req-a294e7bd-07a4-41e9-9bc5-46e2948e555a 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Truncated console log returned, 3097 bytes ignored#033[00m
Oct  8 11:21:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:21:47.070 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:21:47 np0005476733 nova_compute[192580]: 2025-10-08 15:21:47.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:47 np0005476733 nova_compute[192580]: 2025-10-08 15:21:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:21:48 np0005476733 nova_compute[192580]: 2025-10-08 15:21:48.613 2 INFO nova.compute.manager [None req-e3c9c2f2-20e5-45af-931f-fe4df5f20384 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:21:48 np0005476733 nova_compute[192580]: 2025-10-08 15:21:48.619 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:48 np0005476733 nova_compute[192580]: 2025-10-08 15:21:48.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:21:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:50Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:3d:28 10.100.0.10
Oct  8 11:21:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:21:50Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:3d:28 10.100.0.10
Oct  8 11:21:50 np0005476733 systemd-logind[827]: New session 35 of user zuul.
Oct  8 11:21:50 np0005476733 systemd[1]: Started Session 35 of User zuul.
Oct  8 11:21:50 np0005476733 podman[224311]: 2025-10-08 15:21:50.507875527 +0000 UTC m=+0.087377398 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  8 11:21:50 np0005476733 systemd[1]: session-35.scope: Deactivated successfully.
Oct  8 11:21:50 np0005476733 systemd-logind[827]: Session 35 logged out. Waiting for processes to exit.
Oct  8 11:21:50 np0005476733 systemd-logind[827]: Removed session 35.
Oct  8 11:21:51 np0005476733 nova_compute[192580]: 2025-10-08 15:21:51.015 2 INFO nova.compute.manager [None req-bd338eb5-6750-4833-94da-a89211a48957 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Get console output#033[00m
Oct  8 11:21:51 np0005476733 nova_compute[192580]: 2025-10-08 15:21:51.024 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:21:51 np0005476733 nova_compute[192580]: 2025-10-08 15:21:51.030 2 INFO nova.virt.libvirt.driver [None req-bd338eb5-6750-4833-94da-a89211a48957 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Truncated console log returned, 3312 bytes ignored
Oct  8 11:21:52 np0005476733 nova_compute[192580]: 2025-10-08 15:21:52.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.753 2 INFO nova.compute.manager [None req-9ebba57d-8b91-4a18-8d90-d4c62dee1d90 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.975 2 DEBUG nova.compute.manager [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-changed-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.976 2 DEBUG nova.compute.manager [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Refreshing instance network info cache due to event network-changed-4689d9d8-d635-4a1c-9495-cff4ea7e6a95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.976 2 DEBUG oslo_concurrency.lockutils [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.976 2 DEBUG oslo_concurrency.lockutils [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:21:53 np0005476733 nova_compute[192580]: 2025-10-08 15:21:53.976 2 DEBUG nova.network.neutron [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Refreshing network info cache for port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 11:21:54 np0005476733 podman[224373]: 2025-10-08 15:21:54.24469991 +0000 UTC m=+0.072558503 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:21:55 np0005476733 podman[224393]: 2025-10-08 15:21:55.310939036 +0000 UTC m=+0.135180377 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:21:56 np0005476733 nova_compute[192580]: 2025-10-08 15:21:56.186 2 DEBUG nova.network.neutron [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updated VIF entry in instance network info cache for port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  8 11:21:56 np0005476733 nova_compute[192580]: 2025-10-08 15:21:56.186 2 DEBUG nova.network.neutron [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updating instance_info_cache with network_info: [{"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 11:21:56 np0005476733 nova_compute[192580]: 2025-10-08 15:21:56.220 2 DEBUG oslo_concurrency.lockutils [req-d5a05085-4556-412d-a54f-237a9f7731c9 req-1b2e0e53-48c7-4996-9df0-1d65a7362630 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-64549fc7-989f-473a-99bb-78947d8d7536" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:21:57 np0005476733 nova_compute[192580]: 2025-10-08 15:21:57.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:58 np0005476733 nova_compute[192580]: 2025-10-08 15:21:58.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:58 np0005476733 nova_compute[192580]: 2025-10-08 15:21:58.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:21:58 np0005476733 nova_compute[192580]: 2025-10-08 15:21:58.930 2 INFO nova.compute.manager [None req-a4d0a9f0-9e1a-4625-b383-1b774ac245b4 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output
Oct  8 11:21:58 np0005476733 nova_compute[192580]: 2025-10-08 15:21:58.938 2 INFO nova.virt.libvirt.driver [None req-a4d0a9f0-9e1a-4625-b383-1b774ac245b4 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Truncated console log returned, 2907 bytes ignored
Oct  8 11:21:59 np0005476733 podman[224420]: 2025-10-08 15:21:59.266310603 +0000 UTC m=+0.081359375 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc.)
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.030 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.031 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.135 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.294 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.295 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.305 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.306 2 INFO nova.compute.claims [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Claim successful on node compute-1.ctlplane.example.com
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.612 2 DEBUG nova.compute.provider_tree [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.634 2 DEBUG nova.scheduler.client.report [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.670 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.671 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.729 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.731 2 DEBUG nova.network.neutron [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.805 2 INFO nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.879 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.971 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.975 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.976 2 INFO nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Creating image(s)
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.978 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.979 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:22:01 np0005476733 nova_compute[192580]: 2025-10-08 15:22:01.980 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.009 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.102 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.103 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.104 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.120 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.185 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.186 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.227 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk 10737418240" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.229 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.230 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.286 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.288 2 DEBUG nova.objects.instance [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'migration_context' on Instance uuid 9992bf78-8d8e-43c7-a8cc-5606d8c910cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.313 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.313 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Ensure instance console log exists: /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.314 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.315 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.315 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.382 2 DEBUG nova.policy [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 11:22:02 np0005476733 nova_compute[192580]: 2025-10-08 15:22:02.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:22:03 np0005476733 podman[224484]: 2025-10-08 15:22:03.257027461 +0000 UTC m=+0.075480797 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:22:03 np0005476733 podman[224483]: 2025-10-08 15:22:03.265524843 +0000 UTC m=+0.087459950 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:22:03 np0005476733 nova_compute[192580]: 2025-10-08 15:22:03.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:04 np0005476733 nova_compute[192580]: 2025-10-08 15:22:04.229 2 INFO nova.compute.manager [None req-1112e870-5ec4-4e27-afae-6879ccbd642c 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Get console output#033[00m
Oct  8 11:22:04 np0005476733 nova_compute[192580]: 2025-10-08 15:22:04.235 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:04 np0005476733 nova_compute[192580]: 2025-10-08 15:22:04.239 2 INFO nova.virt.libvirt.driver [None req-1112e870-5ec4-4e27-afae-6879ccbd642c 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Truncated console log returned, 3107 bytes ignored#033[00m
Oct  8 11:22:05 np0005476733 nova_compute[192580]: 2025-10-08 15:22:05.222 2 DEBUG nova.network.neutron [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Successfully updated port: 5800d2b5-1c28-4be9-ba9d-7442de36269e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:22:05 np0005476733 nova_compute[192580]: 2025-10-08 15:22:05.249 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:05 np0005476733 nova_compute[192580]: 2025-10-08 15:22:05.250 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquired lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:05 np0005476733 nova_compute[192580]: 2025-10-08 15:22:05.251 2 DEBUG nova.network.neutron [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:22:05 np0005476733 nova_compute[192580]: 2025-10-08 15:22:05.512 2 DEBUG nova.network.neutron [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.515 2 DEBUG nova.compute.manager [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-changed-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.517 2 DEBUG nova.compute.manager [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Refreshing instance network info cache due to event network-changed-5800d2b5-1c28-4be9-ba9d-7442de36269e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.517 2 DEBUG oslo_concurrency.lockutils [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.737 2 DEBUG nova.network.neutron [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updating instance_info_cache with network_info: [{"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.769 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Releasing lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.770 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Instance network_info: |[{"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.771 2 DEBUG oslo_concurrency.lockutils [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.771 2 DEBUG nova.network.neutron [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Refreshing network info cache for port 5800d2b5-1c28-4be9-ba9d-7442de36269e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.775 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Start _get_guest_xml network_info=[{"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.780 2 WARNING nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.789 2 DEBUG nova.virt.libvirt.host [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.790 2 DEBUG nova.virt.libvirt.host [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.799 2 DEBUG nova.virt.libvirt.host [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.800 2 DEBUG nova.virt.libvirt.host [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.801 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.801 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.802 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.802 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.802 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.803 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.803 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.803 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.804 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.804 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.804 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.805 2 DEBUG nova.virt.hardware [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.810 2 DEBUG nova.virt.libvirt.vif [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-35634410',display_name='tempest-multicast-server-vlan-transparent-35634410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-35634410',id=15,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-r76c1a4l',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:01Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=9992bf78-8d8e-43c7-a8cc-5606d8c910cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.810 2 DEBUG nova.network.os_vif_util [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.812 2 DEBUG nova.network.os_vif_util [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.813 2 DEBUG nova.objects.instance [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9992bf78-8d8e-43c7-a8cc-5606d8c910cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.834 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <uuid>9992bf78-8d8e-43c7-a8cc-5606d8c910cf</uuid>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <name>instance-0000000f</name>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:name>tempest-multicast-server-vlan-transparent-35634410</nova:name>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:22:06</nova:creationTime>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:user uuid="8f9ed00bd5cc488a9d2a77380f12a503">tempest-MulticastTestVlanTransparency-435229999-project-member</nova:user>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:project uuid="27fe52d14e2143a887b0445eb5cfca72">tempest-MulticastTestVlanTransparency-435229999</nova:project>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        <nova:port uuid="5800d2b5-1c28-4be9-ba9d-7442de36269e">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="serial">9992bf78-8d8e-43c7-a8cc-5606d8c910cf</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="uuid">9992bf78-8d8e-43c7-a8cc-5606d8c910cf</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.config"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:2b:df:5b"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <target dev="tap5800d2b5-1c"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/console.log" append="off"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:22:06 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:22:06 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:22:06 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:22:06 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.840 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Preparing to wait for external event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.840 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.840 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.841 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.841 2 DEBUG nova.virt.libvirt.vif [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:21:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-35634410',display_name='tempest-multicast-server-vlan-transparent-35634410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-35634410',id=15,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-r76c1a4l',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:01Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=9992bf78-8d8e-43c7-a8cc-5606d8c910cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.842 2 DEBUG nova.network.os_vif_util [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.843 2 DEBUG nova.network.os_vif_util [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.843 2 DEBUG os_vif [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.844 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5800d2b5-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5800d2b5-1c, col_values=(('external_ids', {'iface-id': '5800d2b5-1c28-4be9-ba9d-7442de36269e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:df:5b', 'vm-uuid': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:06 np0005476733 NetworkManager[51699]: <info>  [1759936926.8511] manager: (tap5800d2b5-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.864 2 INFO os_vif [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c')#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.937 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.937 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.938 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] No VIF found with MAC fa:16:3e:2b:df:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:22:06 np0005476733 nova_compute[192580]: 2025-10-08 15:22:06.938 2 INFO nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Using config drive#033[00m
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.436 2 INFO nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Creating config drive at /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.config#033[00m
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.443 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1beb0dpf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.573 2 DEBUG oslo_concurrency.processutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1beb0dpf" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:07 np0005476733 kernel: tap5800d2b5-1c: entered promiscuous mode
Oct  8 11:22:07 np0005476733 NetworkManager[51699]: <info>  [1759936927.6517] manager: (tap5800d2b5-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Oct  8 11:22:07 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:07Z|00138|binding|INFO|Claiming lport 5800d2b5-1c28-4be9-ba9d-7442de36269e for this chassis.
Oct  8 11:22:07 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:07Z|00139|binding|INFO|5800d2b5-1c28-4be9-ba9d-7442de36269e: Claiming fa:16:3e:2b:df:5b 10.100.0.6
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.663 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:df:5b 10.100.0.6'], port_security=['fa:16:3e:2b:df:5b 10.100.0.6 192.168.123.12/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '2', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=5800d2b5-1c28-4be9-ba9d-7442de36269e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.665 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 5800d2b5-1c28-4be9-ba9d-7442de36269e in datapath ca646cb6-3329-453a-a072-04814e4638f0 bound to our chassis#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.667 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca646cb6-3329-453a-a072-04814e4638f0#033[00m
Oct  8 11:22:07 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:07Z|00140|binding|INFO|Setting lport 5800d2b5-1c28-4be9-ba9d-7442de36269e ovn-installed in OVS
Oct  8 11:22:07 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:07Z|00141|binding|INFO|Setting lport 5800d2b5-1c28-4be9-ba9d-7442de36269e up in Southbound
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.690 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7427d1e0-3ca6-41ff-9663-b7a3067f7910]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 systemd-machined[152624]: New machine qemu-10-instance-0000000f.
Oct  8 11:22:07 np0005476733 systemd[1]: Started Virtual Machine qemu-10-instance-0000000f.
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.729 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[165c34bb-adc1-44c2-bad2-3022bfbbfac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.732 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d86a53cd-7dca-4856-9d2e-035cea670c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 systemd-udevd[224552]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:22:07 np0005476733 NetworkManager[51699]: <info>  [1759936927.7661] device (tap5800d2b5-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:22:07 np0005476733 NetworkManager[51699]: <info>  [1759936927.7669] device (tap5800d2b5-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.774 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[40ec68e1-68a7-4c8a-b91e-832cca9e6bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.792 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[191d8196-814e-4918-9826-c4e3a639f347]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 5, 'rx_bytes': 1000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 5, 'rx_bytes': 1000, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224562, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.808 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac0a267-1b39-46c4-964d-315a2f785f8e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377549, 'tstamp': 377549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224564, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377554, 'tstamp': 377554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224564, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.810 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.813 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca646cb6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.813 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:07 np0005476733 nova_compute[192580]: 2025-10-08 15:22:07.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.813 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca646cb6-30, col_values=(('external_ids', {'iface-id': '01b0a658-f0ed-4cb7-aee4-981992c348f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:07.814 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.529 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936928.5283618, 9992bf78-8d8e-43c7-a8cc-5606d8c910cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.530 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] VM Started (Lifecycle Event)#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.570 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.577 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936928.5293624, 9992bf78-8d8e-43c7-a8cc-5606d8c910cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.577 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.601 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.606 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.631 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.672 2 DEBUG nova.compute.manager [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-changed-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.672 2 DEBUG nova.compute.manager [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Refreshing instance network info cache due to event network-changed-046dc8a5-fad3-4f9f-bd10-3894704fe7ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.673 2 DEBUG oslo_concurrency.lockutils [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.673 2 DEBUG oslo_concurrency.lockutils [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.674 2 DEBUG nova.network.neutron [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Refreshing network info cache for port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:22:08 np0005476733 nova_compute[192580]: 2025-10-08 15:22:08.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:09 np0005476733 nova_compute[192580]: 2025-10-08 15:22:09.153 2 DEBUG nova.network.neutron [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updated VIF entry in instance network info cache for port 5800d2b5-1c28-4be9-ba9d-7442de36269e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:22:09 np0005476733 nova_compute[192580]: 2025-10-08 15:22:09.154 2 DEBUG nova.network.neutron [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updating instance_info_cache with network_info: [{"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:09 np0005476733 nova_compute[192580]: 2025-10-08 15:22:09.171 2 DEBUG oslo_concurrency.lockutils [req-1bacb99a-90b4-4404-8c56-8f45c1753aba req-9b1602c1-5d98-422b-9a29-46108fd94b04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.610 2 DEBUG nova.network.neutron [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updated VIF entry in instance network info cache for port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.612 2 DEBUG nova.network.neutron [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updating instance_info_cache with network_info: [{"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.639 2 DEBUG oslo_concurrency.lockutils [req-4d90af99-6e8d-4cea-8243-54509161e657 req-173e44ac-bff7-46cf-9709-44774b0d824d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-341c177f-c391-41dd-bf3c-14c2076057eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.828 2 DEBUG nova.compute.manager [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.828 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.829 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.829 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.829 2 DEBUG nova.compute.manager [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Processing event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.830 2 DEBUG nova.compute.manager [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.830 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.830 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.830 2 DEBUG oslo_concurrency.lockutils [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.831 2 DEBUG nova.compute.manager [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] No waiting events found dispatching network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.831 2 WARNING nova.compute.manager [req-1deabba8-0ac8-4e46-8dec-b4ece8e3f09b req-311b5be0-7a3a-4a8f-8a25-405044ea790e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received unexpected event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.831 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.835 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936930.835166, 9992bf78-8d8e-43c7-a8cc-5606d8c910cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.835 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.837 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.847 2 INFO nova.virt.libvirt.driver [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Instance spawned successfully.#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.848 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.867 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.871 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.881 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.881 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.882 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.882 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.883 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.883 2 DEBUG nova.virt.libvirt.driver [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.918 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.965 2 INFO nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Took 8.99 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:22:10 np0005476733 nova_compute[192580]: 2025-10-08 15:22:10.965 2 DEBUG nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:11 np0005476733 nova_compute[192580]: 2025-10-08 15:22:11.052 2 INFO nova.compute.manager [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Took 9.79 seconds to build instance.#033[00m
Oct  8 11:22:11 np0005476733 nova_compute[192580]: 2025-10-08 15:22:11.077 2 DEBUG oslo_concurrency.lockutils [None req-8d33e1e7-5b55-4ffd-b398-6fc7e830423d 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:11 np0005476733 nova_compute[192580]: 2025-10-08 15:22:11.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:12 np0005476733 podman[224572]: 2025-10-08 15:22:12.268351221 +0000 UTC m=+0.082008815 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:22:13 np0005476733 nova_compute[192580]: 2025-10-08 15:22:13.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:14 np0005476733 nova_compute[192580]: 2025-10-08 15:22:14.257 2 INFO nova.compute.manager [None req-070060e6-d748-4bfd-81ba-19c5a3a6c0a8 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:14 np0005476733 nova_compute[192580]: 2025-10-08 15:22:14.263 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:14 np0005476733 podman[224592]: 2025-10-08 15:22:14.266948759 +0000 UTC m=+0.097107579 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:22:16 np0005476733 nova_compute[192580]: 2025-10-08 15:22:16.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:18 np0005476733 nova_compute[192580]: 2025-10-08 15:22:18.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:19 np0005476733 nova_compute[192580]: 2025-10-08 15:22:19.457 2 INFO nova.compute.manager [None req-e9c6fcba-cab6-4354-a1f9-0b008cc40113 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.431 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.432 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.471 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.617 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.620 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.630 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.630 2 INFO nova.compute.claims [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.884 2 DEBUG nova.compute.provider_tree [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.907 2 DEBUG nova.scheduler.client.report [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.932 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.933 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.976 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.977 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:22:20 np0005476733 nova_compute[192580]: 2025-10-08 15:22:20.995 2 INFO nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.023 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:22:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:21Z|00142|pinctrl|WARN|Dropped 2873 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 11:22:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:21Z|00143|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.135 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.136 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.138 2 INFO nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Creating image(s)#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.139 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.139 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.140 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.155 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:21 np0005476733 podman[224617]: 2025-10-08 15:22:21.232222712 +0000 UTC m=+0.059518506 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.238 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.240 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.241 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.253 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.276 2 DEBUG nova.policy [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.311 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.312 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.352 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk 10737418240" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.353 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.354 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.417 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.418 2 DEBUG nova.objects.instance [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'migration_context' on Instance uuid 08e4113f-f3be-424f-926e-62e20b3ad767 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.433 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.434 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Ensure instance console log exists: /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.435 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.435 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.436 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:21 np0005476733 nova_compute[192580]: 2025-10-08 15:22:21.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:22.144 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:22:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:22.149 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:22:22 np0005476733 nova_compute[192580]: 2025-10-08 15:22:22.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:22 np0005476733 nova_compute[192580]: 2025-10-08 15:22:22.261 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Successfully created port: 3e1bce81-bd3f-433a-aad4-1b90ad016699 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.319 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Successfully updated port: 3e1bce81-bd3f-433a-aad4-1b90ad016699 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.348 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.349 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquired lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.349 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.455 2 DEBUG nova.compute.manager [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-changed-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.455 2 DEBUG nova.compute.manager [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Refreshing instance network info cache due to event network-changed-3e1bce81-bd3f-433a-aad4-1b90ad016699. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.456 2 DEBUG oslo_concurrency.lockutils [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.523 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:22:23 np0005476733 nova_compute[192580]: 2025-10-08 15:22:23.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:24.152 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.573 2 DEBUG nova.network.neutron [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updating instance_info_cache with network_info: [{"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.608 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Releasing lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.608 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Instance network_info: |[{"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.609 2 DEBUG oslo_concurrency.lockutils [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.610 2 DEBUG nova.network.neutron [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Refreshing network info cache for port 3e1bce81-bd3f-433a-aad4-1b90ad016699 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.614 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Start _get_guest_xml network_info=[{"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.621 2 WARNING nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.631 2 DEBUG nova.virt.libvirt.host [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.632 2 DEBUG nova.virt.libvirt.host [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.636 2 DEBUG nova.virt.libvirt.host [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.636 2 DEBUG nova.virt.libvirt.host [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.637 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.637 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.638 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.638 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.638 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.638 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.638 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.639 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.639 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.639 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.639 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.639 2 DEBUG nova.virt.hardware [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.645 2 DEBUG nova.virt.libvirt.vif [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:22:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-185393400',display_name='tempest-test_flooding_when_special_groups-185393400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-185393400',id=18,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-xna3x6ut',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:21Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=08e4113f-f3be-424f-926e-62e20b3ad767,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.645 2 DEBUG nova.network.os_vif_util [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.646 2 DEBUG nova.network.os_vif_util [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.647 2 DEBUG nova.objects.instance [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 08e4113f-f3be-424f-926e-62e20b3ad767 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.665 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <uuid>08e4113f-f3be-424f-926e-62e20b3ad767</uuid>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <name>instance-00000012</name>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_flooding_when_special_groups-185393400</nova:name>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:22:24</nova:creationTime>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:user uuid="f03335a379bd4afdbbd7b9cc7cae27e0">tempest-MulticastTestIPv4Common-178854047-project-member</nova:user>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:project uuid="7f2acdb26a5a4269a4b1e407da7722c3">tempest-MulticastTestIPv4Common-178854047</nova:project>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        <nova:port uuid="3e1bce81-bd3f-433a-aad4-1b90ad016699">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="serial">08e4113f-f3be-424f-926e-62e20b3ad767</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="uuid">08e4113f-f3be-424f-926e-62e20b3ad767</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.config"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:ce:a3:46"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <target dev="tap3e1bce81-bd"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/console.log" append="off"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:22:24 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:22:24 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:22:24 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:22:24 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.667 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Preparing to wait for external event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.667 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.667 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.667 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.668 2 DEBUG nova.virt.libvirt.vif [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:22:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-185393400',display_name='tempest-test_flooding_when_special_groups-185393400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-185393400',id=18,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-xna3x6ut',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:21Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=08e4113f-f3be-424f-926e-62e20b3ad767,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.668 2 DEBUG nova.network.os_vif_util [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.669 2 DEBUG nova.network.os_vif_util [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.669 2 DEBUG os_vif [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.670 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.676 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e1bce81-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.677 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e1bce81-bd, col_values=(('external_ids', {'iface-id': '3e1bce81-bd3f-433a-aad4-1b90ad016699', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:a3:46', 'vm-uuid': '08e4113f-f3be-424f-926e-62e20b3ad767'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:24 np0005476733 NetworkManager[51699]: <info>  [1759936944.6815] manager: (tap3e1bce81-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.695 2 INFO os_vif [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd')#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.764 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.765 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.765 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No VIF found with MAC fa:16:3e:ce:a3:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:22:24 np0005476733 nova_compute[192580]: 2025-10-08 15:22:24.766 2 INFO nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Using config drive#033[00m
Oct  8 11:22:24 np0005476733 podman[224653]: 2025-10-08 15:22:24.800995516 +0000 UTC m=+0.070307132 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.188 2 INFO nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Creating config drive at /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.config#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.196 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl1ngglz5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.228 2 INFO nova.compute.manager [None req-26653608-20f3-453e-b8b7-f6aa1baf32c9 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.235 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.331 2 DEBUG oslo_concurrency.processutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl1ngglz5" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:25 np0005476733 kernel: tap3e1bce81-bd: entered promiscuous mode
Oct  8 11:22:25 np0005476733 NetworkManager[51699]: <info>  [1759936945.4168] manager: (tap3e1bce81-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:25Z|00144|binding|INFO|Claiming lport 3e1bce81-bd3f-433a-aad4-1b90ad016699 for this chassis.
Oct  8 11:22:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:25Z|00145|binding|INFO|3e1bce81-bd3f-433a-aad4-1b90ad016699: Claiming fa:16:3e:ce:a3:46 10.100.0.3
Oct  8 11:22:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:25Z|00146|binding|INFO|Setting lport 3e1bce81-bd3f-433a-aad4-1b90ad016699 ovn-installed in OVS
Oct  8 11:22:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:25Z|00147|binding|INFO|Setting lport 3e1bce81-bd3f-433a-aad4-1b90ad016699 up in Southbound
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.429 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a3:46 10.100.0.3'], port_security=['fa:16:3e:ce:a3:46 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3e1bce81-bd3f-433a-aad4-1b90ad016699) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.432 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3e1bce81-bd3f-433a-aad4-1b90ad016699 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 bound to our chassis#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.437 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:25 np0005476733 systemd-udevd[224698]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.464 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[38e90641-9a19-4da3-a036-ee08bdc4c71b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 NetworkManager[51699]: <info>  [1759936945.4791] device (tap3e1bce81-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:22:25 np0005476733 NetworkManager[51699]: <info>  [1759936945.4803] device (tap3e1bce81-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:22:25 np0005476733 systemd-machined[152624]: New machine qemu-11-instance-00000012.
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.504 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[af75a88f-0150-4e34-9e18-3f99066433ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.507 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c41377cd-0bca-4e49-b1dc-72dede8553f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 systemd[1]: Started Virtual Machine qemu-11-instance-00000012.
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.543 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9fcab4-73f6-45cd-879b-c3b03f294dea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.568 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3f0da4-dbb7-4d7e-b896-d82c7a1fea70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374371, 'reachable_time': 16032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224719, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.589 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8f7ee4-60e3-4ffa-a707-c32c66e14e6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3ec2e14e-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374383, 'tstamp': 374383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224726, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3ec2e14e-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374386, 'tstamp': 374386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224726, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.590 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:25 np0005476733 nova_compute[192580]: 2025-10-08 15:22:25.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.594 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec2e14e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.594 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.595 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ec2e14e-50, col_values=(('external_ids', {'iface-id': '1e0c4d29-d963-4fdf-8ca6-0153967de16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:25 np0005476733 podman[224683]: 2025-10-08 15:22:25.595541796 +0000 UTC m=+0.193512085 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:22:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:25.596 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.087 2 DEBUG nova.network.neutron [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updated VIF entry in instance network info cache for port 3e1bce81-bd3f-433a-aad4-1b90ad016699. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.089 2 DEBUG nova.network.neutron [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updating instance_info_cache with network_info: [{"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.123 2 DEBUG oslo_concurrency.lockutils [req-e5541a49-ef83-405e-9056-d75f9c4bd300 req-418ea329-5980-4bd0-afa8-98dfbb869b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:26.304 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:26.304 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:26.305 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.544 2 DEBUG nova.compute.manager [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.545 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.545 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.546 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.546 2 DEBUG nova.compute.manager [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Processing event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.546 2 DEBUG nova.compute.manager [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.546 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.547 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.547 2 DEBUG oslo_concurrency.lockutils [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.547 2 DEBUG nova.compute.manager [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] No waiting events found dispatching network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.548 2 WARNING nova.compute.manager [req-d81fe369-e9b4-4bc7-8a24-c59ac60b2171 req-d9d5527e-79c6-4c4b-9f56-f1cc65280410 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received unexpected event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.732 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936946.7318494, 08e4113f-f3be-424f-926e-62e20b3ad767 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.734 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] VM Started (Lifecycle Event)#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.737 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.742 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.747 2 INFO nova.virt.libvirt.driver [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Instance spawned successfully.#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.748 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.824 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.827 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.888 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.889 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.890 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.890 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.890 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.891 2 DEBUG nova.virt.libvirt.driver [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.901 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.901 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936946.7321334, 08e4113f-f3be-424f-926e-62e20b3ad767 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:26 np0005476733 nova_compute[192580]: 2025-10-08 15:22:26.901 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.104 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.113 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936946.741147, 08e4113f-f3be-424f-926e-62e20b3ad767 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.114 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.171 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.175 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.277 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.371 2 INFO nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Took 6.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.372 2 DEBUG nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.794 2 INFO nova.compute.manager [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Took 7.23 seconds to build instance.#033[00m
Oct  8 11:22:27 np0005476733 nova_compute[192580]: 2025-10-08 15:22:27.886 2 DEBUG oslo_concurrency.lockutils [None req-5ab5d037-c268-4eaa-80ce-428918cc6da1 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:28 np0005476733 nova_compute[192580]: 2025-10-08 15:22:28.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:29 np0005476733 nova_compute[192580]: 2025-10-08 15:22:29.035 2 INFO nova.compute.manager [None req-793eb6de-ee1e-4fd9-a784-83cb35f99012 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:29 np0005476733 nova_compute[192580]: 2025-10-08 15:22:29.041 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:29 np0005476733 nova_compute[192580]: 2025-10-08 15:22:29.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:30 np0005476733 podman[224752]: 2025-10-08 15:22:30.248653204 +0000 UTC m=+0.070418675 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct  8 11:22:30 np0005476733 nova_compute[192580]: 2025-10-08 15:22:30.367 2 INFO nova.compute.manager [None req-920d7d05-0884-4b42-9b2c-18954d7d8838 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:30 np0005476733 nova_compute[192580]: 2025-10-08 15:22:30.373 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.512 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.513 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.514 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.514 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.514 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.515 2 INFO nova.compute.manager [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Terminating instance#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.516 2 DEBUG nova.compute.manager [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:22:31 np0005476733 kernel: tap1f764678-f4 (unregistering): left promiscuous mode
Oct  8 11:22:31 np0005476733 NetworkManager[51699]: <info>  [1759936951.5446] device (tap1f764678-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:31Z|00148|binding|INFO|Releasing lport 1f764678-f4b9-420d-b072-8c0f7c3534a9 from this chassis (sb_readonly=0)
Oct  8 11:22:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:31Z|00149|binding|INFO|Setting lport 1f764678-f4b9-420d-b072-8c0f7c3534a9 down in Southbound
Oct  8 11:22:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:31Z|00150|binding|INFO|Removing iface tap1f764678-f4 ovn-installed in OVS
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.563 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:98:72 10.100.0.8'], port_security=['fa:16:3e:7e:98:72 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'af660b82-9b3c-4c4d-820a-3d22b73898e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1f764678-f4b9-420d-b072-8c0f7c3534a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.564 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1f764678-f4b9-420d-b072-8c0f7c3534a9 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.568 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.570 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86e13d81-7021-4b87-8744-24c89ea33143]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.571 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace which is not needed anymore#033[00m
Oct  8 11:22:31 np0005476733 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct  8 11:22:31 np0005476733 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 45.294s CPU time.
Oct  8 11:22:31 np0005476733 systemd-machined[152624]: Machine qemu-3-instance-00000005 terminated.
Oct  8 11:22:31 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [NOTICE]   (222299) : haproxy version is 2.8.14-c23fe91
Oct  8 11:22:31 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [NOTICE]   (222299) : path to executable is /usr/sbin/haproxy
Oct  8 11:22:31 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [WARNING]  (222299) : Exiting Master process...
Oct  8 11:22:31 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [ALERT]    (222299) : Current worker (222301) exited with code 143 (Terminated)
Oct  8 11:22:31 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[222295]: [WARNING]  (222299) : All workers exited. Exiting... (0)
Oct  8 11:22:31 np0005476733 systemd[1]: libpod-ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3.scope: Deactivated successfully.
Oct  8 11:22:31 np0005476733 podman[224796]: 2025-10-08 15:22:31.70729702 +0000 UTC m=+0.046339284 container died ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:22:31 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3-userdata-shm.mount: Deactivated successfully.
Oct  8 11:22:31 np0005476733 systemd[1]: var-lib-containers-storage-overlay-5b3c028b8f73810de4a748fc8846be92bbbadf66f8dafada46f9b4fa92d1d5f8-merged.mount: Deactivated successfully.
Oct  8 11:22:31 np0005476733 podman[224796]: 2025-10-08 15:22:31.743238691 +0000 UTC m=+0.082280955 container cleanup ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:22:31 np0005476733 systemd[1]: libpod-conmon-ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3.scope: Deactivated successfully.
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.789 2 INFO nova.virt.libvirt.driver [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Instance destroyed successfully.#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.789 2 DEBUG nova.objects.instance [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid af660b82-9b3c-4c4d-820a-3d22b73898e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.806 2 DEBUG nova.virt.libvirt.vif [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_idle_timeout_with_querier_enabled-2110154127',display_name='tempest-test_idle_timeout_with_querier_enabled-2110154127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-idle-timeout-with-querier-enabled-2110154127',id=5,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:19:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-cey4fd9p',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:19:42Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=af660b82-9b3c-4c4d-820a-3d22b73898e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.807 2 DEBUG nova.network.os_vif_util [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "address": "fa:16:3e:7e:98:72", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f764678-f4", "ovs_interfaceid": "1f764678-f4b9-420d-b072-8c0f7c3534a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.807 2 DEBUG nova.network.os_vif_util [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.808 2 DEBUG os_vif [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.810 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f764678-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.815 2 INFO os_vif [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:98:72,bridge_name='br-int',has_traffic_filtering=True,id=1f764678-f4b9-420d-b072-8c0f7c3534a9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f764678-f4')#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.816 2 INFO nova.virt.libvirt.driver [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Deleting instance files /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5_del#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.816 2 INFO nova.virt.libvirt.driver [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Deletion of /var/lib/nova/instances/af660b82-9b3c-4c4d-820a-3d22b73898e5_del complete#033[00m
Oct  8 11:22:31 np0005476733 podman[224831]: 2025-10-08 15:22:31.830437782 +0000 UTC m=+0.057896225 container remove ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.835 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1b24ce63-51f1-4ee5-affd-4d7b38a156e9]: (4, ('Wed Oct  8 03:22:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3)\nff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3\nWed Oct  8 03:22:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (ff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3)\nff196a76cd72a270391c59c9051dde202d1b332e60776a07d5673f6f347cc5c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.837 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7aacb8-c394-49ce-8908-49c36c54811a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.838 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 kernel: tap30cdfb1e-70: left promiscuous mode
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.858 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7cdcca7c-8300-4441-a2f6-bb4988059170]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.873 2 INFO nova.compute.manager [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.874 2 DEBUG oslo.service.loopingcall [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.874 2 DEBUG nova.compute.manager [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:22:31 np0005476733 nova_compute[192580]: 2025-10-08 15:22:31.874 2 DEBUG nova.network.neutron [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.890 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8228f6e5-82c8-49e6-8143-6977f8550c7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.891 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb2a831-2021-4d3c-bdf0-741143f445a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.908 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9dce84d3-d586-4f4c-a879-b9890bcb272a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368470, 'reachable_time': 31820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224856, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.911 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:22:31 np0005476733 systemd[1]: run-netns-ovnmeta\x2d30cdfb1e\x2d750a\x2d4d0e\x2d9e9c\x2d321b06b371b9.mount: Deactivated successfully.
Oct  8 11:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:31.911 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3cec9908-27e2-4dda-890c-46edf6d60fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.510 2 DEBUG nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-unplugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.511 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.511 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.512 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.512 2 DEBUG nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] No waiting events found dispatching network-vif-unplugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.513 2 DEBUG nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-unplugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.513 2 DEBUG nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.513 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.513 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.514 2 DEBUG oslo_concurrency.lockutils [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.514 2 DEBUG nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] No waiting events found dispatching network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.514 2 WARNING nova.compute.manager [req-5d0e38be-00ad-463d-9445-bd0447ed229c req-af2daf31-029e-47ea-ae38-060b0687cd6f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received unexpected event network-vif-plugged-1f764678-f4b9-420d-b072-8c0f7c3534a9 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:22:33 np0005476733 nova_compute[192580]: 2025-10-08 15:22:33.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:34 np0005476733 podman[224859]: 2025-10-08 15:22:34.233882797 +0000 UTC m=+0.056752927 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:22:34 np0005476733 podman[224858]: 2025-10-08 15:22:34.248500765 +0000 UTC m=+0.069199896 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.357 2 INFO nova.compute.manager [None req-728d8836-ed8d-4b6c-8c0f-f7fa438d7449 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.360 2 DEBUG nova.network.neutron [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.369 2 DEBUG nova.compute.manager [req-0dd9cc5f-bcbe-492a-9186-89b36a598d48 req-75cbdc16-8f3e-4c34-8b7b-4450703a5bc9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Received event network-vif-deleted-1f764678-f4b9-420d-b072-8c0f7c3534a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.369 2 INFO nova.compute.manager [req-0dd9cc5f-bcbe-492a-9186-89b36a598d48 req-75cbdc16-8f3e-4c34-8b7b-4450703a5bc9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Neutron deleted interface 1f764678-f4b9-420d-b072-8c0f7c3534a9; detaching it from the instance and deleting it from the info cache#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.369 2 DEBUG nova.network.neutron [req-0dd9cc5f-bcbe-492a-9186-89b36a598d48 req-75cbdc16-8f3e-4c34-8b7b-4450703a5bc9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.389 2 INFO nova.compute.manager [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Took 2.51 seconds to deallocate network for instance.#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.395 2 DEBUG nova.compute.manager [req-0dd9cc5f-bcbe-492a-9186-89b36a598d48 req-75cbdc16-8f3e-4c34-8b7b-4450703a5bc9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Detach interface failed, port_id=1f764678-f4b9-420d-b072-8c0f7c3534a9, reason: Instance af660b82-9b3c-4c4d-820a-3d22b73898e5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.444 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.444 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.604 2 DEBUG nova.compute.provider_tree [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.622 2 DEBUG nova.scheduler.client.report [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.648 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.651 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.680 2 INFO nova.scheduler.client.report [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance af660b82-9b3c-4c4d-820a-3d22b73898e5#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.753 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.775 2 DEBUG oslo_concurrency.lockutils [None req-be599f11-ae17-4c64-9538-0a7c93d240a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "af660b82-9b3c-4c4d-820a-3d22b73898e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.810 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.811 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.867 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.877 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.937 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:34 np0005476733 nova_compute[192580]: 2025-10-08 15:22:34.938 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.001 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.007 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.065 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.066 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.124 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.130 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.186 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.187 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.267 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.272 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.351 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.352 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.414 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.518 2 INFO nova.compute.manager [None req-3f1a8838-33fe-49e1-a946-4dfea416bc68 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.523 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.629 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.630 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=10596MB free_disk=110.90072250366211GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.727 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 341c177f-c391-41dd-bf3c-14c2076057eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.727 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 64549fc7-989f-473a-99bb-78947d8d7536 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.727 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 9992bf78-8d8e-43c7-a8cc-5606d8c910cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.728 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 08e4113f-f3be-424f-926e-62e20b3ad767 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.728 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a71ee5d2-21b8-4455-8870-f20bed682909 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.728 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.728 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=5632MB phys_disk=119GB used_disk=50GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.852 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.869 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:22:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:35Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:df:5b 10.100.0.6
Oct  8 11:22:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:35Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:df:5b 10.100.0.6
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.894 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:22:35 np0005476733 nova_compute[192580]: 2025-10-08 15:22:35.895 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.001 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'name': 'tempest-test_flooding_when_special_groups-185393400', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000012', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'hostId': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.006 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '64549fc7-989f-473a-99bb-78947d8d7536', 'name': 'tempest-multicast-server-vlan-transparent-1532029749', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '27fe52d14e2143a887b0445eb5cfca72', 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'hostId': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.008 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'name': 'tempest-multicast-server-vlan-transparent-35634410', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '27fe52d14e2143a887b0445eb5cfca72', 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'hostId': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.010 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'name': 'tempest-broadcast-receiver-1467126576', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0d027d9bf53149dd9246b01ebf09eb48', 'user_id': '625a85fb4a424c84b99b84adcf899810', 'hostId': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.012 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'name': 'tempest-test_flooding_when_special_groups-542526277', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000009', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'hostId': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.015 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 08e4113f-f3be-424f-926e-62e20b3ad767 / tap3e1bce81-bd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.015 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.021 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 64549fc7-989f-473a-99bb-78947d8d7536 / tap4689d9d8-d6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.021 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.024 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9992bf78-8d8e-43c7-a8cc-5606d8c910cf / tap5800d2b5-1c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.024 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.028 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 341c177f-c391-41dd-bf3c-14c2076057eb / tap046dc8a5-fa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.029 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.032 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a71ee5d2-21b8-4455-8870-f20bed682909 / tapf66c148b-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.032 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd085fd4-9244-4d41-94c5-4465665a03f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.012927', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e68c7c2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '54e5768feb642b7ff4f624169a224f2a7ce9bf699dcdc137c4115a0082652c3a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.012927', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e69a624-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '01f8c932bd591549acfc0715c54a4b30e9c7d2c770ceaf17740aef1aaaf4d281'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.012927', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': 
'9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e6a1e2e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '72ec8ef4eb2f549eacaa47c29fbaf6a0cc537fe05e99856a001335cb2346e474'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.012927', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e6abeba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': '0011de2d71728ce218cbce9329524afeba83f64b8960a79876cf45bf2c80229c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.012927', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e6b459c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 
'a31cdee024f2698291de9d0d97d952e0c944bc3b1a7e1c1ad473b99f6a1e475d'}]}, 'timestamp': '2025-10-08 15:22:36.033008', '_unique_id': 'adfebf05c1084b3980abbee9db6af517'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.034 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.036 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.047 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.048 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.059 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.usage volume: 152502272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.060 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.076 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.usage volume: 17170432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.077 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.090 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.usage volume: 152436736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.091 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.104 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.usage volume: 152305664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.105 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71c47b45-2be5-495e-93f7-1138b2e172f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e6da2d8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': '47d32f91920a61671bd02cc56e9b10fd9db656d28d3d7f269fb6aad56aebae8d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 
'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e6db296-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': 'b072b5f3e5a56c2cff758501872520131952452f986024a67fd25a6f6a8466ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152502272, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e6f6b9a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': 'e1c2007e5d3ccc09b1b3b0a31ba5fa21c74a9bc04b83aeade8d83e6aaf44a812'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e6f78ec-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': '6d058f23c506334b6637aa4e8f04394ca49a6116cdc7495cf676ae5c48340006'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 17170432, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e720b48-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': '6d801507014dc087c9b5142bf4440b3727862f28697244aa3172d90e404367f6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', '
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: -1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e7219da-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': '9a0e85ecb007ebbda451148fab6144fa2088b1ccfe14e283238b6d19696bf596'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152436736, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e742766-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': '6ea0c483843cfd8401e3d4a70ab8d2ae1ce474b3647c29f23b11918d9c60e269'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': 
None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e74392c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': '70482d104b1c1cb466f245aa1b0345767a99f25d59e4b0dfcdf1d345aadc3207'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152305664, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e763ad8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': 'b369472d483e4d63f49ee63fcd3479b93610210efb38eceff55f091d05f37b9d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.036628', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e76557c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': '00e73a48d5fc107b23792955d5d1d34a0145ed734c32e03fefab4b453781f7f5'}]}, 'timestamp': '2025-10-08 15:22:36.105445', '_unique_id': 
'4d3466a4c98b4a4fbdc7f7a9de52ad00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.108 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.108 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.109 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.109 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.109 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58c38fed-c9fa-4f9d-beee-20c5012f9ce1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.108363', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e76d6aa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '61e5a0a622ad1561f4222424fc682999ddc21e13f8f6933c4cf51cd3b2404688'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.108363', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e76e38e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '3f99424af50f30c98a6c958df12de354b5517486d21f6b5160fafe884b44bf78'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.108363', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 
'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e76f1e4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '5e389a7a535dd2e263a5e55c8f66905616f6ada6fc4ac389b4eeab45a624280b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.108363', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e770184-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': '470292dd29fcd0d9989fb9f65372776dea2221a36e079b3c15fa3e89ab0a0e7c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.108363', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e77117e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 'fc17be0198c23c42373b5e90f8b4172f654e68a48de60948dc79f74147ff232b'}]}, 'timestamp': '2025-10-08 15:22:36.110252', 
'_unique_id': '6481a56fcb72474392c5e04c7e96c57f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.132 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.latency volume: 2848234541 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.133 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.latency volume: 918478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.147 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.latency volume: 14489038081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.147 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.latency volume: 64906838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.167 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.latency volume: 5538216118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.168 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.latency volume: 77132865 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.183 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.latency volume: 14615290035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.184 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.latency volume: 220291200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.202 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.latency volume: 9001730759 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.202 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.latency volume: 89239411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9e3331f-cf09-4ee9-8337-5372e8b55053', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2848234541, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e7a8ca0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '046b104a4086c7eb3b340406b133cd5a5c21329d4ffd007837b72ef2d7a781f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 918478, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e7a99a2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '3d6fd8a2a2116db7ffe53c3ec79f6fd2d56f9914ad4f291b3a0f6557aa29eb92'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14489038081, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 
'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e7cd096-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '568d0158bbae746bd006930e199a2887953698260fb3025785c0c09bdebbb87c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64906838, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e7cdd70-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '77de4cb0fb305653025f2a5e22c2cebc668ec1dd1e38e4d4845d32a97eafe7ff'}, {'source': 'openstack', 
'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5538216118, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e7fec68-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '21db4ce574e91f765c9e5addff8cf9252448a3cb2b69a11df22000efd5d19678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 77132865, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ra
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e7ffa50-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '8df5adf6d6f22baf289fb6c839db088a46fe32475e16a265c2c00b70f4e33b53'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14615290035, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e8253cc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '937efd44095cfedf59ea6286129db5b43f0933fddfc76b4749548335a0720d3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 
'cumulative', 'counter_unit': 'ns', 'counter_volume': 220291200, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e825f84-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': 'c82301ff1e6663902820ed42b9ec3f5183dc4d831a9e421df455840759478c49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9001730759, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e852bba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': '8d438658c1a0c3db0ad1d874f6e1839e3fa150458f00907935483d8db5b83a94'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89239411, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.112482', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e853a2e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 
3859.907228714, 'message_signature': '1ac01b2f76a2158483b52e36ab5ed17f5e444724558633cc12e65180a28a7bb2'}]}, 'timestamp': '2025-10-08 15:22:36.203013', '_unique_id': '8515b47587ab41d38a104ec20e149c98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.205 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.205 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.206 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.206 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.206 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a51d8969-6924-4a65-80ed-36814d1f1d6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.205453', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e85a6d0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '1145dbe9d815686fac2dd4885f89a8110930fd4e53f3b4c81f9759bc2946b854'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.205453', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e85b2d8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': 'd0a591fff60e11972be966cff0902b5fba218fd56a0cb65183facfe1ecc8791a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.205453', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': 
'9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e85bf30-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': 'd58c7deb0c4670360a1316e4312590664e8373cebf00e3e23e39889eb3b79e60'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.205453', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e85cab6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': 'a732f6826577f381462252478402f7e82fd84c9b42ef9c4243be321103e5634c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.205453', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e85de8e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 
'fbbae39524b98ce37509b956ff6e48d8dd3c14623a2799e1f81c57c7f1bee3c6'}]}, 'timestamp': '2025-10-08 15:22:36.207242', '_unique_id': 'e4bb34a6e40b498a948ffdc22ee56f2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.209 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.209 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.209 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.bytes volume: 136160768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.210 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.210 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.bytes volume: 16453632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.210 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.210 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.bytes volume: 135687168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.211 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.211 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.bytes volume: 135795200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.211 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad586644-8d40-49a4-92cd-f79269990620', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e863bf4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': 'ebe04222b1338d83cf9c9ec86c3d27ebd1f7aab634c7eab1e0a871346f1b36fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e864608-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '9d812569ab690a3a1df0874763a0474b0a7cdcfe107efededbc6778948f13731'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136160768, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e86503a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '23dfdb91119841fe7052e378428d71165816526b781462343976a5f25f4c6476'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e865be8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '079ae3c8985009c5357263a15ac03c51094820b5bbc5f68c9c271df6d5ba2ba9'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16453632, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e866692-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': 'b59872a1c4dd6d210d68abb8f0bdf790e3f7a0fb164b9944fcdba98e87caff8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'sw
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 1-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e867038-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': 'fe84a554c33f43a36d2b9aaa89a100b22c960ab16df22669b9237c7468f00259'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135687168, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e8677cc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': 'e09da0f9fb59e1c59fc272de7041f07cbc263b9023f0a16fd4862ebb4c9c2882'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e8681d6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '96f13c2daf6ff406770db353d4f69496002ba86a764c6b0f72af169b845daec2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135795200, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e868c3a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': 'ea36a972f165a7124f9f1f9fc068d9f3eb2c3c0c1a6f4a39bc8212f5ee98a9ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.209294', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e86969e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': 
'eb8bfc7d11a6fb0425d7bebfb2eafb9d5d962ba92b00a92074404dcfd4532e17'}]}, 'timestamp': '2025-10-08 15:22:36.211911', '_unique_id': 'da015164b5e946b481b4299eb08fc91c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.213 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.214 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.outgoing.packets volume: 190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.214 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.outgoing.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.214 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.outgoing.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.215 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.outgoing.packets volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4746ea90-3da5-43d9-8310-02e9ff1573fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.213848', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e86ee46-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': 'fd9bef2e0346d236a5753aa9bb8918234c95f603485248cfeedb82c22a5b7fce'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 190, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.213848', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e86fb0c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '70f3cc089d646ebb5e6846a35d8b2cec5cea048378b22059c39c54d27bd9fa2f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.213848', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 
'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e8706c4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': 'd64e55d8267278d4e739492e177631a5a4b11316120bfb8a336e8206df984e49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.213848', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e87124a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': '75ef07ffd2d122222a7b76d53f0e0bba58d7c6df92097452d1dfd2052680fd8e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 48, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.213848', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e871e8e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': '69187e81db3cc3cf529c0dba6436d87ee46d29d9f5ebfb4114e57ae99657a8e5'}]}, 'timestamp': 
'2025-10-08 15:22:36.215406', '_unique_id': 'ab712a2f74d84fff86f9396967276447'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.217 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.217 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.217 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.219 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.219 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.220 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.requests volume: 735 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.220 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.220 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.requests volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.220 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.221 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.requests volume: 714 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.221 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.221 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.requests volume: 733 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.221 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea8a4c79-8ad4-4c67-ae5e-b76623986208', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e87cdfc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': 'babdcdb1592c6d841f3f8cc193371f88b89431a9128833e035ec9d61b7ae55d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e87da7c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '02430132029a66f0a9f052c5d7afa8db93230b7e6deedd69117406a7d8ef6f67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 735, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 
'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e87e3e6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': 'e321a39a48d3065807ab419205351dac5ccb9b591c7767c1caf5a92bdf473499'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e87ed96-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '29bfe174c258c1bb6f696c6e739403318fd73753e68192bea491df297f7fb836'}, {'source': 'openstack', 
'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 42, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e87f78c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': 'd756920715b9d73d0aeee6dde38c6b085fe9f7cba9f97e165fd92513ec3773bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram'
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: ate': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e880100-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '009acf457e02abf138c5f46560793aa967cde84621209b669d60e5a1fd5d3601'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 714, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e880bc8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '4c843cc5451dea7fd241ede3e5d4395cee29be02bd78b1cec857c54ed3b65420'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 
'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e8815d2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '9ddb7cba98ca8aaf89d1244d1d247ead4797d5dd66ae6e0e94adec905a20c266'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 733, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e882018-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': '93f03eb8b3677117369053a7612b9057403a139d2ee151c513b27a0e24bb06ff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.219579', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e882810-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 
3859.907228714, 'message_signature': '998b72ad8e47fab405634ac564ffdfbd727e425a8055470b114e5fbb096a5582'}]}, 'timestamp': '2025-10-08 15:22:36.222176', '_unique_id': '9833e8dcf147427a8bc33b836807636a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.223 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.224 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.224 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.224 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.outgoing.bytes volume: 37047 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.224 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.outgoing.bytes volume: 1086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.225 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.outgoing.bytes volume: 8229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.225 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.outgoing.bytes volume: 4748 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1fd288a-f538-438a-a5db-02b9558b1915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.224388', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e88892c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '1d575a69b6a94239c6b2a7497f5babbb0a24afb1749b8a32a03d2de90d630148'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 37047, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.224388', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e8892fa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '1a3d83b900c89ac1d330bb91cf9816d356dc70dbace9eaf026e0fdb656f3c3fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1086, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.224388', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 
'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e889bc4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '5ea4c6f13aba1b763c972e2be0d8fa513bccf55d0fec4678f2de7bf10bd6794a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8229, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.224388', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e88a498-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': 'df4dc098146a3ed32616e312047a0cabd0a0769d60f784ee4ca0c58e9ef682c8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4748, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.224388', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e88ac40-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 'ca866af8d43d437ef3819157d1357629a66ee8a4856d535080c75fb3a257a540'}]}, 'timestamp': '2025-10-08 
15:22:36.225540', '_unique_id': '2193544bb4a1445e97469118bd0fbed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.226 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.246 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/cpu volume: 9040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.264 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/cpu volume: 43140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.279 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/cpu volume: 24110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.295 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/cpu volume: 42400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.204 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.313 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/cpu volume: 40780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b46ea478-8332-418d-a9d2-673c34f63d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9040000000, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'timestamp': '2025-10-08T15:22:36.226756', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '9e8bf648-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.969567768, 'message_signature': 'ed7a4217c5f4b84d9d03c7630a2e53c43f7f5c2a97debd15d15c35467ad9326c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43140000000, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 
'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'timestamp': '2025-10-08T15:22:36.226756', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '9e8ea050-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.987053668, 'message_signature': '19225db9e36643706ef13593f337d1a1eb2faa401f509455e889dbbccb0a7949'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24110000000, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'timestamp': '2025-10-08T15:22:36.226756', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '9e910084-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.00273955, 'message_signature': '1b77bd17c031b4225aa249d0ad0f5b21d4308c4116ae4a917108f25c496c9cce'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42400000000, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'timestamp': '2025-10-08T15:22:36.226756', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '9e937f26-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.018732672, 'message_signature': 'e976092dbd0060c4fcc96484489e9a3547a9b12281266bcbf341103adb4d661e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40780000000, 'user_id': 
'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'timestamp': '2025-10-08T15:22:36.226756', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '9e963400-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.036440848, 'message_signature': '4256cf1623e647e75628bf9895d910ee43b5ece947054e2b051e584771bc0dbf'}]}, 'timestamp': '2025-10-08 15:22:36.314308', '_unique_id': 'f177310582aa4aacb49defdbebabb984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.318 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.318 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.318 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.318 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.319 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.allocation volume: 17829888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.319 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.319 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.319 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.319 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.allocation volume: 153100288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d1db2ef-b3b8-44ca-ae55-e55a5b5bb531', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e96d842-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': '4ce9b1cfa0c7c125ce04a327da24f4d9ea5b3259a51aa1d78fd7d648eacf217a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 
'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e96e116-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': 'a7be871bc70a55ff5fca649aaa3e6a58b779815c06cb151d0f815f8052b0c742'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e96e936-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': '10ea816eccfd4b22ad986a10dec9f0f02b3c266146c56ab1b480442093ffae61'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e96f0ca-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': '18bb38972b711256073b476a312cc92287d4402a82280fd2121815a7396218ed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 
'counter_unit': 'B', 'counter_volume': 17829888, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e96f93a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': '4ec1fbd24594a64f8eba9d0692495d865610c30eb239f9d9611081b710e4286a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 11111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9700d8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': 'a61cf424d843b12f48a61a70f0336aef8480369528d60bcd15bc5957d890c001'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e97084e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': '224f7e159957ebfa59ef99d4584681817ec1c22e35a0f949131d561d23ce5ce6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 
'625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e970f88-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': 'a0d0b9534cbc4d519272d0928ee87e754fd058e20e0d90bf46f4626ef35b084d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153100288, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 
'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9716e0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': 'd58580a3034bf8da4df6e2f83c029f8bacc2b84bb7ebda70480f9565da4e9f85'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.318132', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e972004-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': 'ea1af0205e1277b3c8853ac51bb926bb51f0aa8b13a9d4ee89c22ab383ba027a'}]}, 
'timestamp': '2025-10-08 15:22:36.320261', '_unique_id': '6e038da9c3ba40e9aff21e16a022ed13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.212 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.329 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.329 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.329 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.330 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.330 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07a352ba-d87a-41b2-8352-1fba91264d8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.329312', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e988d90-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '35ecbde1db662128b78ac90e4cb0d200d154ddfa2a7eb9d2d8537b5d84ab6704'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.329312', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e9896dc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': 'e6a7e97ee4927ff64331573a5d0b9c0481ca3d97583f7fa3ad6cce55d2507cd8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.329312', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 
'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e989f24-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': 'dcf82b57cecf978e9041192cd142bb5c142d7c141e5b055e09f92dbd70bf04c5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.329312', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e98a992-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': 'a12f41c4628ea083dbb2943c02378998fd9f34130cd1292b64efa1419b4681ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.329312', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e98b392-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 'c19b5de206979d147cef53dd4c3abcb3dcb3a3ee913f4c101c66eccfd2382803'}]}, 'timestamp': '2025-10-08 15:22:36.330592', 
'_unique_id': 'f7514edb0fd74ee596339f05c7412b93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.331 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.requests volume: 5688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.334 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.334 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.requests volume: 11707 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.335 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.335 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.requests volume: 9866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.335 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.requests volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.335 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.requests volume: 11694 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.336 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.336 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.requests volume: 11527 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.336 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a9b75d2-640c-4808-9d8e-e5b410c17f64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5688, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e98f136-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '11035a29a6d208f553a2919551fd5dc0f47b1ba8aa9e415650afaa90f026015d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9959a0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '6918fcec5c1fddd38c48f96be1dc7ad50966b1aef05ffaa4ca464364ddd9d44e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11707, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 
'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e996314-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '45e459e781f3bf878b46d545a7440f2256ef6c66464dbc1b0fc9fbe7a311b190'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e997142-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': 'a7c6578b185764116d4ef4e1f0181797aec604f48ca96536b5155e2b48926671'}, {'source': 'openstack', 
'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 9866, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e997a66-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '9edf2e716874dc12390e5d63272666673d1f9aa186d74cf5e5f4e2bd1984a206'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 90, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, '
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: k_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e998344-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '56bf7782e037131e4719283f713780ccdafd3e662775ee37c08da7d65e94f425'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11694, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e998c54-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': 'd019a938bd96c03a269446e3ddefafb78d0ee2f16bdc27c763cc502d35c06f68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 
'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e99973a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '047eca13ca4edc1f6adc23daf95b7be88bfdac170993e0c47547e5c8de28949e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11527, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e999f8c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': '73942cd416adcb8432d9cfa65f8b76cb0ee1d6394da6f6dcbdc1c090f91583b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.331943', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e99a7ca-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 
3859.907228714, 'message_signature': 'e06b579c34a9c45d7c9bd44342c43fc58ecfc3b685664bc56018fd0821a853f5'}]}, 'timestamp': '2025-10-08 15:22:36.336840', '_unique_id': 'dc16161cd9764fb994c3d547d0db2a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.340 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.341 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.341 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.341 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a7f8776-85fb-4471-9d16-2e9813a6f462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.340690', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e9a4ef0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': 'bc9f790ea1ef919e0652eff635bca90867ae2affda041a28dbad01e78d264720'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.340690', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e9a5c88-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': 'b5510d0054a72e9249b19225350dfd7351a3b13c23dfe99f941501017318a6db'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.340690', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': 
'9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e9a68f4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': 'bb4cb2b5da32935828f2f65c290460fc7d07ac57ae906d62682321f0121b5d51'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.340690', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e9a752e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': '4b7dbec21987187041ce20ea9489df6ed0e1047efbb233179b7ef245dc9a4df3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.340690', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e9a81f4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 
'ac9f0ea99ff28ae887afde395725fd0738c5b38e4de9ddeb37a2b8ff17cefc36'}]}, 'timestamp': '2025-10-08 15:22:36.342433', '_unique_id': 'ff1f0f7c90eb4820a089cdbb555b0e14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.223 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.343 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.345 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.346 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.incoming.bytes volume: 25003 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.346 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.incoming.bytes volume: 1688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.346 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.incoming.bytes volume: 5653 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.346 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.incoming.bytes volume: 2552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d75493d-3d0e-4958-b58d-4ee1affc4092', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.345913', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e9b16b4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '8aae114f15c946d9e0086e983b4e0d0320a781f55a776a55aeaa2d984529f647'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 25003, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.345913', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e9b206e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '3b5d7a45d75c6ac5a29d782c3ed347581ac22cf0572fc50bb7fb595ba584e1a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1688, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.345913', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 
'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e9b2938-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '5dcc32513f5b5a0b4d057e0b2c02d28e13e4de5938243adec75858dcb377f87c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5653, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.345913', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e9b31e4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': 'bb289777d31466a915f75786daeba16a3242a19e487ed1b2ed2bb463f641ed3b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2552, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.345913', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e9b3a90-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 'e4f9bd026a54ebd49dfdb93eee25fff1368c8fb0fc89508236c0dd2c7696e558'}]}, 'timestamp': '2025-10-08 
15:22:36.347177', '_unique_id': '3eea8c29631c4706989616ea3233ce88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.348 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.348 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.348 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 08e4113f-f3be-424f-926e-62e20b3ad767: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.348 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/memory.usage volume: 234.89453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.348 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/memory.usage volume: 172.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.349 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/memory.usage volume: 234.2578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.351 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/memory.usage volume: 233.9140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aafb915-7c18-4c1f-bb2f-2c0544927b04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 234.89453125, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'timestamp': '2025-10-08T15:22:36.348415', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '9e9b7d16-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.987053668, 'message_signature': 'd2803e9a6c564d3e53ee1755b7973009e807309a08450243bf7b2a57a944f9d4'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 172.40234375, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 
'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'timestamp': '2025-10-08T15:22:36.348415', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '9e9b85a4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.00273955, 'message_signature': '68333676c511e13664efd2f0dd108d9ae275f064231e81b51fc5492a852ac7af'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 234.2578125, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'timestamp': '2025-10-08T15:22:36.348415', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '9e9bf1c4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.018732672, 'message_signature': '111ea559b99f50539ac47780c639fb323b8bb663e9a7a327dd70aca6ddb5e5ad'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 233.9140625, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'timestamp': '2025-10-08T15:22:36.348415', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '9e9bfe08-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3860.036440848, 'message_signature': '739b6d092b5225d533711233c8109cee3dbc8fd8927764e10fb30c086eb9f767'}]}, 'timestamp': '2025-10-08 15:22:36.352196', '_unique_id': 'a94fd01222454fdabd3bb5b4affd688f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.354 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.354 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.incoming.packets volume: 148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.355 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.355 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.incoming.packets volume: 33 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.355 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43f495bc-73eb-46ab-a006-4e915757484e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.354469', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e9c6604-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': '05191fe9c81c1e0c947ddbb7bc8d3e58e3c38004fd575ac47d6772e4758d78eb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 148, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.354469', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e9c722a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': '740ef91ae163e71a38736d8f681818b08d21f4ce4ed0975e58d1ba821c95d8c5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.354469', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 
'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e9c7b26-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '2d33a6d17fb779dba2ecb567aaff7c8b66db8259c4cd43d72d5dc13cf1eea435'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 33, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.354469', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e9c8332-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': 'ccf16c622dec648bc6a6eecd8f7e2f4dcbc6f347f2ee53113d64fed16db97c86'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.354469', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e9c8bf2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 'c46562e6c08ceedf6fb4147366c5813f34672781b13baa719852807af047a5ad'}]}, 'timestamp': 
'2025-10-08 15:22:36.355810', '_unique_id': 'c4074cb872ea4228b51695c7f3a1cb11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.357 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.358 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.latency volume: 13709308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.358 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.358 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.latency volume: 17895864982 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.358 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.358 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.latency volume: 389826237 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.359 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.359 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.latency volume: 17697492630 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.359 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.359 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.latency volume: 21802908926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.360 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.320 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cb50393-369c-49ab-9ed2-743bff769729', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13709308, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9ceb42-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '91a1445ffa3bc63b0f0f3ef84788fe72123b2f209024cb628a68a4269af02eb1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9cf3a8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '029b52cc9acec0a55777a66abbde9bc23242f0af8a50a4e4915ff4418140f902'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17895864982, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 
'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9cfb6e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': 'e26497ad7b4a76ceaace45b7047f5a163e09cbf0f0d0a44040ffa087ec8c801a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9d04ba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': '4fb562e990b64d07510ea48edc3570b8e512d4dc453cd47e2fdbee0c9687de89'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 389826237, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9d0ec4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': 'e10730c4a71153d11e15d389c149eff60108edb050868efdf473733d285242f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk'
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: ge': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9d19c8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '1e66a2aa73a57ff6e60fd442bb09806b8918718da3a715ca5194d1dc41c18807'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17697492630, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9d23f0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '490e5841937d2a3fc9c92134f157966a9edf8d5a7038e9a9d3348fa5e0727731'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 
'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9d2dc8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': 'd9db983a7bc24b0293365c763ea1cf2751cfd0035580902d091b4f14e2e245e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21802908926, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9d3890-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': '939949217d804c0eb9504f7c74090212a54cad6ecbde0c8a284441ce92d2cae5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.357981', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9d43a8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 
'message_signature': 'fa2bd1b8067d1caba639cdb3ccde2b5b44c885aa8956d5272612d6d9dd1afec0'}]}, 'timestamp': '2025-10-08 15:22:36.360483', '_unique_id': '47cfec8f7d984ac585a8dd3cca5af41c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.362 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.362 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.362 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.363 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.364 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.364 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.365 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.366 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.366 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.367 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3fcf675-1e4c-4736-8aa4-8c93f16ddb91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9d93a8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': 'cb7b5f79bef53d0835c6c9bf649467b40cb8204b6c53a6c3c4666901711305f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 
'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9d9db2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.759634149, 'message_signature': '97727e4c872ecba671ab5ece7c9bd674b42be274fccda6763591c4526b6ad886'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9daaaa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': '5b939095177cefac607ffcfceddc33a151a528ab194e43fccf4861dafc884131'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9dc6d4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.771797318, 'message_signature': 'fdc54c8fef67798113b79d649f35c8d35ecde5b4ac61e943a8acaa1233529895'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 
'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9de1fa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': '91371213740d1c5b64d20faa0868eefeedbf34abebf70cb9a29ae07fcf98722d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'ac
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 11'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9e0504-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.78340367, 'message_signature': 'd36e559f6e8199aecabb4106d0f0ee3828e052de8cd87ae536326dd389b7e47d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9e1800-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': '15eb96fedb05b16e47f7155016e63f352cc1b105ca640111751ffa221e3bd6d4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 
'625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9e37b8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.800628132, 'message_signature': '7b7a6af711816452145dccec56049be2da5cbde45aa45b6088a186acb1b8c267'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 
'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9e5522-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': '9bf4b3084392cd476c1591fba7564965e8bee235e5480401be28e8c13582af85'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.362287', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9e6e90-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.814589198, 'message_signature': '02bdf9f8f0239ff5b5e322ad35a9ea35e4ec40bf3965f6c23ea594ca4cba8377'}]}, 
'timestamp': '2025-10-08 15:22:36.368408', '_unique_id': '6bfc6f51a9864e869694cf12978508a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.371 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.371 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.bytes volume: 93131776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.371 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.371 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.bytes volume: 330397184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.372 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.372 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.bytes volume: 231744512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.372 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/disk.device.read.bytes volume: 295160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.372 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.bytes volume: 330335744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.372 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.373 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.373 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71103a26-785b-4bfb-82a4-2e7b11cea751', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93131776, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-vda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9ef3ce-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '716c63fea168d107dd1620c8c770b2d65311f4f1bd0b3efdbe69dd836ce5de8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': 
'7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': '08e4113f-f3be-424f-926e-62e20b3ad767-sda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'instance-00000012', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9efdba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.835486677, 'message_signature': '55bbe50c2423ead658c1ec45d29f86e317b37ae4deb059a609ef3b5a7943b91f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330397184, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-vda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9f06a2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': 'e42f80dd2a8cf51525807b90dcc39107cf134d40a113d676a5085841cef93406'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '64549fc7-989f-473a-99bb-78947d8d7536-sda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'instance-0000000a', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9f0fda-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.856316254, 'message_signature': 'efbf95cafe9b835b0b7dc651cbd61f8e1b642eddfefe7584d73526446b911684'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 231744512, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-vda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9f1944-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': '85fa5eb459dc7a2481168077b70e5218c9728b8be954fd32743db4a1ba281c64'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 295160, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf-sda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'instance-0000000f', 'instance_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 
'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'epheme
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9f218c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.871160928, 'message_signature': 'b40529ff1e7179e6251aaaf01443850715c22f5e87d6ecd93ad48e91e323eba1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330335744, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-vda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9f2b50-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '20a080c1fa74f7d80f1b4f1e48ea799b4bd3746c1eed1e2f7715bb294230b31d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 348408, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': '341c177f-c391-41dd-bf3c-14c2076057eb-sda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'instance-0000000b', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9f32e4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.891572392, 'message_signature': '153799a20cf6b2cf5e1fbf02bf92e35f4ebee71180263a570e7870def5c967a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-vda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9e9f3c3a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': '98bdd1163c6fbc5a4df5525f41d9157b3cc5698d58c0682b4acdde4dcdd26c68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'a71ee5d2-21b8-4455-8870-f20bed682909-sda', 'timestamp': '2025-10-08T15:22:36.371288', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'instance-00000009', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9e9f4392-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.907228714, 'message_signature': 
'58ebf31865edecf4f8ff5ffbf7b47c6d651d142f5395aa28737efa1785fc50be'}]}, 'timestamp': '2025-10-08 15:22:36.373584', '_unique_id': '21ad914ef9e94e2996f2c0aa2c5971d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.337 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.374 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.375 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_flooding_when_special_groups-185393400>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-1532029749>, <NovaLikeServer: tempest-multicast-server-vlan-transparent-35634410>, <NovaLikeServer: tempest-broadcast-receiver-1467126576>, <NovaLikeServer: tempest-test_flooding_when_special_groups-542526277>]
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.375 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.375 12 DEBUG ceilometer.compute.pollsters [-] 08e4113f-f3be-424f-926e-62e20b3ad767/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.375 12 DEBUG ceilometer.compute.pollsters [-] 64549fc7-989f-473a-99bb-78947d8d7536/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.375 12 DEBUG ceilometer.compute.pollsters [-] 9992bf78-8d8e-43c7-a8cc-5606d8c910cf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.376 12 DEBUG ceilometer.compute.pollsters [-] 341c177f-c391-41dd-bf3c-14c2076057eb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.376 12 DEBUG ceilometer.compute.pollsters [-] a71ee5d2-21b8-4455-8870-f20bed682909/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0104abb7-974f-4684-a94b-17754f3c7df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000012-08e4113f-f3be-424f-926e-62e20b3ad767-tap3e1bce81-bd', 'timestamp': '2025-10-08T15:22:36.375340', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-185393400', 'name': 'tap3e1bce81-bd', 'instance_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ce:a3:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e1bce81-bd'}, 'message_id': '9e9f90ae-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.735943831, 'message_signature': 'f518b34a4e382da05f4371239e389afe929953064954fd280e389f65fbc550ed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000a-64549fc7-989f-473a-99bb-78947d8d7536-tap4689d9d8-d6', 'timestamp': '2025-10-08T15:22:36.375340', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-1532029749', 'name': 'tap4689d9d8-d6', 'instance_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:bc:4a:7f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4689d9d8-d6'}, 'message_id': '9e9f99fa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.739652079, 'message_signature': 'bda52b105a8971232a899c89e43cfca5a82b255b57cc9165d04d498cfbf6675d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8f9ed00bd5cc488a9d2a77380f12a503', 'user_name': None, 'project_id': '27fe52d14e2143a887b0445eb5cfca72', 'project_name': None, 'resource_id': 'instance-0000000f-9992bf78-8d8e-43c7-a8cc-5606d8c910cf-tap5800d2b5-1c', 'timestamp': '2025-10-08T15:22:36.375340', 'resource_metadata': {'display_name': 'tempest-multicast-server-vlan-transparent-35634410', 'name': 'tap5800d2b5-1c', 'instance_id': 
'9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'instance_type': 'custom_neutron_guest', 'host': 'a248eeeb5ef2c117eefb333bf409b9712460867c0e66c5e030e70d66', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:df:5b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5800d2b5-1c'}, 'message_id': '9e9fa418-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.74530853, 'message_signature': '0b32ed5eb5a577fd97c25b9d27a26eda29942d85b777ef7ef6e14b4d28d45489'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '625a85fb4a424c84b99b84adcf899810', 'user_name': None, 'project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'project_name': None, 'resource_id': 'instance-0000000b-341c177f-c391-41dd-bf3c-14c2076057eb-tap046dc8a5-fa', 'timestamp': '2025-10-08T15:22:36.375340', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-1467126576', 'name': 'tap046dc8a5-fa', 'instance_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'instance_type': 'custom_neutron_guest', 'host': 'bd01977109c0d571ba9c41f7cbbf1b686a1ac35b140e12441b0bd337', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0f:3d:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap046dc8a5-fa'}, 'message_id': '9e9faf9e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.748373258, 'message_signature': '4d3e11816c7f76f1ebfb608230db0954db246c98ec437f50569975e6accedf0b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000009-a71ee5d2-21b8-4455-8870-f20bed682909-tapf66c148b-4c', 'timestamp': '2025-10-08T15:22:36.375340', 'resource_metadata': {'display_name': 'tempest-test_flooding_when_special_groups-542526277', 'name': 'tapf66c148b-4c', 'instance_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:77:9d:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66c148b-4c'}, 'message_id': '9e9fba16-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3859.75248314, 'message_signature': 
'45f5847d17fb16e71ce1dab592aa087eac41d4c45b7a54cf30347e6252dc7e3c'}]}, 'timestamp': '2025-10-08 15:22:36.376655', '_unique_id': 'bf4109281c94442499f339f9b621e57d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:22:36.377 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.361 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.370 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:22:36.374 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:22:36 np0005476733 nova_compute[192580]: 2025-10-08 15:22:36.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.415 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.416 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.433 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.558 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.559 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.566 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.567 2 INFO nova.compute.claims [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.777 2 DEBUG nova.compute.provider_tree [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.808 2 DEBUG nova.scheduler.client.report [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.834 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.835 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.897 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.897 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.900 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.901 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.901 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.933 2 INFO nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:22:38 np0005476733 nova_compute[192580]: 2025-10-08 15:22:38.972 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.111 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.113 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.114 2 INFO nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Creating image(s)#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.115 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.116 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.117 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.136 2 DEBUG nova.policy [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.140 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.213 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.214 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.215 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.230 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.310 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.313 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.357 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk 10737418240" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.358 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.359 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.419 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.421 2 DEBUG nova.objects.instance [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid 656c0a96-03f3-4a70-baac-01de2a126a91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.521 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.522 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Ensure instance console log exists: /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.523 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.523 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.523 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:39 np0005476733 nova_compute[192580]: 2025-10-08 15:22:39.541 2 INFO nova.compute.manager [None req-35da35cb-00b0-486b-a222-a3854fd9f15a f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:40 np0005476733 nova_compute[192580]: 2025-10-08 15:22:40.967 2 INFO nova.compute.manager [None req-3414c9d3-628e-4c10-8bf2-4265973f2b8f 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:40 np0005476733 nova_compute[192580]: 2025-10-08 15:22:40.974 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:41 np0005476733 nova_compute[192580]: 2025-10-08 15:22:41.403 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Successfully created port: 59f58b79-9163-41ba-8e03-7430e5def4ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:22:41 np0005476733 nova_compute[192580]: 2025-10-08 15:22:41.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:41 np0005476733 nova_compute[192580]: 2025-10-08 15:22:41.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:42 np0005476733 nova_compute[192580]: 2025-10-08 15:22:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:42 np0005476733 nova_compute[192580]: 2025-10-08 15:22:42.957 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Successfully updated port: 59f58b79-9163-41ba-8e03-7430e5def4ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.242 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.244 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.244 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:22:43 np0005476733 podman[224951]: 2025-10-08 15:22:43.249838796 +0000 UTC m=+0.072072458 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.415 2 DEBUG nova.compute.manager [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-changed-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.415 2 DEBUG nova.compute.manager [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Refreshing instance network info cache due to event network-changed-59f58b79-9163-41ba-8e03-7430e5def4ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.416 2 DEBUG oslo_concurrency.lockutils [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.419 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:22:43 np0005476733 nova_compute[192580]: 2025-10-08 15:22:43.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.259 2 DEBUG nova.network.neutron [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updating instance_info_cache with network_info: [{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.567 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.567 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Instance network_info: |[{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.568 2 DEBUG oslo_concurrency.lockutils [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.568 2 DEBUG nova.network.neutron [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Refreshing network info cache for port 59f58b79-9163-41ba-8e03-7430e5def4ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.570 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Start _get_guest_xml network_info=[{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.575 2 WARNING nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.580 2 DEBUG nova.virt.libvirt.host [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.580 2 DEBUG nova.virt.libvirt.host [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.584 2 DEBUG nova.virt.libvirt.host [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.584 2 DEBUG nova.virt.libvirt.host [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.585 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.585 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.585 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.585 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.586 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.587 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.587 2 DEBUG nova.virt.hardware [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.590 2 DEBUG nova.virt.libvirt.vif [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-135618235',display_name='tempest-test_multicast_after_idle_timeout-135618235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-135618235',id=19,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-7ui3kw3n',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:39Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=656c0a96-03f3-4a70-baac-01de2a126a91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.590 2 DEBUG nova.network.os_vif_util [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.591 2 DEBUG nova.network.os_vif_util [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.592 2 DEBUG nova.objects.instance [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid 656c0a96-03f3-4a70-baac-01de2a126a91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.593 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.593 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:22:44 np0005476733 nova_compute[192580]: 2025-10-08 15:22:44.593 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.020 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <uuid>656c0a96-03f3-4a70-baac-01de2a126a91</uuid>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <name>instance-00000013</name>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_multicast_after_idle_timeout-135618235</nova:name>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:22:44</nova:creationTime>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        <nova:port uuid="59f58b79-9163-41ba-8e03-7430e5def4ef">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="serial">656c0a96-03f3-4a70-baac-01de2a126a91</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="uuid">656c0a96-03f3-4a70-baac-01de2a126a91</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.config"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:5f:94:83"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <target dev="tap59f58b79-91"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/console.log" append="off"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:22:45 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:22:45 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:22:45 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:22:45 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.020 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Preparing to wait for external event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.021 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.021 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.021 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.022 2 DEBUG nova.virt.libvirt.vif [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-135618235',display_name='tempest-test_multicast_after_idle_timeout-135618235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-135618235',id=19,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-7ui3kw3n',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:22:39Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=656c0a96-03f3-4a70-baac-01de2a126a91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.022 2 DEBUG nova.network.os_vif_util [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.023 2 DEBUG nova.network.os_vif_util [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.023 2 DEBUG os_vif [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.025 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.027 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59f58b79-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.028 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap59f58b79-91, col_values=(('external_ids', {'iface-id': '59f58b79-9163-41ba-8e03-7430e5def4ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:94:83', 'vm-uuid': '656c0a96-03f3-4a70-baac-01de2a126a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:45 np0005476733 NetworkManager[51699]: <info>  [1759936965.0327] manager: (tap59f58b79-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.035 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.039 2 INFO os_vif [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91')#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.255 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.255 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.256 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.256 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a71ee5d2-21b8-4455-8870-f20bed682909 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:22:45 np0005476733 podman[224974]: 2025-10-08 15:22:45.271318006 +0000 UTC m=+0.086775749 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.585 2 INFO nova.compute.manager [None req-e291b769-3709-4e4d-824f-98c3a6d9c863 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.590 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.604 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.604 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.604 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:5f:94:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:22:45 np0005476733 nova_compute[192580]: 2025-10-08 15:22:45.604 2 INFO nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Using config drive#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.042 2 INFO nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Creating config drive at /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.config#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.047 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd2iq1t6j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.187 2 DEBUG oslo_concurrency.processutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd2iq1t6j" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:22:46 np0005476733 kernel: tap59f58b79-91: entered promiscuous mode
Oct  8 11:22:46 np0005476733 NetworkManager[51699]: <info>  [1759936966.2973] manager: (tap59f58b79-91): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.340 2 INFO nova.compute.manager [None req-304afc9e-bdf3-4ab8-977b-e41add33833e 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Get console output#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:46Z|00151|binding|INFO|Claiming lport 59f58b79-9163-41ba-8e03-7430e5def4ef for this chassis.
Oct  8 11:22:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:46Z|00152|binding|INFO|59f58b79-9163-41ba-8e03-7430e5def4ef: Claiming fa:16:3e:5f:94:83 10.100.0.3
Oct  8 11:22:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:46Z|00153|binding|INFO|Setting lport 59f58b79-9163-41ba-8e03-7430e5def4ef ovn-installed in OVS
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.363 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.367 2 INFO nova.virt.libvirt.driver [None req-304afc9e-bdf3-4ab8-977b-e41add33833e 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Truncated console log returned, 3378 bytes ignored#033[00m
Oct  8 11:22:46 np0005476733 systemd-machined[152624]: New machine qemu-12-instance-00000013.
Oct  8 11:22:46 np0005476733 systemd-udevd[225018]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:22:46 np0005476733 NetworkManager[51699]: <info>  [1759936966.3909] device (tap59f58b79-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:22:46 np0005476733 systemd[1]: Started Virtual Machine qemu-12-instance-00000013.
Oct  8 11:22:46 np0005476733 NetworkManager[51699]: <info>  [1759936966.3935] device (tap59f58b79-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.664 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:94:83 10.100.0.3'], port_security=['fa:16:3e:5f:94:83 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=59f58b79-9163-41ba-8e03-7430e5def4ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:22:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:46Z|00154|binding|INFO|Setting lport 59f58b79-9163-41ba-8e03-7430e5def4ef up in Southbound
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.666 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 59f58b79-9163-41ba-8e03-7430e5def4ef in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.669 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.681 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9ccfb838-7f89-4d10-b911-0027ced7d416]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.682 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap30cdfb1e-71 in ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.684 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap30cdfb1e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.684 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[75f9bb70-910d-4850-a448-feb1c038285a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.685 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4b825c-b0a4-485d-84ab-e2c1dff5fd44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.701 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[97c31dd2-ede5-4d5b-b50b-c737fc0d2300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.726 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[78ee7821-7551-4cde-8e50-38b5831485cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.777 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c53d7c46-6e83-4719-9d6e-96ac663cf76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 NetworkManager[51699]: <info>  [1759936966.7840] manager: (tap30cdfb1e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.788 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759936951.786848, af660b82-9b3c-4c4d-820a-3d22b73898e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:46 np0005476733 nova_compute[192580]: 2025-10-08 15:22:46.788 2 INFO nova.compute.manager [-] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.783 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[774957e2-e1c9-4ca8-922c-49cabd461012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.829 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0eadaad9-5cc4-4b93-ac94-02a73831f317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.834 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2e9f7cc3-7fdf-4c1e-b57f-957f8ce7db88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 NetworkManager[51699]: <info>  [1759936966.8660] device (tap30cdfb1e-70): carrier: link connected
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.876 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5491bf-76a7-41aa-860a-215598d2712b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.895 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd0539b-b363-493b-a86d-8d8cbc709585]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387053, 'reachable_time': 42105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225085, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.912 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4505d5-d14d-4f12-ae70-6a87e57f97f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:3ea4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387053, 'tstamp': 387053}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225086, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.932 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff6274b-bb64-4493-86ef-00d0c5bb9d2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387053, 'reachable_time': 42105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225087, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:46.967 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[619e3de3-0632-4d73-ac4b-16e58c0fa841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.041 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[592218f1-ff7e-405b-956f-44c72cf2e0a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.042 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.042 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.043 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:47 np0005476733 NetworkManager[51699]: <info>  [1759936967.0454] manager: (tap30cdfb1e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct  8 11:22:47 np0005476733 kernel: tap30cdfb1e-70: entered promiscuous mode
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.055 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:47Z|00155|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.068 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.069 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[150fd937-d3fd-4930-9335-83a7ebdb64d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.070 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:22:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:22:47.071 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'env', 'PROCESS_TAG=haproxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.260 2 DEBUG nova.compute.manager [None req-cf8a2900-d98b-4d9d-aad5-53d0f42823a5 - - - - - -] [instance: af660b82-9b3c-4c4d-820a-3d22b73898e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.273 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936967.2736223, 656c0a96-03f3-4a70-baac-01de2a126a91 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.274 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] VM Started (Lifecycle Event)#033[00m
Oct  8 11:22:47 np0005476733 podman[225119]: 2025-10-08 15:22:47.467879864 +0000 UTC m=+0.055912663 container create 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.513 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.521 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936967.2742252, 656c0a96-03f3-4a70-baac-01de2a126a91 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.521 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:22:47 np0005476733 systemd[1]: Started libpod-conmon-18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a.scope.
Oct  8 11:22:47 np0005476733 podman[225119]: 2025-10-08 15:22:47.43839938 +0000 UTC m=+0.026432219 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:22:47 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:22:47 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3554fa24416693bb2ea6b587fc627c333ee2507e5834ba877ff5ff0eee24ba67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:22:47 np0005476733 podman[225119]: 2025-10-08 15:22:47.587347472 +0000 UTC m=+0.175380291 container init 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  8 11:22:47 np0005476733 podman[225119]: 2025-10-08 15:22:47.59753935 +0000 UTC m=+0.185572159 container start 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:22:47 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [NOTICE]   (225138) : New worker (225140) forked
Oct  8 11:22:47 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [NOTICE]   (225138) : Loading success.
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.784 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.791 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:47 np0005476733 nova_compute[192580]: 2025-10-08 15:22:47.836 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.249 2 DEBUG nova.compute.manager [req-67c8fd02-71fd-42f7-a861-2634018569fa req-558c71c7-96d8-4ee5-8d01-35e734e9f245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.250 2 DEBUG oslo_concurrency.lockutils [req-67c8fd02-71fd-42f7-a861-2634018569fa req-558c71c7-96d8-4ee5-8d01-35e734e9f245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.250 2 DEBUG oslo_concurrency.lockutils [req-67c8fd02-71fd-42f7-a861-2634018569fa req-558c71c7-96d8-4ee5-8d01-35e734e9f245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.251 2 DEBUG oslo_concurrency.lockutils [req-67c8fd02-71fd-42f7-a861-2634018569fa req-558c71c7-96d8-4ee5-8d01-35e734e9f245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.251 2 DEBUG nova.compute.manager [req-67c8fd02-71fd-42f7-a861-2634018569fa req-558c71c7-96d8-4ee5-8d01-35e734e9f245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Processing event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.252 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.256 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936968.2560647, 656c0a96-03f3-4a70-baac-01de2a126a91 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.256 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.258 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.261 2 INFO nova.virt.libvirt.driver [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Instance spawned successfully.#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.261 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.277 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [{"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.287 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.290 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.298 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-a71ee5d2-21b8-4455-8870-f20bed682909" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.298 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.299 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.304 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.305 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.306 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.306 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.307 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.307 2 DEBUG nova.virt.libvirt.driver [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.310 2 DEBUG nova.network.neutron [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updated VIF entry in instance network info cache for port 59f58b79-9163-41ba-8e03-7430e5def4ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.311 2 DEBUG nova.network.neutron [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updating instance_info_cache with network_info: [{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.332 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.363 2 DEBUG oslo_concurrency.lockutils [req-a51061cd-0dca-4427-90f1-a1b8a2ecec2d req-e3fa9a74-b9d8-4f48-9ef4-c4b5adb70955 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.460 2 INFO nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Took 9.35 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.461 2 DEBUG nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.571 2 INFO nova.compute.manager [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Took 10.04 seconds to build instance.#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.620 2 DEBUG oslo_concurrency.lockutils [None req-52b36698-4128-4e37-84dc-91514066c650 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:48 np0005476733 nova_compute[192580]: 2025-10-08 15:22:48.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.347 2 INFO nova.compute.manager [None req-8e5b3b6d-1fd7-4014-b698-871ea64cb5be c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.357 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.381 2 DEBUG nova.compute.manager [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.382 2 DEBUG oslo_concurrency.lockutils [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.382 2 DEBUG oslo_concurrency.lockutils [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.382 2 DEBUG oslo_concurrency.lockutils [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.383 2 DEBUG nova.compute.manager [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] No waiting events found dispatching network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.383 2 WARNING nova.compute.manager [req-03fdd658-4f91-4b43-a7f1-448e21dd8555 req-f55dfca7-128b-4e32-81c7-0106b6ab0dbd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received unexpected event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef for instance with vm_state active and task_state None.#033[00m
Oct  8 11:22:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:50Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:a3:46 10.100.0.3
Oct  8 11:22:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:22:50Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:a3:46 10.100.0.3
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.791 2 INFO nova.compute.manager [None req-515d7814-84b7-403c-9cfa-477a84e323a8 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:50 np0005476733 nova_compute[192580]: 2025-10-08 15:22:50.799 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:52 np0005476733 podman[225150]: 2025-10-08 15:22:52.253725473 +0000 UTC m=+0.078155186 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:22:52 np0005476733 nova_compute[192580]: 2025-10-08 15:22:52.539 2 DEBUG nova.compute.manager [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-changed-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:22:52 np0005476733 nova_compute[192580]: 2025-10-08 15:22:52.541 2 DEBUG nova.compute.manager [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Refreshing instance network info cache due to event network-changed-5800d2b5-1c28-4be9-ba9d-7442de36269e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:22:52 np0005476733 nova_compute[192580]: 2025-10-08 15:22:52.542 2 DEBUG oslo_concurrency.lockutils [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:22:52 np0005476733 nova_compute[192580]: 2025-10-08 15:22:52.542 2 DEBUG oslo_concurrency.lockutils [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:22:52 np0005476733 nova_compute[192580]: 2025-10-08 15:22:52.543 2 DEBUG nova.network.neutron [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Refreshing network info cache for port 5800d2b5-1c28-4be9-ba9d-7442de36269e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:22:53 np0005476733 nova_compute[192580]: 2025-10-08 15:22:53.656 2 DEBUG nova.network.neutron [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updated VIF entry in instance network info cache for port 5800d2b5-1c28-4be9-ba9d-7442de36269e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:22:53 np0005476733 nova_compute[192580]: 2025-10-08 15:22:53.657 2 DEBUG nova.network.neutron [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updating instance_info_cache with network_info: [{"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:22:53 np0005476733 nova_compute[192580]: 2025-10-08 15:22:53.690 2 DEBUG oslo_concurrency.lockutils [req-04b5e2c0-76ff-43bc-b64b-ec9734bf9cde req-f4c47ff4-0be4-444a-977e-40d87ae4ca32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9992bf78-8d8e-43c7-a8cc-5606d8c910cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:22:53 np0005476733 nova_compute[192580]: 2025-10-08 15:22:53.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:55 np0005476733 nova_compute[192580]: 2025-10-08 15:22:55.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:22:55 np0005476733 podman[225167]: 2025-10-08 15:22:55.246698272 +0000 UTC m=+0.072708921 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:22:55 np0005476733 nova_compute[192580]: 2025-10-08 15:22:55.501 2 INFO nova.compute.manager [None req-98acb432-69a7-4dae-9b32-127c770f767c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:22:56 np0005476733 nova_compute[192580]: 2025-10-08 15:22:56.002 2 INFO nova.compute.manager [None req-218b9e84-d567-4659-8229-32fbc8ba93d8 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:22:56 np0005476733 nova_compute[192580]: 2025-10-08 15:22:56.008 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:22:56 np0005476733 nova_compute[192580]: 2025-10-08 15:22:56.014 2 INFO nova.virt.libvirt.driver [None req-218b9e84-d567-4659-8229-32fbc8ba93d8 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Truncated console log returned, 3017 bytes ignored#033[00m
Oct  8 11:22:56 np0005476733 podman[225187]: 2025-10-08 15:22:56.256934138 +0000 UTC m=+0.086466982 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller)
Oct  8 11:22:58 np0005476733 nova_compute[192580]: 2025-10-08 15:22:58.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:00 np0005476733 nova_compute[192580]: 2025-10-08 15:23:00.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:00 np0005476733 nova_compute[192580]: 2025-10-08 15:23:00.810 2 INFO nova.compute.manager [None req-cc36ac7c-c588-4ded-bfb0-418372a6f944 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:23:01 np0005476733 nova_compute[192580]: 2025-10-08 15:23:01.187 2 INFO nova.compute.manager [None req-325e09d7-8087-4f8d-8cd3-d9e055aeb626 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Get console output#033[00m
Oct  8 11:23:01 np0005476733 nova_compute[192580]: 2025-10-08 15:23:01.190 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:01 np0005476733 nova_compute[192580]: 2025-10-08 15:23:01.193 2 INFO nova.virt.libvirt.driver [None req-325e09d7-8087-4f8d-8cd3-d9e055aeb626 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Truncated console log returned, 3304 bytes ignored#033[00m
Oct  8 11:23:01 np0005476733 podman[225238]: 2025-10-08 15:23:01.245697809 +0000 UTC m=+0.071564335 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  8 11:23:03 np0005476733 nova_compute[192580]: 2025-10-08 15:23:03.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.193 2 DEBUG nova.compute.manager [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-changed-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.193 2 DEBUG nova.compute.manager [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Refreshing instance network info cache due to event network-changed-3e1bce81-bd3f-433a-aad4-1b90ad016699. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.193 2 DEBUG oslo_concurrency.lockutils [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.194 2 DEBUG oslo_concurrency.lockutils [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.194 2 DEBUG nova.network.neutron [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Refreshing network info cache for port 3e1bce81-bd3f-433a-aad4-1b90ad016699 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:23:05 np0005476733 podman[225289]: 2025-10-08 15:23:05.228959795 +0000 UTC m=+0.054052483 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:23:05 np0005476733 podman[225288]: 2025-10-08 15:23:05.259960939 +0000 UTC m=+0.082949640 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.947 2 INFO nova.compute.manager [None req-c9d7620a-bd9e-4574-9ade-d3ad5c357131 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:23:05 np0005476733 nova_compute[192580]: 2025-10-08 15:23:05.952 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.163 2 DEBUG nova.network.neutron [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updated VIF entry in instance network info cache for port 3e1bce81-bd3f-433a-aad4-1b90ad016699. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.163 2 DEBUG nova.network.neutron [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updating instance_info_cache with network_info: [{"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.197 2 DEBUG oslo_concurrency.lockutils [req-5eb3073a-6d18-4d3a-b3d0-1579a793ffc9 req-65a4520a-1b07-41b6-8c68-60b8a2ab8281 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-08e4113f-f3be-424f-926e-62e20b3ad767" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.402 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.402 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.421 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.502 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.502 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.518 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.518 2 INFO nova.compute.claims [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.759 2 DEBUG nova.compute.provider_tree [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.777 2 DEBUG nova.scheduler.client.report [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.799 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.799 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.856 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.856 2 DEBUG nova.network.neutron [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.878 2 INFO nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.900 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.996 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.998 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:23:08 np0005476733 nova_compute[192580]: 2025-10-08 15:23:08.999 2 INFO nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Creating image(s)#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.000 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.000 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.002 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.029 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.090 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.091 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.092 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.103 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.169 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.170 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.207 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk 10737418240" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.208 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.209 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.228 2 DEBUG nova.policy [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.262 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.263 2 DEBUG nova.objects.instance [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'migration_context' on Instance uuid 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.281 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.281 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Ensure instance console log exists: /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.282 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.282 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:09 np0005476733 nova_compute[192580]: 2025-10-08 15:23:09.282 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.151 2 DEBUG nova.network.neutron [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Successfully updated port: 0bb60f77-cd96-4dfd-9810-5583ec966cb2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.174 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.174 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquired lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.174 2 DEBUG nova.network.neutron [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.373 2 DEBUG nova.compute.manager [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-changed-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.373 2 DEBUG nova.compute.manager [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Refreshing instance network info cache due to event network-changed-0bb60f77-cd96-4dfd-9810-5583ec966cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.374 2 DEBUG oslo_concurrency.lockutils [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:10 np0005476733 nova_compute[192580]: 2025-10-08 15:23:10.817 2 DEBUG nova.network.neutron [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:23:11 np0005476733 nova_compute[192580]: 2025-10-08 15:23:11.464 2 INFO nova.compute.manager [None req-0a424378-a5f6-49e2-8002-ff42dbc1986d c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:23:11 np0005476733 nova_compute[192580]: 2025-10-08 15:23:11.469 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:12 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:12Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:94:83 10.100.0.3
Oct  8 11:23:12 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:12Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:94:83 10.100.0.3
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.861 2 DEBUG nova.network.neutron [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.884 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Releasing lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.885 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Instance network_info: |[{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.886 2 DEBUG oslo_concurrency.lockutils [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.886 2 DEBUG nova.network.neutron [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Refreshing network info cache for port 0bb60f77-cd96-4dfd-9810-5583ec966cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.889 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Start _get_guest_xml network_info=[{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.896 2 WARNING nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.906 2 DEBUG nova.virt.libvirt.host [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.907 2 DEBUG nova.virt.libvirt.host [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.923 2 DEBUG nova.virt.libvirt.host [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.924 2 DEBUG nova.virt.libvirt.host [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.924 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.924 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.925 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.925 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.925 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.926 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.926 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.926 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.926 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.926 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.927 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.927 2 DEBUG nova.virt.hardware [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
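The topology search logged above (1 vCPU, no flavor/image constraints, limits 65536:65536:65536, one candidate 1:1:1) boils down to enumerating (sockets, cores, threads) factorizations of the vCPU count. A simplified sketch of that idea, assuming illustrative names (this is not the actual `nova.virt.hardware` code):

```python
from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus,
    respecting per-dimension upper limits (the 65536 defaults match the log)."""
    topologies = []
    for s, c, t in product(range(1, min(vcpus, max_sockets) + 1),
                           range(1, min(vcpus, max_cores) + 1),
                           range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            topologies.append((s, c, t))
    return topologies

# For the 1-vCPU flavor in the log, only one topology is possible.
print(possible_topologies(1))  # [(1, 1, 1)]
```

With more vCPUs (e.g. 4) several factorizations qualify, which is why Nova then sorts the candidates by preference before choosing one.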
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.930 2 DEBUG nova.virt.libvirt.vif [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-123-1908290520',display_name='tempest-broadcast-receiver-123-1908290520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-123-1908290520',id=20,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-52vk5sn9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:23:08Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1a3ae685-bd3d-4f36-ad77-9f5b6b95677f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.931 2 DEBUG nova.network.os_vif_util [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.932 2 DEBUG nova.network.os_vif_util [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:13 np0005476733 nova_compute[192580]: 2025-10-08 15:23:13.933 2 DEBUG nova.objects.instance [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'pci_devices' on Instance uuid 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.028 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <uuid>1a3ae685-bd3d-4f36-ad77-9f5b6b95677f</uuid>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <name>instance-00000014</name>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:name>tempest-broadcast-receiver-123-1908290520</nova:name>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:23:13</nova:creationTime>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:user uuid="843ea0278e174175a6f8e21731c1383e">tempest-BroadcastTestVlanTransparency-538458942-project-member</nova:user>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:project uuid="7e1086961263487db8a3c5190fdf1b2e">tempest-BroadcastTestVlanTransparency-538458942</nova:project>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        <nova:port uuid="0bb60f77-cd96-4dfd-9810-5583ec966cb2">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="serial">1a3ae685-bd3d-4f36-ad77-9f5b6b95677f</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="uuid">1a3ae685-bd3d-4f36-ad77-9f5b6b95677f</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.config"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:a6:aa:3b"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <target dev="tap0bb60f77-cd"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/console.log" append="off"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:23:14 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:23:14 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:23:14 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:23:14 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
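When triaging spawn logs it is often handy to pull individual fields out of the dumped `<domain>` document programmatically rather than by eye. A minimal sketch using Python's standard library against a trimmed copy of the XML above (only the fields being inspected are kept; note that libvirt's `<memory>` element is in KiB):

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the <domain> document logged by _get_guest_xml above.
domain_xml = """<domain type="kvm">
  <uuid>1a3ae685-bd3d-4f36-ad77-9f5b6b95677f</uuid>
  <name>instance-00000014</name>
  <memory>1048576</memory>
  <vcpu>1</vcpu>
</domain>"""

root = ET.fromstring(domain_xml)
uuid = root.findtext("uuid")
memory_mib = int(root.findtext("memory")) // 1024  # libvirt <memory> is KiB
vcpus = int(root.findtext("vcpu"))
print(uuid, memory_mib, vcpus)  # 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f 1024 1
```

The 1024 MiB / 1 vCPU values line up with the `custom_neutron_guest` flavor shown earlier in the log.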
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.030 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Preparing to wait for external event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.030 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.031 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.031 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
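The prepare/acquire/release sequence above registers the `network-vif-plugged` event *before* the VIF is plugged, so a fast callback from Neutron cannot be lost. The pattern can be sketched with a plain `threading.Event` (illustrative names only, not Nova's `InstanceEvents` implementation):

```python
import threading

class InstanceEvents:
    """Sketch of the prepare-then-wait pattern visible in the log."""
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}

    def prepare(self, name):
        # Register under a lock before triggering the action that will
        # eventually deliver the event (here: plugging the VIF).
        with self._lock:
            return self._events.setdefault(name, threading.Event())

    def deliver(self, name):
        # Called when the external notification arrives.
        with self._lock:
            ev = self._events.get(name)
        if ev is not None:
            ev.set()

events = InstanceEvents()
ev = events.prepare("network-vif-plugged-0bb60f77")
events.deliver("network-vif-plugged-0bb60f77")  # may even race ahead of wait()
print(ev.wait(timeout=1))  # True
```

Because the event object exists before the plug starts, delivery that happens before the eventual `wait()` is still observed.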
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.031 2 DEBUG nova.virt.libvirt.vif [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-123-1908290520',display_name='tempest-broadcast-receiver-123-1908290520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-123-1908290520',id=20,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-52vk5sn9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:23:08Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1a3ae685-bd3d-4f36-ad77-9f5b6b95677f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.032 2 DEBUG nova.network.os_vif_util [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.033 2 DEBUG nova.network.os_vif_util [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.033 2 DEBUG os_vif [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.039 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0bb60f77-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.040 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0bb60f77-cd, col_values=(('external_ids', {'iface-id': '0bb60f77-cd96-4dfd-9810-5583ec966cb2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:aa:3b', 'vm-uuid': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:14 np0005476733 NetworkManager[51699]: <info>  [1759936994.0876] manager: (tap0bb60f77-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.100 2 INFO os_vif [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd')#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.177 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.177 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.177 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No VIF found with MAC fa:16:3e:a6:aa:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.178 2 INFO nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Using config drive#033[00m
Oct  8 11:23:14 np0005476733 podman[225346]: 2025-10-08 15:23:14.2005722 +0000 UTC m=+0.068657471 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  8 11:23:14 np0005476733 nova_compute[192580]: 2025-10-08 15:23:14.996 2 INFO nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Creating config drive at /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.config#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.007 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp50ow2vke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.152 2 DEBUG oslo_concurrency.processutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp50ow2vke" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.2402] manager: (tap0bb60f77-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Oct  8 11:23:15 np0005476733 kernel: tap0bb60f77-cd: entered promiscuous mode
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:15Z|00156|binding|INFO|Claiming lport 0bb60f77-cd96-4dfd-9810-5583ec966cb2 for this chassis.
Oct  8 11:23:15 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:15Z|00157|binding|INFO|0bb60f77-cd96-4dfd-9810-5583ec966cb2: Claiming fa:16:3e:a6:aa:3b 10.100.0.8
Oct  8 11:23:15 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:15Z|00158|binding|INFO|Setting lport 0bb60f77-cd96-4dfd-9810-5583ec966cb2 ovn-installed in OVS
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.265 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:aa:3b 10.100.0.8'], port_security=['fa:16:3e:a6:aa:3b 10.100.0.8 192.168.111.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=0bb60f77-cd96-4dfd-9810-5583ec966cb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:15 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:15Z|00159|binding|INFO|Setting lport 0bb60f77-cd96-4dfd-9810-5583ec966cb2 up in Southbound
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.268 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 0bb60f77-cd96-4dfd-9810-5583ec966cb2 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a bound to our chassis#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.271 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a77f8cd-4394-4cb0-a8a1-33872549758a#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.287 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b36dc995-e9e8-4287-84b5-cc5b2dedd71e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.288 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a77f8cd-41 in ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.290 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a77f8cd-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.290 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f44bb48e-c858-4cab-88b6-44a3dff34596]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.294 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e5799146-2a0c-4207-a062-c9d7742f86bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 systemd-machined[152624]: New machine qemu-13-instance-00000014.
Oct  8 11:23:15 np0005476733 systemd[1]: Started Virtual Machine qemu-13-instance-00000014.
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.321 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ab84b4-58b2-4df8-b416-5a6e158e8237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 systemd-udevd[225392]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.3486] device (tap0bb60f77-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.3493] device (tap0bb60f77-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.348 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7cadf5-c72d-4d56-8a1d-966994e23802]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.387 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2563bd-ef92-4e92-b68b-e70b9d6c05cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.392 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fd037b86-d674-4a86-9f4b-b062c45d8142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.3941] manager: (tap7a77f8cd-40): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Oct  8 11:23:15 np0005476733 podman[225384]: 2025-10-08 15:23:15.424272547 +0000 UTC m=+0.102563857 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.440 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1acada80-e839-4241-a8b8-667421c438b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.443 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4e804c5a-42c0-4d9d-8999-60f3c912a69a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.4675] device (tap7a77f8cd-40): carrier: link connected
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.475 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[527794fc-587a-48bd-b1ed-95a0edc5b2ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.496 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7327336f-cdf2-4adf-a41e-0b5327844eb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 16061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225441, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.522 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[80f89fa0-1183-4733-9627-5485f9a4ef48]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:530f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389913, 'tstamp': 389913}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225442, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.548 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[166114d1-1ede-4694-9483-aa5dcc84b4e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 16061, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225443, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.597 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[306e133e-746c-4876-8bcb-5a21a1cf61d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.702 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd3ce63-fcb1-4002-b41b-845866070efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.704 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.704 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.705 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a77f8cd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 kernel: tap7a77f8cd-40: entered promiscuous mode
Oct  8 11:23:15 np0005476733 NetworkManager[51699]: <info>  [1759936995.7079] manager: (tap7a77f8cd-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.714 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a77f8cd-40, col_values=(('external_ids', {'iface-id': 'b563ca05-c871-4f0e-9980-177237a3f88d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:15Z|00160|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 nova_compute[192580]: 2025-10-08 15:23:15.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.736 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a77f8cd-4394-4cb0-a8a1-33872549758a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a77f8cd-4394-4cb0-a8a1-33872549758a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.740 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db63ff6b-157b-4861-bf44-cbda4e30881c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.741 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-7a77f8cd-4394-4cb0-a8a1-33872549758a
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/7a77f8cd-4394-4cb0-a8a1-33872549758a.pid.haproxy
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 7a77f8cd-4394-4cb0-a8a1-33872549758a
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:23:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:15.742 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'env', 'PROCESS_TAG=haproxy-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a77f8cd-4394-4cb0-a8a1-33872549758a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.093 2 DEBUG nova.compute.manager [req-93fae9be-1f0d-4fb2-8dec-e92916273f7c req-edf711a3-4793-47dc-935f-b0d464973282 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.095 2 DEBUG oslo_concurrency.lockutils [req-93fae9be-1f0d-4fb2-8dec-e92916273f7c req-edf711a3-4793-47dc-935f-b0d464973282 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.095 2 DEBUG oslo_concurrency.lockutils [req-93fae9be-1f0d-4fb2-8dec-e92916273f7c req-edf711a3-4793-47dc-935f-b0d464973282 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.095 2 DEBUG oslo_concurrency.lockutils [req-93fae9be-1f0d-4fb2-8dec-e92916273f7c req-edf711a3-4793-47dc-935f-b0d464973282 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.095 2 DEBUG nova.compute.manager [req-93fae9be-1f0d-4fb2-8dec-e92916273f7c req-edf711a3-4793-47dc-935f-b0d464973282 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Processing event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:23:16 np0005476733 podman[225482]: 2025-10-08 15:23:16.182183507 +0000 UTC m=+0.068921620 container create 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:23:16 np0005476733 systemd[1]: Started libpod-conmon-9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366.scope.
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.232 2 DEBUG nova.network.neutron [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updated VIF entry in instance network info cache for port 0bb60f77-cd96-4dfd-9810-5583ec966cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.233 2 DEBUG nova.network.neutron [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:16 np0005476733 podman[225482]: 2025-10-08 15:23:16.14671901 +0000 UTC m=+0.033457143 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:23:16 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:23:16 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/548964f02231e33d3158a6429259dcf0919199dd66efcea55fac07f198cd4b58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.252 2 DEBUG oslo_concurrency.lockutils [req-a08836a3-0013-4ea7-9ac6-eb96756ab2a2 req-c48cc938-6fee-4df8-8c60-043286e88ec8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:16 np0005476733 podman[225482]: 2025-10-08 15:23:16.271648104 +0000 UTC m=+0.158386247 container init 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.276 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.276 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:16 np0005476733 podman[225482]: 2025-10-08 15:23:16.277336576 +0000 UTC m=+0.164074699 container start 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.277 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.277 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.278 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.279 2 INFO nova.compute.manager [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Terminating instance#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.280 2 DEBUG nova.compute.manager [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:23:16 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [NOTICE]   (225501) : New worker (225503) forked
Oct  8 11:23:16 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [NOTICE]   (225501) : Loading success.
Oct  8 11:23:16 np0005476733 kernel: tap5800d2b5-1c (unregistering): left promiscuous mode
Oct  8 11:23:16 np0005476733 NetworkManager[51699]: <info>  [1759936996.3086] device (tap5800d2b5-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00161|binding|INFO|Releasing lport 5800d2b5-1c28-4be9-ba9d-7442de36269e from this chassis (sb_readonly=0)
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00162|binding|INFO|Setting lport 5800d2b5-1c28-4be9-ba9d-7442de36269e down in Southbound
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00163|binding|INFO|Removing iface tap5800d2b5-1c ovn-installed in OVS
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.369 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:df:5b 10.100.0.6'], port_security=['fa:16:3e:2b:df:5b 10.100.0.6 192.168.123.12/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=5800d2b5-1c28-4be9-ba9d-7442de36269e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.407 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 5800d2b5-1c28-4be9-ba9d-7442de36269e in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.410 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca646cb6-3329-453a-a072-04814e4638f0#033[00m
Oct  8 11:23:16 np0005476733 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct  8 11:23:16 np0005476733 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Consumed 47.444s CPU time.
Oct  8 11:23:16 np0005476733 systemd-machined[152624]: Machine qemu-10-instance-0000000f terminated.
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.435 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7a56c09a-e44b-4b27-985d-f5b0d6bd1596]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.469 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e0c2a3-6758-4871-bd84-390dd76d3548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.474 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6367fcaa-bdbd-4e31-8d07-e73ff93b7c1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 kernel: tap5800d2b5-1c: entered promiscuous mode
Oct  8 11:23:16 np0005476733 NetworkManager[51699]: <info>  [1759936996.5124] manager: (tap5800d2b5-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00164|binding|INFO|Claiming lport 5800d2b5-1c28-4be9-ba9d-7442de36269e for this chassis.
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00165|binding|INFO|5800d2b5-1c28-4be9-ba9d-7442de36269e: Claiming fa:16:3e:2b:df:5b 10.100.0.6
Oct  8 11:23:16 np0005476733 kernel: tap5800d2b5-1c (unregistering): left promiscuous mode
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.529 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e02d5d66-6d81-41e7-8673-b8bae7ebb706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.533 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:df:5b 10.100.0.6'], port_security=['fa:16:3e:2b:df:5b 10.100.0.6 192.168.123.12/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=5800d2b5-1c28-4be9-ba9d-7442de36269e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:16Z|00166|binding|INFO|Releasing lport 5800d2b5-1c28-4be9-ba9d-7442de36269e from this chassis (sb_readonly=0)
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.546 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:df:5b 10.100.0.6'], port_security=['fa:16:3e:2b:df:5b 10.100.0.6 192.168.123.12/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9992bf78-8d8e-43c7-a8cc-5606d8c910cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=5800d2b5-1c28-4be9-ba9d-7442de36269e) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.560 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[52f350f8-2637-4008-94b4-5fac1244505b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 18, 'tx_packets': 7, 'rx_bytes': 1252, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 18, 'tx_packets': 7, 'rx_bytes': 1252, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225526, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.570 2 INFO nova.virt.libvirt.driver [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Instance destroyed successfully.#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.571 2 DEBUG nova.objects.instance [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'resources' on Instance uuid 9992bf78-8d8e-43c7-a8cc-5606d8c910cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.575 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f441f42b-ccee-473c-b317-012cc623b1d0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377549, 'tstamp': 377549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225531, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377554, 'tstamp': 377554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225531, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.577 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.586 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca646cb6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.587 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.587 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca646cb6-30, col_values=(('external_ids', {'iface-id': '01b0a658-f0ed-4cb7-aee4-981992c348f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.588 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.590 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 5800d2b5-1c28-4be9-ba9d-7442de36269e in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.594 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca646cb6-3329-453a-a072-04814e4638f0#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.596 2 DEBUG nova.virt.libvirt.vif [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:21:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-35634410',display_name='tempest-multicast-server-vlan-transparent-35634410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-35634410',id=15,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:22:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-r76c1a4l',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:22:11Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=9992bf78-8d8e-43c7-a8cc-5606d8c910cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": 
"tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.596 2 DEBUG nova.network.os_vif_util [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "address": "fa:16:3e:2b:df:5b", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5800d2b5-1c", "ovs_interfaceid": "5800d2b5-1c28-4be9-ba9d-7442de36269e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.597 2 DEBUG nova.network.os_vif_util [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.597 2 DEBUG os_vif [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5800d2b5-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.606 2 INFO os_vif [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:df:5b,bridge_name='br-int',has_traffic_filtering=True,id=5800d2b5-1c28-4be9-ba9d-7442de36269e,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5800d2b5-1c')#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.606 2 INFO nova.virt.libvirt.driver [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Deleting instance files /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf_del#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.607 2 INFO nova.virt.libvirt.driver [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Deletion of /var/lib/nova/instances/9992bf78-8d8e-43c7-a8cc-5606d8c910cf_del complete#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.614 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a081256c-bcef-478e-b7ee-86cb57520afd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.615 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936996.6147692, 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.616 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] VM Started (Lifecycle Event)#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.617 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.629 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.634 2 INFO nova.virt.libvirt.driver [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Instance spawned successfully.#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.634 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.649 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8c2edc18-1f87-4ad5-8168-42f322e5034f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.652 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[991ac2a2-2a93-47ae-b894-909a6646d6fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.663 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.666 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.680 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.680 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.681 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.681 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.681 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.681 2 DEBUG nova.virt.libvirt.driver [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.686 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae552bb-ca4f-4588-a3ce-747785fb5ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.700 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c3da88a6-a5a7-4c57-b2eb-77735dc5c6f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1252, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1252, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225538, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.716 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.716 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936996.6149437, 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.716 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.720 2 INFO nova.compute.manager [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.720 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6b3179-b036-4767-be31-f6329d844c9e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377549, 'tstamp': 377549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225539, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377554, 'tstamp': 377554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225539, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.721 2 DEBUG oslo.service.loopingcall [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.721 2 DEBUG nova.compute.manager [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.721 2 DEBUG nova.network.neutron [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.721 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.724 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca646cb6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.724 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.724 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca646cb6-30, col_values=(('external_ids', {'iface-id': '01b0a658-f0ed-4cb7-aee4-981992c348f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.725 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.725 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 5800d2b5-1c28-4be9-ba9d-7442de36269e in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.728 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca646cb6-3329-453a-a072-04814e4638f0#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.750 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7eccec-e819-4cab-adb4-856b70b7e6bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.752 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.754 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759936996.6229422, 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.755 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.765 2 INFO nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Took 7.77 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.765 2 DEBUG nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.777 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.779 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.788 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[887eb522-2a98-467c-b3ec-72e43b4ca3f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.793 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8f40a4-09e1-463c-b946-1348eda686f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.805 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.830 2 INFO nova.compute.manager [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Took 8.36 seconds to build instance.#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.830 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[71362b69-6021-47c4-9e72-c2130d7be6b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.848 2 DEBUG oslo_concurrency.lockutils [None req-9c700f7d-17a5-4f2e-8984-b885dedb4a48 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.853 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c999fc-6752-47ec-ac18-9b0739fd53bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca646cb6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:47:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 18, 'tx_packets': 11, 'rx_bytes': 1252, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 18, 'tx_packets': 11, 'rx_bytes': 1252, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377535, 'reachable_time': 19488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225545, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.875 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0d89da-6f01-4298-9383-f2bf1a91b9ec]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377549, 'tstamp': 377549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225546, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapca646cb6-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377554, 'tstamp': 377554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225546, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.877 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 nova_compute[192580]: 2025-10-08 15:23:16.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.880 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca646cb6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.881 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.881 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca646cb6-30, col_values=(('external_ids', {'iface-id': '01b0a658-f0ed-4cb7-aee4-981992c348f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:16.882 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:17 np0005476733 nova_compute[192580]: 2025-10-08 15:23:17.009 2 INFO nova.compute.manager [None req-5646140b-ad21-428d-b992-2670ec79b095 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:23:17 np0005476733 nova_compute[192580]: 2025-10-08 15:23:17.016 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.221 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.221 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.221 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] No waiting events found dispatching network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 WARNING nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received unexpected event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-vif-unplugged-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.222 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] No waiting events found dispatching network-vif-unplugged-5800d2b5-1c28-4be9-ba9d-7442de36269e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-vif-unplugged-5800d2b5-1c28-4be9-ba9d-7442de36269e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.223 2 DEBUG oslo_concurrency.lockutils [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.224 2 DEBUG nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] No waiting events found dispatching network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.224 2 WARNING nova.compute.manager [req-2420165d-1e44-42ff-a094-f9ffca74b303 req-85e1e784-1419-4053-b43e-140fb4ef17c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Received unexpected event network-vif-plugged-5800d2b5-1c28-4be9-ba9d-7442de36269e for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.386 2 DEBUG nova.network.neutron [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.411 2 INFO nova.compute.manager [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Took 1.69 seconds to deallocate network for instance.#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.494 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.495 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.683 2 INFO nova.compute.manager [None req-010adb76-96ff-4c78-80ad-e2e53686e4f5 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.687 2 DEBUG nova.compute.provider_tree [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.690 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.709 2 DEBUG nova.scheduler.client.report [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.732 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.777 2 INFO nova.scheduler.client.report [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Deleted allocations for instance 9992bf78-8d8e-43c7-a8cc-5606d8c910cf#033[00m
Oct  8 11:23:18 np0005476733 nova_compute[192580]: 2025-10-08 15:23:18.883 2 DEBUG oslo_concurrency.lockutils [None req-49e4c43b-a573-4345-a087-495faf0dc3b6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "9992bf78-8d8e-43c7-a8cc-5606d8c910cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.843 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.844 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.844 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.845 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.845 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.847 2 INFO nova.compute.manager [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Terminating instance#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.848 2 DEBUG nova.compute.manager [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:23:19 np0005476733 kernel: tap4689d9d8-d6 (unregistering): left promiscuous mode
Oct  8 11:23:19 np0005476733 NetworkManager[51699]: <info>  [1759936999.8906] device (tap4689d9d8-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:23:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:19Z|00167|binding|INFO|Releasing lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 from this chassis (sb_readonly=0)
Oct  8 11:23:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:19Z|00168|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 down in Southbound
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:19Z|00169|binding|INFO|Removing iface tap4689d9d8-d6 ovn-installed in OVS
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:19.915 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:4a:7f 10.100.0.14'], port_security=['fa:16:3e:bc:4a:7f 10.100.0.14 192.168.123.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4689d9d8-d635-4a1c-9495-cff4ea7e6a95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:19.918 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:19 np0005476733 nova_compute[192580]: 2025-10-08 15:23:19.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:19.922 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca646cb6-3329-453a-a072-04814e4638f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:23:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:19.924 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[46514aa4-2790-4669-8486-cd2f2c8d611e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:19.925 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0 namespace which is not needed anymore#033[00m
Oct  8 11:23:19 np0005476733 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  8 11:23:19 np0005476733 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 49.011s CPU time.
Oct  8 11:23:19 np0005476733 systemd-machined[152624]: Machine qemu-8-instance-0000000a terminated.
Oct  8 11:23:20 np0005476733 kernel: tap4689d9d8-d6: entered promiscuous mode
Oct  8 11:23:20 np0005476733 NetworkManager[51699]: <info>  [1759937000.0732] manager: (tap4689d9d8-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00170|binding|INFO|Claiming lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for this chassis.
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00171|binding|INFO|4689d9d8-d635-4a1c-9495-cff4ea7e6a95: Claiming fa:16:3e:bc:4a:7f 10.100.0.14
Oct  8 11:23:20 np0005476733 kernel: tap4689d9d8-d6 (unregistering): left promiscuous mode
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.083 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:4a:7f 10.100.0.14'], port_security=['fa:16:3e:bc:4a:7f 10.100.0.14 192.168.123.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4689d9d8-d635-4a1c-9495-cff4ea7e6a95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00172|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 ovn-installed in OVS
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00173|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 up in Southbound
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00174|binding|INFO|Releasing lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 from this chassis (sb_readonly=1)
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00175|if_status|INFO|Dropped 2 log messages in last 172 seconds (most recently, 172 seconds ago) due to excessive rate
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00176|if_status|INFO|Not setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 down as sb is readonly
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00177|binding|INFO|Removing iface tap4689d9d8-d6 ovn-installed in OVS
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00178|binding|INFO|Releasing lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 from this chassis (sb_readonly=0)
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00179|binding|INFO|Setting lport 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 down in Southbound
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.111 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:4a:7f 10.100.0.14'], port_security=['fa:16:3e:bc:4a:7f 10.100.0.14 192.168.123.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '64549fc7-989f-473a-99bb-78947d8d7536', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca646cb6-3329-453a-a072-04814e4638f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27fe52d14e2143a887b0445eb5cfca72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '47c9f436-4d87-4dd0-ad82-6f84fbc433e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3e342e-563d-45df-8704-409eb95c6087, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4689d9d8-d635-4a1c-9495-cff4ea7e6a95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.122 2 INFO nova.virt.libvirt.driver [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Instance destroyed successfully.#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.123 2 DEBUG nova.objects.instance [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lazy-loading 'resources' on Instance uuid 64549fc7-989f-473a-99bb-78947d8d7536 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:20 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [NOTICE]   (223898) : haproxy version is 2.8.14-c23fe91
Oct  8 11:23:20 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [NOTICE]   (223898) : path to executable is /usr/sbin/haproxy
Oct  8 11:23:20 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [ALERT]    (223898) : Current worker (223900) exited with code 143 (Terminated)
Oct  8 11:23:20 np0005476733 neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0[223882]: [WARNING]  (223898) : All workers exited. Exiting... (0)
Oct  8 11:23:20 np0005476733 systemd[1]: libpod-5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5.scope: Deactivated successfully.
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.140 2 DEBUG nova.virt.libvirt.vif [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-multicast-server-vlan-transparent-1532029749',display_name='tempest-multicast-server-vlan-transparent-1532029749',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-multicast-server-vlan-transparent-1532029749',id=10,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4syPllrLf9M6NW3P0Mtw3AQOO4FK7TnvvKqGmsnzh9ZdBFhzF23mGGofa6PIbzV2jpECHJPUWbJNsOHP+hhSHtvJ/A+QvrET4E695rK5KUU6a+Wgg98oHszoQwuH9J+g==',key_name='tempest-keypair-test-307751635',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:21:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='27fe52d14e2143a887b0445eb5cfca72',ramdisk_id='',reservation_id='r-7ly4s0go',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestVlanTransparency-435229999',owner_user_name='tempest-MulticastTestVlanTransparency-435229999-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:21:12Z,user_data=None,user_id='8f9ed00bd5cc488a9d2a77380f12a503',uuid=64549fc7-989f-473a-99bb-78947d8d7536,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": 
"tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.140 2 DEBUG nova.network.os_vif_util [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converting VIF {"id": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "address": "fa:16:3e:bc:4a:7f", "network": {"id": "ca646cb6-3329-453a-a072-04814e4638f0", "bridge": "br-int", "label": "tempest-test-network--1140426360", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27fe52d14e2143a887b0445eb5cfca72", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4689d9d8-d6", "ovs_interfaceid": "4689d9d8-d635-4a1c-9495-cff4ea7e6a95", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.142 2 DEBUG nova.network.os_vif_util [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.143 2 DEBUG os_vif [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.146 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4689d9d8-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 podman[225569]: 2025-10-08 15:23:20.152193759 +0000 UTC m=+0.097585049 container died 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.154 2 INFO os_vif [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:4a:7f,bridge_name='br-int',has_traffic_filtering=True,id=4689d9d8-d635-4a1c-9495-cff4ea7e6a95,network=Network(ca646cb6-3329-453a-a072-04814e4638f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4689d9d8-d6')#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.155 2 INFO nova.virt.libvirt.driver [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Deleting instance files /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536_del#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.156 2 INFO nova.virt.libvirt.driver [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Deletion of /var/lib/nova/instances/64549fc7-989f-473a-99bb-78947d8d7536_del complete#033[00m
Oct  8 11:23:20 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5-userdata-shm.mount: Deactivated successfully.
Oct  8 11:23:20 np0005476733 systemd[1]: var-lib-containers-storage-overlay-bbfdb0f8ecabbb4f562bf8bb53bfb2869b9a74e26f59e5d32823c632bceae699-merged.mount: Deactivated successfully.
Oct  8 11:23:20 np0005476733 podman[225569]: 2025-10-08 15:23:20.210215269 +0000 UTC m=+0.155606559 container cleanup 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.218 2 INFO nova.compute.manager [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.218 2 DEBUG oslo.service.loopingcall [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.219 2 DEBUG nova.compute.manager [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.219 2 DEBUG nova.network.neutron [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:23:20 np0005476733 systemd[1]: libpod-conmon-5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5.scope: Deactivated successfully.
Oct  8 11:23:20 np0005476733 podman[225611]: 2025-10-08 15:23:20.279564521 +0000 UTC m=+0.044190787 container remove 5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.286 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[07ca979e-3c74-4f95-ae88-98f954e2aa88]: (4, ('Wed Oct  8 03:23:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0 (5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5)\n5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5\nWed Oct  8 03:23:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0 (5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5)\n5fcc1d27a6420a128adef0bdb959f81b4a284b3f2cb39546fb2b3378d02410c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.287 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[07654d65-7234-474b-8b51-bca38cd6ecba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.288 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca646cb6-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 kernel: tapca646cb6-30: left promiscuous mode
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.297 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d73690d0-fb69-4bc4-aed2-ef408b21a99a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.335 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[81013216-46fa-4430-8e7e-f752ece452ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.337 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55fa9f8b-ebb8-4d9c-8897-b9e4b04fb241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.353 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4abe5442-0c12-4d02-b5a9-2073c3b3f2c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377525, 'reachable_time': 28283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225626, 'error': None, 'target': 'ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 systemd[1]: run-netns-ovnmeta\x2dca646cb6\x2d3329\x2d453a\x2da072\x2d04814e4638f0.mount: Deactivated successfully.
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.363 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca646cb6-3329-453a-a072-04814e4638f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.363 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[96cdaba1-3481-4b40-a6c9-51805ab2b9e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.365 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.369 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca646cb6-3329-453a-a072-04814e4638f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.370 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[982b8be5-0b21-478f-91d8-135c38737c8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.371 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4689d9d8-d635-4a1c-9495-cff4ea7e6a95 in datapath ca646cb6-3329-453a-a072-04814e4638f0 unbound from our chassis#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.373 2 DEBUG nova.compute.manager [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-unplugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.373 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca646cb6-3329-453a-a072-04814e4638f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.374 2 DEBUG oslo_concurrency.lockutils [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.374 2 DEBUG oslo_concurrency.lockutils [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.374 2 DEBUG oslo_concurrency.lockutils [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.374 2 DEBUG nova.compute.manager [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-unplugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:20 np0005476733 nova_compute[192580]: 2025-10-08 15:23:20.375 2 DEBUG nova.compute.manager [req-11710268-9e77-43c2-9fae-537828cfc3d0 req-6278f4d1-21f0-4af6-a831-0b0c8be37871 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-unplugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:23:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:20.376 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[63bd2e63-4a3b-4eb3-b19e-5ea6deec08de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00180|pinctrl|WARN|Dropped 3659 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 11:23:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:20Z|00181|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.000 2 DEBUG nova.network.neutron [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.028 2 INFO nova.compute.manager [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Took 1.81 seconds to deallocate network for instance.#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.074 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.075 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.195 2 INFO nova.compute.manager [None req-47d476cd-015d-4945-933b-e6b62b49e17c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Get console output#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.202 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.205 2 INFO nova.virt.libvirt.driver [None req-47d476cd-015d-4945-933b-e6b62b49e17c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Truncated console log returned, 3304 bytes ignored#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.231 2 DEBUG nova.compute.provider_tree [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.254 2 DEBUG nova.scheduler.client.report [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.284 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.328 2 INFO nova.scheduler.client.report [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Deleted allocations for instance 64549fc7-989f-473a-99bb-78947d8d7536#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.411 2 DEBUG oslo_concurrency.lockutils [None req-feed1737-4342-4dae-8632-159b3af14bc6 8f9ed00bd5cc488a9d2a77380f12a503 27fe52d14e2143a887b0445eb5cfca72 - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.495 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.496 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.496 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.497 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.497 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.498 2 WARNING nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received unexpected event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.498 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.499 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.500 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.500 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.501 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.501 2 WARNING nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received unexpected event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.502 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.502 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.503 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.503 2 DEBUG oslo_concurrency.lockutils [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.504 2 DEBUG nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.504 2 WARNING nova.compute.manager [req-3b510dbd-b8df-4b91-8636-4d53fee2600c req-406105f6-dc08-4bee-aea4-1f3263d46d03 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received unexpected event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.506 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.506 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.507 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.507 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.507 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.508 2 INFO nova.compute.manager [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Terminating instance#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.509 2 DEBUG nova.compute.manager [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:23:22 np0005476733 kernel: tap046dc8a5-fa (unregistering): left promiscuous mode
Oct  8 11:23:22 np0005476733 NetworkManager[51699]: <info>  [1759937002.6504] device (tap046dc8a5-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:22Z|00182|binding|INFO|Releasing lport 046dc8a5-fad3-4f9f-bd10-3894704fe7ed from this chassis (sb_readonly=0)
Oct  8 11:23:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:22Z|00183|binding|INFO|Setting lport 046dc8a5-fad3-4f9f-bd10-3894704fe7ed down in Southbound
Oct  8 11:23:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:22Z|00184|binding|INFO|Removing iface tap046dc8a5-fa ovn-installed in OVS
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.671 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:3d:28 10.100.0.10'], port_security=['fa:16:3e:0f:3d:28 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '341c177f-c391-41dd-bf3c-14c2076057eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0d027d9bf53149dd9246b01ebf09eb48', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0738c3ab-dc94-44ab-bcd4-94a57812b815', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31be164c-7a5c-418b-b5ca-ef6f173770bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=046dc8a5-fad3-4f9f-bd10-3894704fe7ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.672 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 046dc8a5-fad3-4f9f-bd10-3894704fe7ed in datapath 7e5261bf-648d-4475-96cb-fe9ba80fd1d8 unbound from our chassis#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.675 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e5261bf-648d-4475-96cb-fe9ba80fd1d8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.679 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ab29cbad-dc82-4a4d-a947-0baffb789db7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.680 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8 namespace which is not needed anymore#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct  8 11:23:22 np0005476733 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Consumed 47.599s CPU time.
Oct  8 11:23:22 np0005476733 systemd-machined[152624]: Machine qemu-9-instance-0000000b terminated.
Oct  8 11:23:22 np0005476733 podman[225627]: 2025-10-08 15:23:22.751633587 +0000 UTC m=+0.057395151 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:23:22 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [NOTICE]   (224018) : haproxy version is 2.8.14-c23fe91
Oct  8 11:23:22 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [NOTICE]   (224018) : path to executable is /usr/sbin/haproxy
Oct  8 11:23:22 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [WARNING]  (224018) : Exiting Master process...
Oct  8 11:23:22 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [ALERT]    (224018) : Current worker (224020) exited with code 143 (Terminated)
Oct  8 11:23:22 np0005476733 neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8[224014]: [WARNING]  (224018) : All workers exited. Exiting... (0)
Oct  8 11:23:22 np0005476733 systemd[1]: libpod-b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c.scope: Deactivated successfully.
Oct  8 11:23:22 np0005476733 podman[225668]: 2025-10-08 15:23:22.816970731 +0000 UTC m=+0.044302141 container died b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:23:22 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c-userdata-shm.mount: Deactivated successfully.
Oct  8 11:23:22 np0005476733 podman[225668]: 2025-10-08 15:23:22.848654886 +0000 UTC m=+0.075986276 container cleanup b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:23:22 np0005476733 systemd[1]: var-lib-containers-storage-overlay-2d9ebc36ad5ff3c965b41e44c1fd6dd4d222bebaa51b09058129247db2c952b9-merged.mount: Deactivated successfully.
Oct  8 11:23:22 np0005476733 systemd[1]: libpod-conmon-b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c.scope: Deactivated successfully.
Oct  8 11:23:22 np0005476733 podman[225695]: 2025-10-08 15:23:22.925003193 +0000 UTC m=+0.044431795 container remove b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.933 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[220145d0-9106-4c20-9136-72c26122cb77]: (4, ('Wed Oct  8 03:23:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8 (b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c)\nb1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c\nWed Oct  8 03:23:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8 (b1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c)\nb1b3470a3b851cf317b9e34ffc076a0d16b2e365bb724bcc324fc2cf92e6b60c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.942 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e40a3508-71a0-4125-84a2-e2cc087bf054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.948 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e5261bf-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 kernel: tap7e5261bf-60: left promiscuous mode
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.971 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7a4a90-f18d-4490-aa9e-e43ce7603e12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.984 2 INFO nova.virt.libvirt.driver [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Instance destroyed successfully.#033[00m
Oct  8 11:23:22 np0005476733 nova_compute[192580]: 2025-10-08 15:23:22.986 2 DEBUG nova.objects.instance [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lazy-loading 'resources' on Instance uuid 341c177f-c391-41dd-bf3c-14c2076057eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.995 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[53ecc986-0ade-4f57-a911-1f85c974b7b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:22.997 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[def2aba6-2ac5-4a6f-bc8c-a7bd7d52502e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.005 2 DEBUG nova.virt.libvirt.vif [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-broadcast-receiver-1467126576',display_name='tempest-broadcast-receiver-1467126576',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-1467126576',id=11,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHIGFmYxNtsO/KAD5DCJyEruKYGY8Jg3/mdP8DUSaU8q1j7RXeluCkcQClNdJmlHgOPM4zotnGNIaBo+klUL18feTKjHoE9KXkR0MO/pt0x3rWsYQBHe3V4p4r+dlzz5fw==',key_name='tempest-keypair-test-627190324',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:21:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0d027d9bf53149dd9246b01ebf09eb48',ramdisk_id='',reservation_id='r-82psguot',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-BroadcastTestIPv4Common-1303208658',owner_user_name='tempest-BroadcastTestIPv4Common-1303208658-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:21:20Z,user_data=None,user_id='625a85fb4a424c84b99b84adcf899810',uuid=341c177f-c391-41dd-bf3c-14c2076057eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.006 2 DEBUG nova.network.os_vif_util [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converting VIF {"id": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "address": "fa:16:3e:0f:3d:28", "network": {"id": "7e5261bf-648d-4475-96cb-fe9ba80fd1d8", "bridge": "br-int", "label": "tempest-test-network--1002477072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d027d9bf53149dd9246b01ebf09eb48", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap046dc8a5-fa", "ovs_interfaceid": "046dc8a5-fad3-4f9f-bd10-3894704fe7ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.007 2 DEBUG nova.network.os_vif_util [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.007 2 DEBUG os_vif [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.008 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap046dc8a5-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.015 2 INFO os_vif [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:3d:28,bridge_name='br-int',has_traffic_filtering=True,id=046dc8a5-fad3-4f9f-bd10-3894704fe7ed,network=Network(7e5261bf-648d-4475-96cb-fe9ba80fd1d8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap046dc8a5-fa')#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.015 2 INFO nova.virt.libvirt.driver [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Deleting instance files /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb_del#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.020 2 INFO nova.virt.libvirt.driver [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Deletion of /var/lib/nova/instances/341c177f-c391-41dd-bf3c-14c2076057eb_del complete#033[00m
Oct  8 11:23:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:23.020 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed211da-afb1-453f-bd93-a9dbc0587ddf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377697, 'reachable_time': 32276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225729, 'error': None, 'target': 'ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:23.023 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e5261bf-648d-4475-96cb-fe9ba80fd1d8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:23:23 np0005476733 systemd[1]: run-netns-ovnmeta\x2d7e5261bf\x2d648d\x2d4475\x2d96cb\x2dfe9ba80fd1d8.mount: Deactivated successfully.
Oct  8 11:23:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:23.023 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5bafb0-1486-4c9f-b757-8ba84a5adb0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.142 2 INFO nova.compute.manager [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Took 0.63 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.143 2 DEBUG oslo.service.loopingcall [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.144 2 DEBUG nova.compute.manager [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.145 2 DEBUG nova.network.neutron [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:23:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:23.372 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:23.374 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.640 2 DEBUG nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-unplugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.641 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.642 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.642 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.642 2 DEBUG nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] No waiting events found dispatching network-vif-unplugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.643 2 DEBUG nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-unplugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.643 2 DEBUG nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.644 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.644 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.644 2 DEBUG oslo_concurrency.lockutils [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.644 2 DEBUG nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] No waiting events found dispatching network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.645 2 WARNING nova.compute.manager [req-605d5611-9319-4ff9-a056-d220b861cb5a req-7165cde0-deab-43a0-b487-81a3028901fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received unexpected event network-vif-plugged-046dc8a5-fad3-4f9f-bd10-3894704fe7ed for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.856 2 INFO nova.compute.manager [None req-99775618-82d3-4f8f-8fc4-56aa645a8574 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:23 np0005476733 nova_compute[192580]: 2025-10-08 15:23:23.862 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.662 2 DEBUG nova.compute.manager [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.662 2 DEBUG oslo_concurrency.lockutils [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "64549fc7-989f-473a-99bb-78947d8d7536-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.663 2 DEBUG oslo_concurrency.lockutils [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.664 2 DEBUG oslo_concurrency.lockutils [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "64549fc7-989f-473a-99bb-78947d8d7536-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.664 2 DEBUG nova.compute.manager [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] No waiting events found dispatching network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:24 np0005476733 nova_compute[192580]: 2025-10-08 15:23:24.664 2 WARNING nova.compute.manager [req-fe202c29-da07-4599-a79a-c10fa168673c req-48e75ab3-9c55-40d8-b809-dab2b3ecbac6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Received unexpected event network-vif-plugged-4689d9d8-d635-4a1c-9495-cff4ea7e6a95 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:25.377 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:25 np0005476733 nova_compute[192580]: 2025-10-08 15:23:25.534 2 DEBUG nova.network.neutron [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:25 np0005476733 nova_compute[192580]: 2025-10-08 15:23:25.637 2 INFO nova.compute.manager [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Took 2.49 seconds to deallocate network for instance.#033[00m
Oct  8 11:23:25 np0005476733 nova_compute[192580]: 2025-10-08 15:23:25.733 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:25 np0005476733 nova_compute[192580]: 2025-10-08 15:23:25.733 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:26 np0005476733 nova_compute[192580]: 2025-10-08 15:23:26.011 2 DEBUG nova.compute.provider_tree [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:26 np0005476733 nova_compute[192580]: 2025-10-08 15:23:26.064 2 DEBUG nova.scheduler.client.report [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:26 np0005476733 podman[225746]: 2025-10-08 15:23:26.246065368 +0000 UTC m=+0.070538432 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:26.305 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:26.306 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:26.307 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:26 np0005476733 nova_compute[192580]: 2025-10-08 15:23:26.350 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:26 np0005476733 nova_compute[192580]: 2025-10-08 15:23:26.491 2 INFO nova.scheduler.client.report [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Deleted allocations for instance 341c177f-c391-41dd-bf3c-14c2076057eb#033[00m
Oct  8 11:23:26 np0005476733 nova_compute[192580]: 2025-10-08 15:23:26.725 2 DEBUG oslo_concurrency.lockutils [None req-c1eba31a-3ca4-4ca9-a06d-a050e299016a 625a85fb4a424c84b99b84adcf899810 0d027d9bf53149dd9246b01ebf09eb48 - - default default] Lock "341c177f-c391-41dd-bf3c-14c2076057eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:27 np0005476733 podman[225768]: 2025-10-08 15:23:27.258331618 +0000 UTC m=+0.082708201 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.650 2 DEBUG nova.compute.manager [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Received event network-vif-deleted-046dc8a5-fad3-4f9f-bd10-3894704fe7ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.650 2 DEBUG nova.compute.manager [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-changed-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.651 2 DEBUG nova.compute.manager [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Refreshing instance network info cache due to event network-changed-59f58b79-9163-41ba-8e03-7430e5def4ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.652 2 DEBUG oslo_concurrency.lockutils [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.652 2 DEBUG oslo_concurrency.lockutils [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:27 np0005476733 nova_compute[192580]: 2025-10-08 15:23:27.653 2 DEBUG nova.network.neutron [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Refreshing network info cache for port 59f58b79-9163-41ba-8e03-7430e5def4ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.300 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.301 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.302 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.302 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.303 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.305 2 INFO nova.compute.manager [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Terminating instance#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.307 2 DEBUG nova.compute.manager [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:23:28 np0005476733 kernel: tap3e1bce81-bd (unregistering): left promiscuous mode
Oct  8 11:23:28 np0005476733 NetworkManager[51699]: <info>  [1759937008.3482] device (tap3e1bce81-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:28Z|00185|binding|INFO|Releasing lport 3e1bce81-bd3f-433a-aad4-1b90ad016699 from this chassis (sb_readonly=0)
Oct  8 11:23:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:28Z|00186|binding|INFO|Setting lport 3e1bce81-bd3f-433a-aad4-1b90ad016699 down in Southbound
Oct  8 11:23:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:28Z|00187|binding|INFO|Removing iface tap3e1bce81-bd ovn-installed in OVS
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.376 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:a3:46 10.100.0.3'], port_security=['fa:16:3e:ce:a3:46 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '08e4113f-f3be-424f-926e-62e20b3ad767', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3e1bce81-bd3f-433a-aad4-1b90ad016699) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.378 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3e1bce81-bd3f-433a-aad4-1b90ad016699 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 unbound from our chassis#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.382 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.410 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aa983de6-8f17-46ad-837b-5fc8939dff02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Deactivated successfully.
Oct  8 11:23:28 np0005476733 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000012.scope: Consumed 44.011s CPU time.
Oct  8 11:23:28 np0005476733 systemd-machined[152624]: Machine qemu-11-instance-00000012 terminated.
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.449 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0ec9ac-27f2-4a15-9c31-ea37fa8df230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.452 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[15e803f2-d7cd-40fe-8aef-54cf1b0cab5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.489 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ea5be6-85ab-46cb-898c-cb72e67611a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.508 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[90e6dbe1-c647-40fc-be79-43985512dc79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1143, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1143, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374371, 'reachable_time': 16032, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225803, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.522 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85e8928f-8034-4a1e-afda-c2e3bf6fda3d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3ec2e14e-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374383, 'tstamp': 374383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225804, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3ec2e14e-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 374386, 'tstamp': 374386}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225804, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.525 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.540 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec2e14e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.541 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.542 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ec2e14e-50, col_values=(('external_ids', {'iface-id': '1e0c4d29-d963-4fdf-8ca6-0153967de16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:28.543 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.567 2 INFO nova.virt.libvirt.driver [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Instance destroyed successfully.#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.569 2 DEBUG nova.objects.instance [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'resources' on Instance uuid 08e4113f-f3be-424f-926e-62e20b3ad767 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.615 2 DEBUG nova.virt.libvirt.vif [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:22:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-185393400',display_name='tempest-test_flooding_when_special_groups-185393400',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-185393400',id=18,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:22:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-xna3x6ut',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:22:27Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=08e4113f-f3be-424f-926e-62e20b3ad767,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.615 2 DEBUG nova.network.os_vif_util [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "address": "fa:16:3e:ce:a3:46", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e1bce81-bd", "ovs_interfaceid": "3e1bce81-bd3f-433a-aad4-1b90ad016699", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.616 2 DEBUG nova.network.os_vif_util [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.616 2 DEBUG os_vif [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.618 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e1bce81-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.659 2 INFO os_vif [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:a3:46,bridge_name='br-int',has_traffic_filtering=True,id=3e1bce81-bd3f-433a-aad4-1b90ad016699,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e1bce81-bd')#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.659 2 INFO nova.virt.libvirt.driver [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Deleting instance files /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767_del#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.660 2 INFO nova.virt.libvirt.driver [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Deletion of /var/lib/nova/instances/08e4113f-f3be-424f-926e-62e20b3ad767_del complete#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.785 2 INFO nova.compute.manager [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.785 2 DEBUG oslo.service.loopingcall [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.786 2 DEBUG nova.compute.manager [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:23:28 np0005476733 nova_compute[192580]: 2025-10-08 15:23:28.786 2 DEBUG nova.network.neutron [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.130 2 INFO nova.compute.manager [None req-6b96b108-37e6-4f6d-9b78-d4eadb4c415c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.252 2 DEBUG nova.compute.manager [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-unplugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.252 2 DEBUG oslo_concurrency.lockutils [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.253 2 DEBUG oslo_concurrency.lockutils [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.253 2 DEBUG oslo_concurrency.lockutils [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.254 2 DEBUG nova.compute.manager [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] No waiting events found dispatching network-vif-unplugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.254 2 DEBUG nova.compute.manager [req-3670a4a1-cda2-470b-b494-80dda6813762 req-25b62a85-1749-4771-bd49-f0f05d2c864a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-unplugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.257 2 DEBUG nova.network.neutron [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updated VIF entry in instance network info cache for port 59f58b79-9163-41ba-8e03-7430e5def4ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.258 2 DEBUG nova.network.neutron [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updating instance_info_cache with network_info: [{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.284 2 DEBUG oslo_concurrency.lockutils [req-f8617589-ef64-4685-be8d-0d2b883f930f req-3f3ca452-8358-4e89-ae36-1b0796bf63b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.526 2 DEBUG nova.network.neutron [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.557 2 INFO nova.compute.manager [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Took 1.77 seconds to deallocate network for instance.#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.600 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.600 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.619 2 DEBUG nova.compute.manager [req-bdda8b1b-4700-4d28-b617-021c3ff3cb12 req-64f294fe-9581-4daf-8019-1b677527c149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-deleted-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.792 2 DEBUG nova.compute.provider_tree [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.808 2 DEBUG nova.scheduler.client.report [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.861 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:30 np0005476733 nova_compute[192580]: 2025-10-08 15:23:30.903 2 INFO nova.scheduler.client.report [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Deleted allocations for instance 08e4113f-f3be-424f-926e-62e20b3ad767#033[00m
Oct  8 11:23:31 np0005476733 nova_compute[192580]: 2025-10-08 15:23:31.014 2 DEBUG oslo_concurrency.lockutils [None req-b4797bab-4c6e-4b4c-9f5d-ec4d6f198aee f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:31 np0005476733 nova_compute[192580]: 2025-10-08 15:23:31.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:31 np0005476733 nova_compute[192580]: 2025-10-08 15:23:31.569 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759936996.5683665, 9992bf78-8d8e-43c7-a8cc-5606d8c910cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:31 np0005476733 nova_compute[192580]: 2025-10-08 15:23:31.570 2 INFO nova.compute.manager [-] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:23:31 np0005476733 nova_compute[192580]: 2025-10-08 15:23:31.595 2 DEBUG nova.compute.manager [None req-60b9f07f-494c-454b-9147-5f590df30b8d - - - - - -] [instance: 9992bf78-8d8e-43c7-a8cc-5606d8c910cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:32 np0005476733 podman[225838]: 2025-10-08 15:23:32.237427791 +0000 UTC m=+0.059890951 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.441 2 DEBUG nova.compute.manager [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.442 2 DEBUG oslo_concurrency.lockutils [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.442 2 DEBUG oslo_concurrency.lockutils [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.443 2 DEBUG oslo_concurrency.lockutils [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "08e4113f-f3be-424f-926e-62e20b3ad767-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.443 2 DEBUG nova.compute.manager [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] No waiting events found dispatching network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:32 np0005476733 nova_compute[192580]: 2025-10-08 15:23:32.444 2 WARNING nova.compute.manager [req-cee65cd0-f012-4550-a01d-456cddb39dba req-cefa86f4-2095-4e08-91d5-df1251288647 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Received unexpected event network-vif-plugged-3e1bce81-bd3f-433a-aad4-1b90ad016699 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:33 np0005476733 nova_compute[192580]: 2025-10-08 15:23:33.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:33 np0005476733 nova_compute[192580]: 2025-10-08 15:23:33.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.614 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.707 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.802 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.803 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.880 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.887 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.964 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:34 np0005476733 nova_compute[192580]: 2025-10-08 15:23:34.966 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.035 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.043 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.104 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.104 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.105 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.105 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.105 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.107 2 INFO nova.compute.manager [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Terminating instance#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.108 2 DEBUG nova.compute.manager [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.109 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.109 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.125 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937000.1203954, 64549fc7-989f-473a-99bb-78947d8d7536 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.126 2 INFO nova.compute.manager [-] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.150 2 DEBUG nova.compute.manager [None req-fb5e057e-9895-434b-9064-1656b37b898c - - - - - -] [instance: 64549fc7-989f-473a-99bb-78947d8d7536] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.166 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:35 np0005476733 kernel: tapf66c148b-4c (unregistering): left promiscuous mode
Oct  8 11:23:35 np0005476733 NetworkManager[51699]: <info>  [1759937015.1881] device (tapf66c148b-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:23:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:35Z|00188|binding|INFO|Releasing lport f66c148b-4cbb-4cdd-8196-6513d7c5ff78 from this chassis (sb_readonly=0)
Oct  8 11:23:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:35Z|00189|binding|INFO|Setting lport f66c148b-4cbb-4cdd-8196-6513d7c5ff78 down in Southbound
Oct  8 11:23:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:35Z|00190|binding|INFO|Removing iface tapf66c148b-4c ovn-installed in OVS
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.232 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:9d:93 10.100.0.9'], port_security=['fa:16:3e:77:9d:93 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a71ee5d2-21b8-4455-8870-f20bed682909', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f66c148b-4cbb-4cdd-8196-6513d7c5ff78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.233 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f66c148b-4cbb-4cdd-8196-6513d7c5ff78 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 unbound from our chassis#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.235 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.236 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86dca841-ff1e-4cc1-ae96-a5b255dc620a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.236 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 namespace which is not needed anymore#033[00m
Oct  8 11:23:35 np0005476733 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct  8 11:23:35 np0005476733 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 45.753s CPU time.
Oct  8 11:23:35 np0005476733 systemd-machined[152624]: Machine qemu-6-instance-00000009 terminated.
Oct  8 11:23:35 np0005476733 podman[225900]: 2025-10-08 15:23:35.356868133 +0000 UTC m=+0.082241546 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:23:35 np0005476733 podman[225906]: 2025-10-08 15:23:35.382183444 +0000 UTC m=+0.099688685 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  8 11:23:35 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [NOTICE]   (223234) : haproxy version is 2.8.14-c23fe91
Oct  8 11:23:35 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [NOTICE]   (223234) : path to executable is /usr/sbin/haproxy
Oct  8 11:23:35 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [WARNING]  (223234) : Exiting Master process...
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.389 2 INFO nova.virt.libvirt.driver [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Instance destroyed successfully.#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.390 2 DEBUG nova.objects.instance [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'resources' on Instance uuid a71ee5d2-21b8-4455-8870-f20bed682909 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:35 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [ALERT]    (223234) : Current worker (223236) exited with code 143 (Terminated)
Oct  8 11:23:35 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[223230]: [WARNING]  (223234) : All workers exited. Exiting... (0)
Oct  8 11:23:35 np0005476733 systemd[1]: libpod-78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1.scope: Deactivated successfully.
Oct  8 11:23:35 np0005476733 podman[225946]: 2025-10-08 15:23:35.401909726 +0000 UTC m=+0.061203472 container died 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.417 2 DEBUG nova.virt.libvirt.vif [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_flooding_when_special_groups-542526277',display_name='tempest-test_flooding_when_special_groups-542526277',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-flooding-when-special-groups-542526277',id=9,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:20:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-z4n1a4rs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:20:41Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=a71ee5d2-21b8-4455-8870-f20bed682909,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.418 2 DEBUG nova.network.os_vif_util [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "address": "fa:16:3e:77:9d:93", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66c148b-4c", "ovs_interfaceid": "f66c148b-4cbb-4cdd-8196-6513d7c5ff78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.419 2 DEBUG nova.network.os_vif_util [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.419 2 DEBUG os_vif [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf66c148b-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.426 2 INFO os_vif [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:9d:93,bridge_name='br-int',has_traffic_filtering=True,id=f66c148b-4cbb-4cdd-8196-6513d7c5ff78,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66c148b-4c')#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.427 2 INFO nova.virt.libvirt.driver [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Deleting instance files /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909_del#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.427 2 INFO nova.virt.libvirt.driver [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Deletion of /var/lib/nova/instances/a71ee5d2-21b8-4455-8870-f20bed682909_del complete#033[00m
Oct  8 11:23:35 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1-userdata-shm.mount: Deactivated successfully.
Oct  8 11:23:35 np0005476733 systemd[1]: var-lib-containers-storage-overlay-146f4e32264e73a4b7fb2fff8a107e6e31ae9f295f81776878a0e8b0e8e2788f-merged.mount: Deactivated successfully.
Oct  8 11:23:35 np0005476733 podman[225946]: 2025-10-08 15:23:35.462452747 +0000 UTC m=+0.121746493 container cleanup 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:35 np0005476733 systemd[1]: libpod-conmon-78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1.scope: Deactivated successfully.
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.512 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.513 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11805MB free_disk=111.07664108276367GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.513 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.514 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:35 np0005476733 podman[225997]: 2025-10-08 15:23:35.594651743 +0000 UTC m=+0.110870264 container remove 78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.599 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdc0c3f-4d79-44e2-86d1-92dd85581fc3]: (4, ('Wed Oct  8 03:23:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 (78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1)\n78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1\nWed Oct  8 03:23:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 (78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1)\n78417254839c7c4543a785d835162a728cec3c5cb3077809945012fe85f2dfa1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.601 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c87414e2-8b9e-41ce-b3d7-2ef312c2e92b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.602 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 kernel: tap3ec2e14e-50: left promiscuous mode
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.616 2 INFO nova.compute.manager [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.617 2 DEBUG oslo.service.loopingcall [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.617 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fff48d58-f90f-4453-837b-d58438564680]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.618 2 DEBUG nova.compute.manager [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.618 2 DEBUG nova.network.neutron [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.642 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[007c06ff-98d0-45d9-91a1-58083205e886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.643 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2348c232-cd4d-4a9b-a161-3988b4cf8dc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.652 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a71ee5d2-21b8-4455-8870-f20bed682909 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.653 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 656c0a96-03f3-4a70-baac-01de2a126a91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.653 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.653 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.653 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=3584MB phys_disk=119GB used_disk=30GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.659 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2c4c53-f358-40f1-8ff8-d6770e6e0fa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 374364, 'reachable_time': 29663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226012, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.661 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:35.662 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[0daaafa2-8ad1-43dd-b8d0-f6802aa2bdfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:35 np0005476733 systemd[1]: run-netns-ovnmeta\x2d3ec2e14e\x2d57e7\x2d4e0a\x2dbf0c\x2d0beb3cfd5567.mount: Deactivated successfully.
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.735 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.752 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.775 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:23:35 np0005476733 nova_compute[192580]: 2025-10-08 15:23:35.776 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:36 np0005476733 nova_compute[192580]: 2025-10-08 15:23:36.046 2 INFO nova.compute.manager [None req-4b097835-26ae-4a5d-a9e1-9fc401a2ee01 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:36 np0005476733 nova_compute[192580]: 2025-10-08 15:23:36.051 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.518 2 DEBUG nova.compute.manager [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-unplugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.519 2 DEBUG oslo_concurrency.lockutils [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.519 2 DEBUG oslo_concurrency.lockutils [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.520 2 DEBUG oslo_concurrency.lockutils [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.521 2 DEBUG nova.compute.manager [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] No waiting events found dispatching network-vif-unplugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.521 2 DEBUG nova.compute.manager [req-63974327-6263-4f43-80b8-97414059a79b req-14b8fa8a-54e8-4a12-ac79-e17573eb696d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-unplugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.983 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937002.9817357, 341c177f-c391-41dd-bf3c-14c2076057eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:37 np0005476733 nova_compute[192580]: 2025-10-08 15:23:37.983 2 INFO nova.compute.manager [-] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:23:38 np0005476733 nova_compute[192580]: 2025-10-08 15:23:38.009 2 DEBUG nova.compute.manager [None req-b8c83581-f397-4de0-8cb4-c17761b4b783 - - - - - -] [instance: 341c177f-c391-41dd-bf3c-14c2076057eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:38 np0005476733 nova_compute[192580]: 2025-10-08 15:23:38.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:38 np0005476733 nova_compute[192580]: 2025-10-08 15:23:38.776 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:38 np0005476733 nova_compute[192580]: 2025-10-08 15:23:38.776 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:38 np0005476733 nova_compute[192580]: 2025-10-08 15:23:38.776 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.178 2 DEBUG nova.network.neutron [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.236 2 INFO nova.compute.manager [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Took 3.62 seconds to deallocate network for instance.#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.299 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.300 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.392 2 DEBUG nova.compute.provider_tree [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.485 2 DEBUG nova.scheduler.client.report [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.519 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.556 2 INFO nova.scheduler.client.report [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Deleted allocations for instance a71ee5d2-21b8-4455-8870-f20bed682909#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.631 2 DEBUG oslo_concurrency.lockutils [None req-46ed59fe-1196-4631-82c3-6adbaf446f27 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.651 2 DEBUG nova.compute.manager [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.652 2 DEBUG oslo_concurrency.lockutils [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.653 2 DEBUG oslo_concurrency.lockutils [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.653 2 DEBUG oslo_concurrency.lockutils [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a71ee5d2-21b8-4455-8870-f20bed682909-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.654 2 DEBUG nova.compute.manager [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] No waiting events found dispatching network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.654 2 WARNING nova.compute.manager [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received unexpected event network-vif-plugged-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:23:39 np0005476733 nova_compute[192580]: 2025-10-08 15:23:39.655 2 DEBUG nova.compute.manager [req-8967094c-a7ea-4792-bffd-5bcdbb592083 req-177a67fa-8c27-4b73-9a28-24340bfd29cd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Received event network-vif-deleted-f66c148b-4cbb-4cdd-8196-6513d7c5ff78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:40 np0005476733 nova_compute[192580]: 2025-10-08 15:23:40.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:41 np0005476733 nova_compute[192580]: 2025-10-08 15:23:41.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:41 np0005476733 nova_compute[192580]: 2025-10-08 15:23:41.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:41 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:41Z|00191|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:23:41 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:41Z|00192|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:23:41 np0005476733 nova_compute[192580]: 2025-10-08 15:23:41.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:41 np0005476733 nova_compute[192580]: 2025-10-08 15:23:41.771 2 INFO nova.compute.manager [None req-64ec2405-9497-4062-8e61-66e3aced3ca8 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:41 np0005476733 nova_compute[192580]: 2025-10-08 15:23:41.777 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:42 np0005476733 nova_compute[192580]: 2025-10-08 15:23:42.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:42Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:aa:3b 10.100.0.8
Oct  8 11:23:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:42Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:aa:3b 10.100.0.8
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.567 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937008.564604, 08e4113f-f3be-424f-926e-62e20b3ad767 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.568 2 INFO nova.compute.manager [-] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.603 2 DEBUG nova.compute.manager [None req-bc32e45b-152d-4ecf-973f-1b50d62126b0 - - - - - -] [instance: 08e4113f-f3be-424f-926e-62e20b3ad767] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:43 np0005476733 nova_compute[192580]: 2025-10-08 15:23:43.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.521 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.523 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.542 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.625 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.626 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.632 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.632 2 INFO nova.compute.claims [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.800 2 DEBUG nova.compute.provider_tree [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.816 2 DEBUG nova.scheduler.client.report [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.854 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.854 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.912 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.912 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:23:44 np0005476733 nova_compute[192580]: 2025-10-08 15:23:44.977 2 INFO nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.023 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.127 2 DEBUG nova.policy [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.233 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.235 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.235 2 INFO nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Creating image(s)#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.236 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.239 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.240 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:45 np0005476733 podman[226054]: 2025-10-08 15:23:45.247137249 +0000 UTC m=+0.075933295 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.267 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.329 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.331 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.331 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.341 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.393 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.395 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.428 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk 10737418240" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.429 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.429 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.519 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.522 2 DEBUG nova.objects.instance [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'migration_context' on Instance uuid e36dd986-15d5-466e-93d6-dc7b4483c8e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.590 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.591 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Ensure instance console log exists: /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.591 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.591 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:45 np0005476733 nova_compute[192580]: 2025-10-08 15:23:45.592 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.160 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Successfully created port: 27016abf-08ed-40dc-8da9-bebab3e3a2a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:23:46 np0005476733 podman[226086]: 2025-10-08 15:23:46.235122222 +0000 UTC m=+0.055784208 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.713 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.938 2 INFO nova.compute.manager [None req-2cc66864-558d-422d-aacf-1c3ca6d67a23 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:46 np0005476733 nova_compute[192580]: 2025-10-08 15:23:46.942 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:47Z|00193|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:23:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:47Z|00194|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.564 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Successfully updated port: 27016abf-08ed-40dc-8da9-bebab3e3a2a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.592 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.593 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquired lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.593 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.778 2 DEBUG nova.compute.manager [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-changed-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.778 2 DEBUG nova.compute.manager [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Refreshing instance network info cache due to event network-changed-27016abf-08ed-40dc-8da9-bebab3e3a2a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:23:47 np0005476733 nova_compute[192580]: 2025-10-08 15:23:47.778 2 DEBUG oslo_concurrency.lockutils [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:48 np0005476733 nova_compute[192580]: 2025-10-08 15:23:48.024 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:23:48 np0005476733 nova_compute[192580]: 2025-10-08 15:23:48.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:49 np0005476733 nova_compute[192580]: 2025-10-08 15:23:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:23:50 np0005476733 nova_compute[192580]: 2025-10-08 15:23:50.384 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937015.383594, a71ee5d2-21b8-4455-8870-f20bed682909 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:50 np0005476733 nova_compute[192580]: 2025-10-08 15:23:50.385 2 INFO nova.compute.manager [-] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:23:50 np0005476733 nova_compute[192580]: 2025-10-08 15:23:50.403 2 DEBUG nova.compute.manager [None req-dd3fece7-bd38-4fe4-a9af-95697f95567c - - - - - -] [instance: a71ee5d2-21b8-4455-8870-f20bed682909] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:50 np0005476733 nova_compute[192580]: 2025-10-08 15:23:50.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.003 2 DEBUG nova.network.neutron [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updating instance_info_cache with network_info: [{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.036 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Releasing lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.036 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Instance network_info: |[{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.037 2 DEBUG oslo_concurrency.lockutils [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.037 2 DEBUG nova.network.neutron [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Refreshing network info cache for port 27016abf-08ed-40dc-8da9-bebab3e3a2a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.040 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Start _get_guest_xml network_info=[{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.045 2 WARNING nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.050 2 DEBUG nova.virt.libvirt.host [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.051 2 DEBUG nova.virt.libvirt.host [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.058 2 DEBUG nova.virt.libvirt.host [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.059 2 DEBUG nova.virt.libvirt.host [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.060 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.060 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.061 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.061 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.061 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.061 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.062 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.062 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.062 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.062 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.063 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.063 2 DEBUG nova.virt.hardware [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.067 2 DEBUG nova.virt.libvirt.vif [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:23:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',display_name='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-same-network-and-unsubscribe-1863725',id=22,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-mydo74dz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:23:45Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=e36dd986-15d5-466e-93d6-dc7b4483c8e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.067 2 DEBUG nova.network.os_vif_util [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.068 2 DEBUG nova.network.os_vif_util [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.069 2 DEBUG nova.objects.instance [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e36dd986-15d5-466e-93d6-dc7b4483c8e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.086 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <uuid>e36dd986-15d5-466e-93d6-dc7b4483c8e9</uuid>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <name>instance-00000016</name>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986</nova:name>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:23:51</nova:creationTime>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:user uuid="f03335a379bd4afdbbd7b9cc7cae27e0">tempest-MulticastTestIPv4Common-178854047-project-member</nova:user>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:project uuid="7f2acdb26a5a4269a4b1e407da7722c3">tempest-MulticastTestIPv4Common-178854047</nova:project>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        <nova:port uuid="27016abf-08ed-40dc-8da9-bebab3e3a2a3">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="serial">e36dd986-15d5-466e-93d6-dc7b4483c8e9</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="uuid">e36dd986-15d5-466e-93d6-dc7b4483c8e9</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.config"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:fe:38:dd"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <target dev="tap27016abf-08"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/console.log" append="off"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:23:51 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:23:51 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:23:51 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:23:51 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.087 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Preparing to wait for external event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.088 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.088 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.088 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.089 2 DEBUG nova.virt.libvirt.vif [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:23:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',display_name='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-same-network-and-unsubscribe-1863725',id=22,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-mydo74dz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:23:45Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=e36dd986-15d5-466e-93d6-dc7b4483c8e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.089 2 DEBUG nova.network.os_vif_util [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.090 2 DEBUG nova.network.os_vif_util [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.090 2 DEBUG os_vif [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.094 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27016abf-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.094 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27016abf-08, col_values=(('external_ids', {'iface-id': '27016abf-08ed-40dc-8da9-bebab3e3a2a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:38:dd', 'vm-uuid': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:51 np0005476733 NetworkManager[51699]: <info>  [1759937031.0972] manager: (tap27016abf-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.106 2 INFO os_vif [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08')#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.163 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.164 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.164 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] No VIF found with MAC fa:16:3e:fe:38:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.165 2 INFO nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Using config drive#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.775 2 INFO nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Creating config drive at /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.config#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.780 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_z_8cz8u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.919 2 DEBUG oslo_concurrency.processutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_z_8cz8u" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:23:51 np0005476733 kernel: tap27016abf-08: entered promiscuous mode
Oct  8 11:23:51 np0005476733 NetworkManager[51699]: <info>  [1759937031.9944] manager: (tap27016abf-08): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Oct  8 11:23:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:51Z|00195|binding|INFO|Claiming lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 for this chassis.
Oct  8 11:23:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:51Z|00196|binding|INFO|27016abf-08ed-40dc-8da9-bebab3e3a2a3: Claiming fa:16:3e:fe:38:dd 10.100.0.13
Oct  8 11:23:51 np0005476733 nova_compute[192580]: 2025-10-08 15:23:51.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:52Z|00197|binding|INFO|Setting lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 ovn-installed in OVS
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 systemd-machined[152624]: New machine qemu-14-instance-00000016.
Oct  8 11:23:52 np0005476733 systemd[1]: Started Virtual Machine qemu-14-instance-00000016.
Oct  8 11:23:52 np0005476733 systemd-udevd[226131]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:23:52 np0005476733 NetworkManager[51699]: <info>  [1759937032.0675] device (tap27016abf-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:23:52 np0005476733 NetworkManager[51699]: <info>  [1759937032.0699] device (tap27016abf-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:23:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:52Z|00198|binding|INFO|Setting lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 up in Southbound
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.109 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:38:dd 10.100.0.13'], port_security=['fa:16:3e:fe:38:dd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=27016abf-08ed-40dc-8da9-bebab3e3a2a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.111 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 27016abf-08ed-40dc-8da9-bebab3e3a2a3 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 bound to our chassis#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.116 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.133 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[22959670-7860-4400-8af0-ae65e6da9e86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.134 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ec2e14e-51 in ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.136 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ec2e14e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.136 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1d16f22f-8fce-46ed-9e87-585bd85345e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.137 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[99fad78e-2d49-4258-b84a-1a11db0916ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.149 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[444ce79a-30fd-413f-abb1-4d06db391e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.165 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8df246aa-6470-4274-ab2b-3a90b0fdd598]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.198 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d7083a-910b-490b-8b48-fda21c76f058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 NetworkManager[51699]: <info>  [1759937032.2067] manager: (tap3ec2e14e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Oct  8 11:23:52 np0005476733 systemd-udevd[226133]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.208 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9d2c14-4858-4e47-85fb-8f2cdc00452e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.246 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bf84461b-2f4d-4f34-a31a-77f46086890e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.249 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9329d523-a36e-479a-8b35-420fca9c91d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.260 2 INFO nova.compute.manager [None req-c40b88e3-fe5d-4f3d-b068-bedf66c76141 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Get console output#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.269 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.272 2 INFO nova.virt.libvirt.driver [None req-c40b88e3-fe5d-4f3d-b068-bedf66c76141 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Truncated console log returned, 3145 bytes ignored#033[00m
Oct  8 11:23:52 np0005476733 NetworkManager[51699]: <info>  [1759937032.2829] device (tap3ec2e14e-50): carrier: link connected
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.290 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8d56dae4-67cc-4f7b-92f6-191ed39d5b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.309 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[11505de8-d4ed-4eab-b60b-42da12c85008]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393594, 'reachable_time': 17315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226164, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.324 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[988d0b99-cff9-4b0b-b65a-1e46f6dee931]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:9d70'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393594, 'tstamp': 393594}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226165, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.340 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[13800611-5657-4af9-b6e9-1e074bd74ab7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ec2e14e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:9d:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393594, 'reachable_time': 17315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226166, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.367 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[08b0f9f9-46fd-4859-964d-a2c4149e5b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.432 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3cde64-aee1-4e45-9699-0c4083d29dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.433 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.434 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.434 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ec2e14e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 kernel: tap3ec2e14e-50: entered promiscuous mode
Oct  8 11:23:52 np0005476733 NetworkManager[51699]: <info>  [1759937032.4371] manager: (tap3ec2e14e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.440 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ec2e14e-50, col_values=(('external_ids', {'iface-id': '1e0c4d29-d963-4fdf-8ca6-0153967de16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:23:52Z|00199|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.455 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.456 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[26ee8988-3c67-446a-9e6c-92bd9acc710e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.456 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.pid.haproxy
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:23:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:23:52.457 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'env', 'PROCESS_TAG=haproxy-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:23:52 np0005476733 podman[226209]: 2025-10-08 15:23:52.888205452 +0000 UTC m=+0.105979948 container create 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 11:23:52 np0005476733 podman[226209]: 2025-10-08 15:23:52.808000481 +0000 UTC m=+0.025774997 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.903 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937032.9031227, e36dd986-15d5-466e-93d6-dc7b4483c8e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.905 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] VM Started (Lifecycle Event)#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.926 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:52 np0005476733 systemd[1]: Started libpod-conmon-3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736.scope.
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.933 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937032.9032798, e36dd986-15d5-466e-93d6-dc7b4483c8e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.934 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:23:52 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.955 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:52 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/927467e59113a6f78629d25bbe2e57b003ee25270ec43dd92f4c214a3581231b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.969 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:23:52 np0005476733 nova_compute[192580]: 2025-10-08 15:23:52.992 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:23:53 np0005476733 podman[226209]: 2025-10-08 15:23:53.007959639 +0000 UTC m=+0.225734145 container init 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:53 np0005476733 podman[226209]: 2025-10-08 15:23:53.014196479 +0000 UTC m=+0.231970965 container start 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:23:53 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [NOTICE]   (226259) : New worker (226266) forked
Oct  8 11:23:53 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [NOTICE]   (226259) : Loading success.
Oct  8 11:23:53 np0005476733 podman[226241]: 2025-10-08 15:23:53.057772667 +0000 UTC m=+0.123113247 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:23:53 np0005476733 nova_compute[192580]: 2025-10-08 15:23:53.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.235 2 DEBUG nova.compute.manager [req-6b7f3f4d-e6c2-4140-8c8a-0840003c9302 req-b5d2fd33-4889-4a13-b4b8-f001b09548e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.236 2 DEBUG oslo_concurrency.lockutils [req-6b7f3f4d-e6c2-4140-8c8a-0840003c9302 req-b5d2fd33-4889-4a13-b4b8-f001b09548e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.236 2 DEBUG oslo_concurrency.lockutils [req-6b7f3f4d-e6c2-4140-8c8a-0840003c9302 req-b5d2fd33-4889-4a13-b4b8-f001b09548e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.236 2 DEBUG oslo_concurrency.lockutils [req-6b7f3f4d-e6c2-4140-8c8a-0840003c9302 req-b5d2fd33-4889-4a13-b4b8-f001b09548e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.237 2 DEBUG nova.compute.manager [req-6b7f3f4d-e6c2-4140-8c8a-0840003c9302 req-b5d2fd33-4889-4a13-b4b8-f001b09548e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Processing event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.238 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.241 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937034.2410243, e36dd986-15d5-466e-93d6-dc7b4483c8e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.242 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.244 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.248 2 INFO nova.virt.libvirt.driver [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Instance spawned successfully.#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.250 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.285 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.291 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.298 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.299 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.300 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.300 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.301 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.302 2 DEBUG nova.virt.libvirt.driver [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.313 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.375 2 INFO nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Took 9.14 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.376 2 DEBUG nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.448 2 INFO nova.compute.manager [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Took 9.85 seconds to build instance.#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.476 2 DEBUG oslo_concurrency.lockutils [None req-57a509e9-df0e-4151-88e2-71240f56b89e f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.552 2 DEBUG nova.network.neutron [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updated VIF entry in instance network info cache for port 27016abf-08ed-40dc-8da9-bebab3e3a2a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.553 2 DEBUG nova.network.neutron [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updating instance_info_cache with network_info: [{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:54 np0005476733 nova_compute[192580]: 2025-10-08 15:23:54.572 2 DEBUG oslo_concurrency.lockutils [req-b3e0acc8-1afb-4f3d-941a-2c1bcb2c3e09 req-9fed5e79-0a73-496b-9afb-9609ba6420ba 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:55 np0005476733 nova_compute[192580]: 2025-10-08 15:23:55.898 2 INFO nova.compute.manager [None req-eaba7cf0-5e0e-4649-a57c-0407e2ed5fab f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:23:55 np0005476733 nova_compute[192580]: 2025-10-08 15:23:55.903 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.462 2 DEBUG nova.compute.manager [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.463 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.463 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.464 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.464 2 DEBUG nova.compute.manager [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] No waiting events found dispatching network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.465 2 WARNING nova.compute.manager [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received unexpected event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.465 2 DEBUG nova.compute.manager [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-changed-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.465 2 DEBUG nova.compute.manager [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Refreshing instance network info cache due to event network-changed-0bb60f77-cd96-4dfd-9810-5583ec966cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.466 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.466 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:23:56 np0005476733 nova_compute[192580]: 2025-10-08 15:23:56.466 2 DEBUG nova.network.neutron [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Refreshing network info cache for port 0bb60f77-cd96-4dfd-9810-5583ec966cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:23:57 np0005476733 podman[226277]: 2025-10-08 15:23:57.243454129 +0000 UTC m=+0.069508468 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 11:23:58 np0005476733 nova_compute[192580]: 2025-10-08 15:23:58.118 2 DEBUG nova.network.neutron [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updated VIF entry in instance network info cache for port 0bb60f77-cd96-4dfd-9810-5583ec966cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:23:58 np0005476733 nova_compute[192580]: 2025-10-08 15:23:58.120 2 DEBUG nova.network.neutron [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:23:58 np0005476733 nova_compute[192580]: 2025-10-08 15:23:58.156 2 DEBUG oslo_concurrency.lockutils [req-3db525af-856c-4e12-9390-0cf5b55519fc req-5808e4fa-d292-4d87-85ef-38a1a48f1e55 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:23:58 np0005476733 podman[226295]: 2025-10-08 15:23:58.260022919 +0000 UTC m=+0.090257413 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:23:58 np0005476733 nova_compute[192580]: 2025-10-08 15:23:58.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:00.557 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:24:00 np0005476733 nova_compute[192580]: 2025-10-08 15:24:00.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:00.560 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:24:01 np0005476733 nova_compute[192580]: 2025-10-08 15:24:01.103 2 INFO nova.compute.manager [None req-f5199009-85c9-47c8-8b02-64a4b55f2086 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:01 np0005476733 nova_compute[192580]: 2025-10-08 15:24:01.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:01.563 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:03 np0005476733 podman[226319]: 2025-10-08 15:24:03.23786735 +0000 UTC m=+0.066526043 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct  8 11:24:03 np0005476733 nova_compute[192580]: 2025-10-08 15:24:03.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:04 np0005476733 systemd-logind[827]: New session 36 of user zuul.
Oct  8 11:24:04 np0005476733 systemd[1]: Started Session 36 of User zuul.
Oct  8 11:24:04 np0005476733 systemd[1]: session-36.scope: Deactivated successfully.
Oct  8 11:24:04 np0005476733 systemd-logind[827]: Session 36 logged out. Waiting for processes to exit.
Oct  8 11:24:04 np0005476733 systemd-logind[827]: Removed session 36.
Oct  8 11:24:06 np0005476733 nova_compute[192580]: 2025-10-08 15:24:06.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:06 np0005476733 podman[226368]: 2025-10-08 15:24:06.231744979 +0000 UTC m=+0.051802322 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:24:06 np0005476733 podman[226367]: 2025-10-08 15:24:06.264056694 +0000 UTC m=+0.093072473 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:24:06 np0005476733 nova_compute[192580]: 2025-10-08 15:24:06.267 2 INFO nova.compute.manager [None req-15ab0f98-b409-40c4-9486-ddd6dbe9a020 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:06 np0005476733 nova_compute[192580]: 2025-10-08 15:24:06.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:08 np0005476733 nova_compute[192580]: 2025-10-08 15:24:08.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:10 np0005476733 nova_compute[192580]: 2025-10-08 15:24:10.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:11 np0005476733 nova_compute[192580]: 2025-10-08 15:24:11.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:11 np0005476733 nova_compute[192580]: 2025-10-08 15:24:11.417 2 INFO nova.compute.manager [None req-32437d34-2ced-4d85-b57f-053e42969280 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:11 np0005476733 nova_compute[192580]: 2025-10-08 15:24:11.422 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:13 np0005476733 nova_compute[192580]: 2025-10-08 15:24:13.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:13 np0005476733 nova_compute[192580]: 2025-10-08 15:24:13.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:16 np0005476733 nova_compute[192580]: 2025-10-08 15:24:16.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:16 np0005476733 podman[226417]: 2025-10-08 15:24:16.237890089 +0000 UTC m=+0.067179015 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:24:16 np0005476733 podman[226438]: 2025-10-08 15:24:16.324882066 +0000 UTC m=+0.054354083 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:24:16 np0005476733 nova_compute[192580]: 2025-10-08 15:24:16.574 2 INFO nova.compute.manager [None req-6bec383d-a53f-46aa-8456-1a84ea4697fc f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:16 np0005476733 nova_compute[192580]: 2025-10-08 15:24:16.582 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:17Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fe:38:dd 10.100.0.13
Oct  8 11:24:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:17Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fe:38:dd 10.100.0.13
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.763 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.764 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.796 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.918 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.919 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.925 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:24:17 np0005476733 nova_compute[192580]: 2025-10-08 15:24:17.925 2 INFO nova.compute.claims [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.130 2 DEBUG nova.compute.provider_tree [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.145 2 DEBUG nova.scheduler.client.report [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.174 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.175 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.259 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.260 2 DEBUG nova.network.neutron [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.280 2 INFO nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.306 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.421 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.423 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.425 2 INFO nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Creating image(s)#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.426 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.426 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.427 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.446 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.516 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.517 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.518 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.529 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.601 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.602 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.773 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk 10737418240" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.774 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.774 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.838 2 DEBUG nova.policy [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.846 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.847 2 DEBUG nova.objects.instance [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'migration_context' on Instance uuid e11af4e6-28c2-48fa-affb-668a5e9f6972 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.863 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.864 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Ensure instance console log exists: /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.865 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.865 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:18 np0005476733 nova_compute[192580]: 2025-10-08 15:24:18.865 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.736 2 DEBUG nova.network.neutron [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Successfully updated port: bb8d6c3b-78f5-45eb-82d7-19d928374c3e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.754 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.754 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquired lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.755 2 DEBUG nova.network.neutron [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.854 2 DEBUG nova.compute.manager [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-changed-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.854 2 DEBUG nova.compute.manager [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Refreshing instance network info cache due to event network-changed-bb8d6c3b-78f5-45eb-82d7-19d928374c3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.854 2 DEBUG oslo_concurrency.lockutils [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:19 np0005476733 nova_compute[192580]: 2025-10-08 15:24:19.939 2 DEBUG nova.network.neutron [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:24:21 np0005476733 nova_compute[192580]: 2025-10-08 15:24:21.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:21 np0005476733 nova_compute[192580]: 2025-10-08 15:24:21.751 2 INFO nova.compute.manager [None req-0209dd44-1a45-47e9-bd8d-dc024ccd6a9b f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:21 np0005476733 nova_compute[192580]: 2025-10-08 15:24:21.764 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:23 np0005476733 podman[226474]: 2025-10-08 15:24:23.230239981 +0000 UTC m=+0.061614586 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:24:23 np0005476733 nova_compute[192580]: 2025-10-08 15:24:23.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:23 np0005476733 nova_compute[192580]: 2025-10-08 15:24:23.992 2 DEBUG nova.network.neutron [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updating instance_info_cache with network_info: [{"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.022 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Releasing lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.023 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Instance network_info: |[{"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.023 2 DEBUG oslo_concurrency.lockutils [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.024 2 DEBUG nova.network.neutron [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Refreshing network info cache for port bb8d6c3b-78f5-45eb-82d7-19d928374c3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.026 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Start _get_guest_xml network_info=[{"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.031 2 WARNING nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.037 2 DEBUG nova.virt.libvirt.host [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.037 2 DEBUG nova.virt.libvirt.host [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.042 2 DEBUG nova.virt.libvirt.host [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.043 2 DEBUG nova.virt.libvirt.host [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.044 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.044 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.045 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.045 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.045 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.045 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.045 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.046 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.046 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.046 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.046 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.046 2 DEBUG nova.virt.hardware [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.050 2 DEBUG nova.virt.libvirt.vif [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:24:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=25,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-gb7f9gi1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:24:18Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=e11af4e6-28c2-48fa-affb-668a5e9f6972,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": 
"2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.050 2 DEBUG nova.network.os_vif_util [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.051 2 DEBUG nova.network.os_vif_util [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.052 2 DEBUG nova.objects.instance [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'pci_devices' on Instance uuid e11af4e6-28c2-48fa-affb-668a5e9f6972 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.072 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <uuid>e11af4e6-28c2-48fa-affb-668a5e9f6972</uuid>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <name>instance-00000019</name>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:name>tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6</nova:name>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:24:24</nova:creationTime>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:user uuid="048380879c82439f920961e33c8fc34c">tempest-ExtraDhcpOptionsTest-522093769-project-member</nova:user>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:project uuid="93e68db931464f0282500c84d398d8af">tempest-ExtraDhcpOptionsTest-522093769</nova:project>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        <nova:port uuid="bb8d6c3b-78f5-45eb-82d7-19d928374c3e">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001::367" ipVersion="6"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.0.27" ipVersion="4"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="serial">e11af4e6-28c2-48fa-affb-668a5e9f6972</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="uuid">e11af4e6-28c2-48fa-affb-668a5e9f6972</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.config"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:20:ea:c5"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <target dev="tapbb8d6c3b-78"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/console.log" append="off"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:24:24 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:24:24 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:24:24 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:24:24 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.074 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Preparing to wait for external event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.074 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.074 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.075 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.075 2 DEBUG nova.virt.libvirt.vif [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:24:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=25,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-gb7f9gi1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:24:18Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=e11af4e6-28c2-48fa-affb-668a5e9f6972,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": 
{"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.076 2 DEBUG nova.network.os_vif_util [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.077 2 DEBUG nova.network.os_vif_util [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.077 2 DEBUG os_vif [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.088 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb8d6c3b-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.089 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb8d6c3b-78, col_values=(('external_ids', {'iface-id': 'bb8d6c3b-78f5-45eb-82d7-19d928374c3e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:ea:c5', 'vm-uuid': 'e11af4e6-28c2-48fa-affb-668a5e9f6972'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:24Z|00200|pinctrl|WARN|Dropped 5937 log messages in last 63 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 11:24:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:24Z|00201|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:24:24 np0005476733 NetworkManager[51699]: <info>  [1759937064.0951] manager: (tapbb8d6c3b-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.103 2 INFO os_vif [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78')#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.174 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.175 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.175 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No VIF found with MAC fa:16:3e:20:ea:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:24:24 np0005476733 nova_compute[192580]: 2025-10-08 15:24:24.176 2 INFO nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Using config drive#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.292 2 INFO nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Creating config drive at /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.config#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.300 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ky907_e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.431 2 DEBUG oslo_concurrency.processutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ky907_e" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:25 np0005476733 kernel: tapbb8d6c3b-78: entered promiscuous mode
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:25Z|00202|binding|INFO|Claiming lport bb8d6c3b-78f5-45eb-82d7-19d928374c3e for this chassis.
Oct  8 11:24:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:25Z|00203|binding|INFO|bb8d6c3b-78f5-45eb-82d7-19d928374c3e: Claiming fa:16:3e:20:ea:c5 192.168.0.27 2001::367
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.5009] manager: (tapbb8d6c3b-78): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.510 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:ea:c5 192.168.0.27 2001::367'], port_security=['fa:16:3e:20:ea:c5 192.168.0.27 2001::367'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'neutron:cidrs': '192.168.0.27/24 2001::367/64', 'neutron:device_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-784726bd-b1f4-4298-96ff-31e8b942933e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=adadf214-3c16-4fe3-8265-a811250258e1, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=bb8d6c3b-78f5-45eb-82d7-19d928374c3e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.512 103739 INFO neutron.agent.ovn.metadata.agent [-] Port bb8d6c3b-78f5-45eb-82d7-19d928374c3e in datapath 784726bd-b1f4-4298-96ff-31e8b942933e bound to our chassis#033[00m
Oct  8 11:24:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:25Z|00204|binding|INFO|Setting lport bb8d6c3b-78f5-45eb-82d7-19d928374c3e ovn-installed in OVS
Oct  8 11:24:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:25Z|00205|binding|INFO|Setting lport bb8d6c3b-78f5-45eb-82d7-19d928374c3e up in Southbound
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.518 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 784726bd-b1f4-4298-96ff-31e8b942933e#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.536 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b7ada2-7baa-4746-89de-d720a7275c0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.536 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap784726bd-b1 in ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:24:25 np0005476733 systemd-udevd[226515]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.540 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap784726bd-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.541 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4c11f0ca-9d10-4577-89c6-3abd4f90d9ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.544 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8946a7-bc92-4e07-b2d0-21547848454f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.558 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[7661e535-cdac-4ed5-bb51-68f1cea85092]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 systemd-machined[152624]: New machine qemu-15-instance-00000019.
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.5607] device (tapbb8d6c3b-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.5616] device (tapbb8d6c3b-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.573 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7abfde89-1a6a-43f2-a9a1-55c055fc71b9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 systemd[1]: Started Virtual Machine qemu-15-instance-00000019.
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.604 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3409ccc3-4c5f-4665-9279-ef9309c396d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.6107] manager: (tap784726bd-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.612 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e197ae95-1137-40a6-abbd-7fa3f9577851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.643 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[df0220b8-93f0-41d4-ace9-a3d204dea030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.647 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0b03df-d628-4423-b944-5c71bfbefc0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.6669] device (tap784726bd-b0): carrier: link connected
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.675 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[836966cf-cd8d-457f-a938-7bf8cf715e66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.696 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55a9c6fa-6132-4417-b75a-760c8ab04770]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap784726bd-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:f9:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396933, 'reachable_time': 34727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226549, 'error': None, 'target': 'ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.714 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7e6bb9-a8cb-40d6-94f0-8c6c7d547369]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:f913'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396933, 'tstamp': 396933}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226550, 'error': None, 'target': 'ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.728 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8e41d37b-252e-4a29-aa42-f26c181c75da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap784726bd-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:f9:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396933, 'reachable_time': 34727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226551, 'error': None, 'target': 'ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.755 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd63c38-182d-48a4-b191-aff76b236b76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.809 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[43c71822-7831-4d54-ad02-a565a1abf73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.811 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap784726bd-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.811 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.811 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap784726bd-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 kernel: tap784726bd-b0: entered promiscuous mode
Oct  8 11:24:25 np0005476733 NetworkManager[51699]: <info>  [1759937065.8171] manager: (tap784726bd-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.820 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap784726bd-b0, col_values=(('external_ids', {'iface-id': '3e1cfc1e-bd08-4cac-8c34-89e3efa1d0d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:25Z|00206|binding|INFO|Releasing lport 3e1cfc1e-bd08-4cac-8c34-89e3efa1d0d6 from this chassis (sb_readonly=0)
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.835 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/784726bd-b1f4-4298-96ff-31e8b942933e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/784726bd-b1f4-4298-96ff-31e8b942933e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.837 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7ee92f-2d56-470b-8db2-fe14d0bf1075]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.838 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-784726bd-b1f4-4298-96ff-31e8b942933e
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/784726bd-b1f4-4298-96ff-31e8b942933e.pid.haproxy
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 784726bd-b1f4-4298-96ff-31e8b942933e
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:24:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:25.839 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e', 'env', 'PROCESS_TAG=haproxy-784726bd-b1f4-4298-96ff-31e8b942933e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/784726bd-b1f4-4298-96ff-31e8b942933e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.934 2 DEBUG nova.compute.manager [req-4d10abaa-80c3-4f1c-aef6-f73a1b09c680 req-3c89741e-606b-454e-9bea-e28191547d02 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.934 2 DEBUG oslo_concurrency.lockutils [req-4d10abaa-80c3-4f1c-aef6-f73a1b09c680 req-3c89741e-606b-454e-9bea-e28191547d02 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.935 2 DEBUG oslo_concurrency.lockutils [req-4d10abaa-80c3-4f1c-aef6-f73a1b09c680 req-3c89741e-606b-454e-9bea-e28191547d02 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.935 2 DEBUG oslo_concurrency.lockutils [req-4d10abaa-80c3-4f1c-aef6-f73a1b09c680 req-3c89741e-606b-454e-9bea-e28191547d02 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:25 np0005476733 nova_compute[192580]: 2025-10-08 15:24:25.935 2 DEBUG nova.compute.manager [req-4d10abaa-80c3-4f1c-aef6-f73a1b09c680 req-3c89741e-606b-454e-9bea-e28191547d02 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Processing event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:24:26 np0005476733 podman[226590]: 2025-10-08 15:24:26.212627891 +0000 UTC m=+0.052319708 container create 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:24:26 np0005476733 systemd[1]: Started libpod-conmon-69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138.scope.
Oct  8 11:24:26 np0005476733 podman[226590]: 2025-10-08 15:24:26.17951016 +0000 UTC m=+0.019202007 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:24:26 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:24:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d13cb32a2d89e7c31f4a94b61dcbff8f2be287fd488d18c6732851bd2ffbb632/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:26.306 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:26.307 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:26.309 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:26 np0005476733 podman[226590]: 2025-10-08 15:24:26.313775133 +0000 UTC m=+0.153466980 container init 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:24:26 np0005476733 podman[226590]: 2025-10-08 15:24:26.321134308 +0000 UTC m=+0.160826115 container start 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 11:24:26 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [NOTICE]   (226609) : New worker (226611) forked
Oct  8 11:24:26 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [NOTICE]   (226609) : Loading success.
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.477 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937066.4769213, e11af4e6-28c2-48fa-affb-668a5e9f6972 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.478 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] VM Started (Lifecycle Event)#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.480 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.487 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.490 2 INFO nova.virt.libvirt.driver [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Instance spawned successfully.#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.490 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.521 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.525 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.548 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.548 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.549 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.549 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.550 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.550 2 DEBUG nova.virt.libvirt.driver [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.557 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.557 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937066.4779294, e11af4e6-28c2-48fa-affb-668a5e9f6972 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.557 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.594 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.597 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937066.4881513, e11af4e6-28c2-48fa-affb-668a5e9f6972 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.598 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.625 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.628 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.655 2 INFO nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Took 8.23 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.656 2 DEBUG nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.657 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.734 2 INFO nova.compute.manager [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Took 8.86 seconds to build instance.#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.765 2 DEBUG oslo_concurrency.lockutils [None req-8ed6837f-59ef-43ab-8d51-c639442028b0 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.937 2 INFO nova.compute.manager [None req-bf2622b9-04ce-41b2-8db5-e560099530cd f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Get console output#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.942 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:26 np0005476733 nova_compute[192580]: 2025-10-08 15:24:26.946 2 INFO nova.virt.libvirt.driver [None req-bf2622b9-04ce-41b2-8db5-e560099530cd f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Truncated console log returned, 3425 bytes ignored#033[00m
Oct  8 11:24:27 np0005476733 nova_compute[192580]: 2025-10-08 15:24:27.256 2 DEBUG nova.network.neutron [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updated VIF entry in instance network info cache for port bb8d6c3b-78f5-45eb-82d7-19d928374c3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:24:27 np0005476733 nova_compute[192580]: 2025-10-08 15:24:27.256 2 DEBUG nova.network.neutron [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updating instance_info_cache with network_info: [{"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:24:27 np0005476733 nova_compute[192580]: 2025-10-08 15:24:27.285 2 DEBUG oslo_concurrency.lockutils [req-20112d16-28e0-4806-b616-99c509c2d608 req-071e2bb8-04a3-4f3a-8c93-670113e3fd39 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:24:27 np0005476733 nova_compute[192580]: 2025-10-08 15:24:27.550 2 INFO nova.compute.manager [None req-8bdf7956-344a-4901-93cf-91a35d88e3bf 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output#033[00m
Oct  8 11:24:27 np0005476733 nova_compute[192580]: 2025-10-08 15:24:27.556 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.103 2 DEBUG nova.compute.manager [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.104 2 DEBUG oslo_concurrency.lockutils [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.104 2 DEBUG oslo_concurrency.lockutils [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.105 2 DEBUG oslo_concurrency.lockutils [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.105 2 DEBUG nova.compute.manager [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] No waiting events found dispatching network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.106 2 WARNING nova.compute.manager [req-205e0b93-65af-49a5-800d-0d2d4df4bb42 req-98e946b3-6d1b-4bdf-a61a-a129a9fbffe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received unexpected event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e for instance with vm_state active and task_state None.#033[00m
Oct  8 11:24:28 np0005476733 podman[226639]: 2025-10-08 15:24:28.294805481 +0000 UTC m=+0.117162425 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:24:28 np0005476733 podman[226667]: 2025-10-08 15:24:28.44390826 +0000 UTC m=+0.113009543 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:24:28 np0005476733 nova_compute[192580]: 2025-10-08 15:24:28.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:29 np0005476733 nova_compute[192580]: 2025-10-08 15:24:29.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:29 np0005476733 nova_compute[192580]: 2025-10-08 15:24:29.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.064 2 DEBUG nova.compute.manager [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-changed-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.065 2 DEBUG nova.compute.manager [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Refreshing instance network info cache due to event network-changed-27016abf-08ed-40dc-8da9-bebab3e3a2a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.066 2 DEBUG oslo_concurrency.lockutils [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.066 2 DEBUG oslo_concurrency.lockutils [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.067 2 DEBUG nova.network.neutron [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Refreshing network info cache for port 27016abf-08ed-40dc-8da9-bebab3e3a2a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.731 2 INFO nova.compute.manager [None req-e8b199fe-0d0e-4f52-96c4-dea3d37e8305 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.736 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.806 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.807 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:32 np0005476733 nova_compute[192580]: 2025-10-08 15:24:32.844 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.093 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.094 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.102 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.102 2 INFO nova.compute.claims [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.615 2 DEBUG nova.compute.provider_tree [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.678 2 DEBUG nova.scheduler.client.report [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.803 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.804 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.883 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.885 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:24:33 np0005476733 nova_compute[192580]: 2025-10-08 15:24:33.930 2 INFO nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.010 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:34 np0005476733 podman[226700]: 2025-10-08 15:24:34.229451416 +0000 UTC m=+0.058692352 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.255 2 DEBUG nova.policy [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.259 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.262 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.262 2 INFO nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Creating image(s)#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.264 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.264 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.266 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.288 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.350 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.351 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.352 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.363 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.417 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.418 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.450 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk 10737418240" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.452 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.453 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.548 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.553 2 DEBUG nova.objects.instance [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid 6efc9ea0-184c-46cc-aeb5-e2759e10e398 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.684 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.685 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Ensure instance console log exists: /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.686 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.686 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.687 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.691 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.692 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.692 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.693 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.832 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.908 2 DEBUG nova.network.neutron [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updated VIF entry in instance network info cache for port 27016abf-08ed-40dc-8da9-bebab3e3a2a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.910 2 DEBUG nova.network.neutron [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updating instance_info_cache with network_info: [{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.922 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.923 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:34 np0005476733 nova_compute[192580]: 2025-10-08 15:24:34.992 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.003 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.058 2 DEBUG oslo_concurrency.lockutils [req-ebe05f17-ad07-473d-994f-6c4ac171f817 req-e3646b17-9595-4539-a332-0fc4a51cbed8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.070 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.071 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.129 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.136 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.195 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.196 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.256 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.265 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.325 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.326 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.379 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.582 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.584 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11197MB free_disk=110.97208023071289GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.584 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:35 np0005476733 nova_compute[192580]: 2025-10-08 15:24:35.585 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.002 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'name': 'tempest-test_multicast_after_idle_timeout-135618235', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.006 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'name': 'tempest-broadcast-receiver-123-1908290520', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.009 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000016', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'hostId': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.013 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000019', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '93e68db931464f0282500c84d398d8af', 'user_id': '048380879c82439f920961e33c8fc34c', 'hostId': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.013 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.014 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.014 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.037 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.requests volume: 11521 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.037 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.060 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 11676 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.060 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.080 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.requests volume: 11493 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.081 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.103 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.requests volume: 5688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.103 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30a3fea-7e0f-4377-a47c-0646853fe99c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11521, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5f293ac-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': 'b743b9dcb23f870b222cfee7fc72f7522d98d02c1b073dad05d22c96444d3d96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5f29e88-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': 'f260ac240dc1d28226c882665b6c241ea4d49d101e6c6abb6325d025b6c3d71e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11676, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5f60e2e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': 'e43df74d47eae83d845c6bd43be4367fe0f03854773ee70623d6227ede7680ee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5f6187e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '72cd6989ee663f3da6c758fbfc929582aef0942f293c114dd49508b9ceb7b5fc'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11493, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5f932de-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': 'c358d1fa5f5ae8109080d5c4349e683b3dcfda5d1874626e1d9e03031d82503d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 
'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest'
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5f93d06-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': 'd17a099b739030256f7298dbe9b479b314d76634765969b62dbf6f65d0f18961'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5688, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5fc9618-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'c4813b6b41a633d8dbc58969e6cbaa8132fc2cfc87b9056ae7eadf31ef962eef'}, {'source': 'openstack', 
'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.014569', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5fc9ec4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'c29d28313ae1144e8047221373120e3365e51bfc7b3d044c76e4837e1e613912'}]}, 'timestamp': '2025-10-08 15:24:36.103694', '_unique_id': 'e5987854546e4f588af279e513f6ef2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.110 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 656c0a96-03f3-4a70-baac-01de2a126a91 / tap59f58b79-91 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.110 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.bytes volume: 2426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.113 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f / tap0bb60f77-cd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.113 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes volume: 24927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.115 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e36dd986-15d5-466e-93d6-dc7b4483c8e9 / tap27016abf-08 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.116 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.bytes volume: 2330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.118 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e11af4e6-28c2-48fa-affb-668a5e9f6972 / tapbb8d6c3b-78 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.118 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea4f9eae-7c3d-4746-8f96-e427f6ffe57e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2426, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.107031', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e5fdbd40-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '1635570bb8f2544a2654f3e56dca8e793000332432db37169a1ae3560d5be077'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 24927, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.107031', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e5fe2384-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '0da4538671b1c6e3f280d27a57e84ac24c74a1aabaaeea6ec87ecefc79631b82'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2330, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.107031', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 
'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e5fe95e4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '740c7953588e7ca411031c87660a9455cbb27e21532eb8b6d86b8391be30bdcb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.107031', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e5ff03c6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'd16e2f07be3df7aa2a01c8b1da7cbc3d21660c3ccda1cb40f51233cdb5472614'}]}, 'timestamp': '2025-10-08 15:24:36.119428', '_unique_id': '174f359cea774e0fbb4f8c844abbbf2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.123 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.requests volume: 727 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.123 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.123 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.123 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.124 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.requests volume: 378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.124 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.124 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.124 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79f3ec43-af0b-4476-9cff-2ab1e0f82b54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 727, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5ffa344-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': 'a52015588ab1711696ed9f26ac52678a3a21961bb7a440fa0ca0affd22a1dac4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5ffab50-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '8b47f462fc05fe2357d508969db6e105742baa8a652fee09960b2dfddefc7041'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 742, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5ffb2e4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '896583b1b8fac7dc9af30b8745aef01f0809438e3288ed4bb9466c678f8d7158'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5ffba32-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '1cc6d626545b3762d6547d5ac118cef47bf6125e1d59cba8164221ab63aa1289'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 378, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5ffc220-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '06ea638a5d07a7ad0a2d43835b34a547d09469316db8238b1e962da6ea21c902'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 
'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcp
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: ng', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5ffcb4e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '040dba8c6b2e41ecdfc405e6d5b17041cc9815c41dc0e7494a51e8cf12151fa2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e5ffd760-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'f1226a7e16098bf2af2cc6c3bfa00fe7c0217c7039fa0682384b7cfd5f808fa8'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.123199', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e5ffe390-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'eb9f84e82c3e8fabc59c28181d5e2a96d83c7cb1e694062f26f0ae59ba29fde9'}]}, 'timestamp': '2025-10-08 15:24:36.125119', '_unique_id': '5d26a6a818264dbb9605ae765c09cf47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.130 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.130 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.130 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.131 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d619921-9a10-4341-b1d3-5eba6d4c6cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.130208', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e600b9dc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': 'a07e02c2beabfbe255c8722a5c9a5e1d6436e2c49b35f87a911006246db7ca0f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.130208', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e600c6d4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '1059f5f4f892d1025e7d82c58f9587f4872fa109c3b2d0604e53131e22becf19'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.130208', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e600d444-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '5f8b02446db245d5cdfcaf6eaa3f5ea0a06a075745022cab6281458a71833d12'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.130208', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e600e128-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'd765df80ad355caaa5c031aad4b7a07dbbff6ca26d1f559be34cb9594e991d01'}]}, 'timestamp': '2025-10-08 15:24:36.131631', '_unique_id': 'f5c1a6baeff741c597e4e0a4935b5b89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.133 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.133 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.133 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.134 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63063981-cb2b-425e-b20f-371ffea0fe08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.133328', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e6012ee4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '38f2ee205cf971b5e37d7c09c90a99e44b52634348787176b3dfe600b93dbf9b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.133328', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e6013ca4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '004daad3cf60242232d2920b5cdc059b20a851edefc08d1e14a9896a054bc04e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.133328', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 
'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e60149ec-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '0d8d29bba1e77b5d1ac6d3ec947dae19d5a5f1f1cd0f736b4b394316578c5ca5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.133328', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e601557c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': '962ad05ba1d0d4cc31306c1d114b4bef677b64a6838e2cf258f7ac88d7b155d2'}]}, 'timestamp': '2025-10-08 15:24:36.134616', '_unique_id': 'af875115df554399998e691fdeaae25f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.151 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.usage volume: 152240128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.151 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.164 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 160890880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.165 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.183 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.usage volume: 92078080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.183 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.198 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.199 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8d50c32-9e90-4415-ae9f-225cc91b7cc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152240128, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e603ebf2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': 'ff8e6580b2144e22846312fdfc0885a6a3ebbd98a90f5e4350e2983000a9b7d9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e603fc14-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': 'c7058d815d0544ee3716508e9390411f868d8a0ff29d853bd3218d177b156367'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 160890880, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6060dc4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': 'c29725d40174170014c45d86e4d7ba702bb2db0085aadac6f9af9bca55671548'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e6061c7e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': '046e074c4454dce70b76b54799f60b7e496837059162ee51c27d1a124909dccb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
92078080, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e608ca0a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': '65b6518ef27a41b190d1164727cd37013a8c1347143e1543261161c54ac5c771'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 
'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: ': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e608d68a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': '83e7e26c80a12a01e596def66c86382c8987f9d3d9c8ab1a5cdfaf1c660ed2a9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e60b2e1c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': '5317cc3e8de73a975ce4380adaffd8f1bcaa39e7167e237aacedd409b3019fbf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 
'048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.136352', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e60b412c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': 'd224833def98ec39f0598908cbffb98a96a6a712d10fc81cfacf1510a2b6b680'}]}, 'timestamp': '2025-10-08 15:24:36.199634', '_unique_id': '91a8dc2f7295406a864f728771816671'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.203 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.225 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/memory.usage volume: 231.9921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.238 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 656c0a96-03f3-4a70-baac-01de2a126a91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.238 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.238 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e36dd986-15d5-466e-93d6-dc7b4483c8e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e11af4e6-28c2-48fa-affb-668a5e9f6972 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 6efc9ea0-184c-46cc-aeb5-e2759e10e398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=5632MB phys_disk=119GB used_disk=50GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.243 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/memory.usage volume: 233.8828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.259 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/memory.usage volume: 297.125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.282 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.282 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e11af4e6-28c2-48fa-affb-668a5e9f6972: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c57fe00f-7c56-4f3b-86b4-01329ac75d70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 231.9921875, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'timestamp': '2025-10-08T15:24:36.203557', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'e60f422c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.948090145, 'message_signature': '6c6b3eb4fb9862e5b340f82510d9215009c2c8fcf6576bf22458e19e10b27d3d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 233.8828125, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 
'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:24:36.203557', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'e61225a0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.96665911, 'message_signature': '66e75c1a91837042571108e58a4419ee9734bb40280fa84485021bbd033adcae'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 297.125, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'timestamp': '2025-10-08T15:24:36.203557', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'e6147b0c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.98257541, 'message_signature': 'c6787ab209f40c113764122c51df397e2d1c0aa499734d6375ce6ba5b5103d9a'}]}, 'timestamp': '2025-10-08 15:24:36.282761', '_unique_id': 'd9f98da8b24c41199968e351ce985d25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.284 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.285 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.285 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.285 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd07408ff-adcf-4daa-bf31-5eb7e0fef7a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.284875', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e6185132-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': 'b0514557aa3d4cb467f71b516fbb4810f093f9f622450dbcc03e2aee46fd2b6d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.284875', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e6185ea2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '44a72cda8f1b89b1279efa94d6cf81e3dcceb0dbbbbaf63ec85b972d0bad3cb3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.284875', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e6186aaa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '2fab49e1f8cf0af17ce9de6c2018be405d4139f4af40b9df51a15b185f88e144'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.284875', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e6187572-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'fa2f0aa88037b4a40555af07f3be67ceb11c4dfd8edbbadec695a7b230afff5b'}]}, 'timestamp': '2025-10-08 15:24:36.286150', '_unique_id': 'dbc2cd87be0d4e9d80deaa318c8f07f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.287 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.287 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.287 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.287 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 329324032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.288 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.288 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.bytes volume: 314213376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.289 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.289 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.bytes volume: 93131776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.291 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0e359a8-bbc3-4581-b1ca-6766a43e4114', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e618b190-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': 'e7a0fab9c1bb41ee4045e60b0372c9e011d7458561de83df46a530f3a4d5d480'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e618baf0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '816c59e5bca9f3794002631ed8542b7186431d21858e50021c0a8807469db6fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329324032, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e618c3ce-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': 'f239a81c39be9278800f9739c4683c6314e357f7c07d0576b5fdc455d40a6b3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e618d274-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': 'eab5873f3663b57ea2b8008008cb3954f70b2d95a2a4aec21c01b544bd8b2029'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 314213376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e618e53e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': 'f86b2fb18ee5c40dd54dfd054157abda8c1b68d6af050c4c6163a0cc809c796c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 
'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk'
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: ge': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e618faba-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '811a9a3e2345b8c7da716cc15aff5da03e1ab1a916976939cb2598d55eff6d26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93131776, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6192c60-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'ec3744ff2d478c64f1effaf9a4315ad328dbe43d6e6af60a1bbac03adb56af41'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.287407', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61945c4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': '62f7246597b45b97e3a33399e68a82284196c43094b7f4c1a87a5688327b1ffa'}]}, 'timestamp': '2025-10-08 15:24:36.291537', '_unique_id': 'e465fce8e57844a4a5b897c306d7ca08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.295 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.296 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.297 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.298 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.201 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47635e53-c397-47b2-abaf-1e69e4f4fb66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.295371', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e619f8e8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '66e05cf686123a1fcf7191efbf1bc24768d90177b2d63fd3e59af395588d8478'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.295371', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e61a1f12-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '7ed7188ae50128fd4a9003f5184eaddbef95cf581e20a85b73aa23974b3d2532'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.295371', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e61a41ea-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '26d32ccd37b14c49948440f9deea3f8428354dd57087c8a109f90d074373657d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.295371', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e61a6832-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'f61c5d2457ab580ca5b6fda4420b17acf300dba4d45e4000a289d3aa0ab5b939'}]}, 'timestamp': '2025-10-08 15:24:36.299020', '_unique_id': 'c55559ce60d7411ea60ed07498d92125'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.300 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.302 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.bytes volume: 4706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.303 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes volume: 37563 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.304 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.bytes volume: 3180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.304 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c045099-f711-41a9-88d2-b6cd53c24f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4706, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.302196', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e61b0b48-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': 'fd2443968c92a08a51120cc740a76def906c4b031c58da7b517bea0401401ba8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 37563, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.302196', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e61b36ea-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '37668e3e9a2af94804f1ea19ed48ce6272e5786a5cac1e3013ce5d1138cd4505'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3180, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.302196', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 
'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e61b52f6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '2a23e6acdd16c30c92c0501e9871e8b910cad799e4096e78acd9b6dc184492e8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.302196', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e61b5e7c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': '2fa8adbd658e4921ebda57442703dc3e8a9ae78cb0438fedb0c5d788afaae5ae'}]}, 'timestamp': '2025-10-08 15:24:36.305222', '_unique_id': 'cc5e0db518d340408bd48b51ffb5393f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.306 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.306 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.307 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 161484800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.307 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.308 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.allocation volume: 94380032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.309 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.310 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.310 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70aff31b-6bc4-4a23-899b-6e8af2708ae2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61b9ee6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': '7aee4d9ad8e6a1af2dfb4a79aa00158b01870a44d272cfe731998e4ade415c23'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61ba9ea-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': '367cb68a71c4ab91ff452ae3dde885aa09c950f818a8b8612f2390c0f08de613'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 161484800, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61bb58e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': '265ebe6b936ffba859c81f846dbc84039e751e0f64c1295294e8d3a1d18ab24e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61bc6aa-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': '0f4f16bd366534f9fae2ee4e4bf7e23b5b0c5456356157140197e5fc3476f6e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 94380032, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61bebbc-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': '9109207d324747af8641b0e5e396b921c2d6330d159cb5610d93f271d67f324e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0},
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61c15ec-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': 'd84b703e967c7d5f9f613d8d89729190982c770b076bbbdb46615eba3bc082b5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61c2dca-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': 'be25a30e566317ef27c0fe51f99ab0fc8a0ee316e69f6a2cd93b98d0b5a310be'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 
'B', 'counter_volume': 487424, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.306553', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61c3e96-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': '5d91cf49c2c734abee25db0c364b07830f1e3c36ba862180a3835599c4fef779'}]}, 'timestamp': '2025-10-08 15:24:36.311124', '_unique_id': '7a99db4e894b41af9c3ba3238942a0d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.313 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.314 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets volume: 196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.316 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.316 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.293 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdb4ffd0-4de6-4808-9602-13d2129795fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 47, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.313768', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e61cc794-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '2047fe62c16ba8d0805e2439fb7762beb8e734879626e9a5d683c14ed0238218'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 196, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.313768', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e61cf39a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '3a80e58843d0600d2e432c4137a3856aa894e41ada29c05f45a78ce3b14f1327'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.313768', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e61d1d48-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '5d86314fbc3e7e21488329457c707e87feda08dd98d85131e464475b7dfff995'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.313768', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e61d3bca-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': '07edcac84292a0c5d4ae92aee8832edf594a4dda27342f3371f6c80ccc866c26'}]}, 'timestamp': '2025-10-08 15:24:36.317513', '_unique_id': '3a31927b17514e0f94e5de6819554cc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.318 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.320 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.320 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/cpu volume: 40260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.321 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/cpu volume: 43270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.322 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/cpu volume: 38640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.323 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/cpu volume: 9410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ee1a6a3-8db3-4bc0-8935-37a7f0720f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40260000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'timestamp': '2025-10-08T15:24:36.320715', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e61dd95e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.948090145, 'message_signature': '52184160365ccf6cf6508828c38bd3a8af0f0ae2c504a3add2e3a12789489eaf'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43270000000, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 
'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:24:36.320715', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e61dfc86-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.96665911, 'message_signature': '1c7b0fa7136eb53da9348c44200f4f49366bc4adbbe295d318df4f862d9da646'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38640000000, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'timestamp': '2025-10-08T15:24:36.320715', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e61e1e00-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.98257541, 'message_signature': 'ff02ae3f58e76f0e45b5e45b434dd3ae287409c3dcf229fff065ccdab72c5d14'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9410000000, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'timestamp': '2025-10-08T15:24:36.320715', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e61e3c1e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3980.005256437, 'message_signature': '29f11929e31ff2099c0759826fc0beed772ecd7358f5106b09d933d640123ade'}]}, 'timestamp': '2025-10-08 15:24:36.324019', '_unique_id': '6945337a8ea54a25b9e2e4872929094d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.325 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.bytes volume: 135618048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.325 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.326 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 144437760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.326 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.326 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.bytes volume: 77929984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.326 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.327 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.327 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b27f4603-2b1a-4463-a7b2-55a870d08402', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135618048, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61e7f62-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '9301aba2a1b321c354deefc57134bd52fbbb77f011a8871be04cd19c10cfbafe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61e8b2e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '9c8877bd3c793443e2fb618e2b3cb9077275cb57f8428215a75d05c5579b3eff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 144437760, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61e968c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '620c0ed38d3a855bc05abb30dad7ec5dbe02b128db30bf41b56a38390cf7dc34'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61ea154-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '31c0190678fa432fa8ba749612c3b1ed41c8c5309ec7340e23603d573dd03a90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 77929984, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61eac08-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '7522aa20f230e5ba785ade2ce72690e407a6d5c1b545b9f646bcb1b422abd9c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 
'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'eph
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: : '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61eb6d0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '9bd388ab718aecc88b58bff971fbb6d461ea045bc9f0bb2a379b3bfa79b8b8f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61ec40e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'd7efd773f3872242eb7754000c497ed41fa743a73973030a11858a0aaf27dcc6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 
'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.325409', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61ece9a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': 'f3d780e8f944e91dc4bbfdcaa5eb399971afb27168b772794caadb5f5b4b311b'}]}, 'timestamp': '2025-10-08 15:24:36.327728', '_unique_id': 'f27c866c9553412daa9970b2a4c4b765'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.329 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.331 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.332 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.latency volume: 8719649385 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.333 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.latency volume: 70590649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.334 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 9520010357 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.334 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 62858344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.335 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.latency volume: 8043550276 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.335 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.latency volume: 79148645 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.336 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.latency volume: 3137115426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.336 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.read.latency volume: 8177702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.311 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84421e04-79f0-4c60-91f9-7f1ad08fc06b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8719649385, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61fac16-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '79021fca3bb77b7d55c9c54c7ddb6793311b6fa56ac14915c1b0134608ad6b26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 70590649, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61fc2a0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': 'f01a8109b44280685d4bf5644d88793d9bbc4d27faf44afb0c054f46003bf29a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9520010357, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e61fda60-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '2ce39e80b6dd511ec0935160405be5db33cbf31e35847dc0b173260dba01ba4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62858344, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e61fee4c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '7f08783b68763f9ec5fd32934320aace66a5298bd24489c06ffab92408230a00'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8043550276, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e62005f8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '47469af64ef4252a72e143741b61f24e1b2a6b885c6b21b4e70284695df542b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 79148645, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 
'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcp
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: ng', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e6201692-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '9058ddeec865b26844b5641f404c729711f59ebe6a8f66c30b98dc88dbb73c5d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3137115426, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6202cea-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': '63ccac1fa4baf81d1de72883c3460c167f790018d15fe8006a141fc760d0ebf9'}, {'source': 'openstack', 
'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8177702, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.332770', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e62040f4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': '63bc559eabc06caf129bcb73e41bb0512cedd3195daa0aaee1b08edb8a4c1a9e'}]}, 'timestamp': '2025-10-08 15:24:36.337353', '_unique_id': 'd926a6fa4f9f454db988f2cbad721c67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.342 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.343 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-135618235>, <NovaLikeServer: tempest-broadcast-receiver-123-1908290520>, <NovaLikeServer: tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6>]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.344 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.344 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.345 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.345 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.346 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.346 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.347 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.347 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f992eaa9-ade6-47fd-8022-a6800ccff923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6216196-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': '478d435fef923eadc0573e001ced286082d6a3a665702811e30a9a5047b9d515'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e62176c2-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.859347961, 'message_signature': 'b5966958abe373ff020b3a26ff3a63ca9face3953d754f60008cd31e3dc2a66f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6218b9e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': '162700531a77f4b5db79bfd2380172f321cafe049a15bec85e70f3280eb629a2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e6219f62-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.87493468, 'message_signature': 'e3a7a42a96bb1eecb098e6c9d62e96b6c3cef2500c181af9558455a4e30b8ef9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 10737418240, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e621b632-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': '79af139761d01b7cbb41d9a5a3e25af0af33214e5a49099623c7ee029ce959c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'sta
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 11111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e621cbe0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.888877257, 'message_signature': 'fc40d11e65d7fbd2248775307f2bcf779e54509c4f71041c72e7e5e8e01d3e96'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e621e166-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': '1cdf36c96819c60bd0c9fae0057510f9aedd943735886233a0becc45eed5752c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 485376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.344030', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e621eca6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.906716759, 'message_signature': 'b4466aaa8cc61e3422dc1e56ea60208111108cfa8c025fe4f248d937a19d3cae'}]}, 'timestamp': '2025-10-08 15:24:36.348177', '_unique_id': 'd0c6c05f5ecf44f8a169c573a8253262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.349 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.349 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae155020-1b51-48f8-a2cc-87f6d902c81f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.349442', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e6222a68-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '09c66bc0fe835be007bde5669836b35c616c708b5dc305c02602f04931789bcf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.349442', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e6223666-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '0365f3be1d03093a383667e4fcdffd5acd8f27c549b9889d2fa1d7a4c2a83ae8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.349442', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 
'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e6224034-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '6b5689186a529958c940dabb7362e8fd281546f5bbb6ed2c1276cbcee09c8b58'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.349442', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e6224912-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': '08711689732c2fff2cb425f375b286539d51f4f4a0f58f77e3e539f7f4c4e955'}]}, 'timestamp': '2025-10-08 15:24:36.350497', '_unique_id': 'ef22c99c5dce4d30b052709c6ab67202'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.351 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.351 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets volume: 143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.352 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.352 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0f902c6-41c1-47ec-8202-837fbd42d860', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.351666', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e62280f8-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': 'caba1d5225cd808f1564b5238fd171e8696ace56ed03947c75ad8fd93690e9c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 143, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.351666', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e6228c9c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '62c2f5bd629ef144b0cfa8c548c4c37ab0640acc95ca52f32f72371f1a8b3a19'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.351666', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e6229714-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '3060a5b59234f32a13a450f207cc0db121e7acff6919984cf652243c36a418d0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.351666', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e6229fd4-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'c63848c789c4231ceff63f8f0c97c974802522b448e46af6e614bdf762e0dcc2'}]}, 'timestamp': '2025-10-08 15:24:36.352719', '_unique_id': '455ae16983d04778b103dfabaf1d77a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.353 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.354 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.354 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.354 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aed5969-8442-4a79-8708-a82d87b387c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:24:36.353905', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': 'e622d850-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.83001526, 'message_signature': '2ada040d09c62c36c7e50c06db3929cf7ec80049bc7fd2a1f9dbcd0557398730'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:24:36.353905', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': 'e622e408-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.833929086, 'message_signature': '0e7b2fdd818e5ad83007043d426f7df4c21c71d19ae30f39e1be80d9d2060fef'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:24:36.353905', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 
'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': 'e622ed04-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.836640133, 'message_signature': '9e80e516d5ba49fde636d5b435bdebfce14845b8e17d42771a09eb7d0ffd7eb5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000019-e11af4e6-28c2-48fa-affb-668a5e9f6972-tapbb8d6c3b-78', 'timestamp': '2025-10-08T15:24:36.353905', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'tapbb8d6c3b-78', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:20:ea:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8d6c3b-78'}, 'message_id': 'e622f5b0-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.839608638, 'message_signature': 'a0a356c17c0e2aa9f267d4dac5f6348a20bb4f415eea77efb1788dec7a1ca5be'}]}, 'timestamp': '2025-10-08 15:24:36.354944', '_unique_id': '19157b400cac4019a1ef4ba552275efb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.328 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.356 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.latency volume: 19257935030 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.356 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.356 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 12173367371 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.357 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.357 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.latency volume: 39327567534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.357 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.357 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.latency volume: 18338578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.358 12 DEBUG ceilometer.compute.pollsters [-] e11af4e6-28c2-48fa-affb-668a5e9f6972/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70dbc763-3d19-49df-b132-f4d78363e37e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19257935030, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e623323c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '86f6360c4063491ed7ae61f6076e4049a50ba3ac6365a5b5ac91d6ef69cec13e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e6234312-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.737522647, 'message_signature': '7e5f98e7368859e1e822a015cb39d37215a91e68c93e62dfc21921833607c234'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12173367371, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6234c04-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': 'a1e59a76543bd658e23ca3f0efaacf90421e1690548e5ef60d759e08d7574caa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e62355e6-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.761092512, 'message_signature': '7bcaf289b59ff9fd9949dc7215b4f982ecaf15c88555b0ad3883633aa0898bca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 
'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39327567534, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e6235e6a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': 'e0e8693ab1cde28ce1369ba71cb3d239ba747fdedeb4cec8e002977663b50558'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 
'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: tate': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e623675c-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.783878562, 'message_signature': '470ad424137626e06c097ed12d7de95f3186212180d8f9414cbb5b53505b0965'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18338578, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-vda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e623717a-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': '28b70c652d7b7744e2d8d5c7cc660140fcf50964072da6f9e3f1906e6881fb82'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972-sda', 'timestamp': '2025-10-08T15:24:36.356202', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'name': 'instance-00000019', 'instance_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e6237b3e-a45a-11f0-9274-fa163ef67048', 'monotonic_time': 3979.804438862, 'message_signature': '6ddfe7dcb9e4ee371ca24d44bbbedf5b8580c8b9036ca5e606efbd84c589fa9c'}]}, 'timestamp': '2025-10-08 15:24:36.358329', '_unique_id': 'b7ab74c02f7c41ad97e0eb22d9d213a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.338 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.348 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:24:36.359 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.441 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.500 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.570 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:24:36 np0005476733 nova_compute[192580]: 2025-10-08 15:24:36.570 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:37 np0005476733 podman[226758]: 2025-10-08 15:24:37.232475798 +0000 UTC m=+0.060866113 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:24:37 np0005476733 podman[226759]: 2025-10-08 15:24:37.256032572 +0000 UTC m=+0.078346382 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:24:37 np0005476733 nova_compute[192580]: 2025-10-08 15:24:37.633 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Successfully created port: 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:24:38 np0005476733 nova_compute[192580]: 2025-10-08 15:24:38.007 2 INFO nova.compute.manager [None req-b08086b3-3ea9-42ae-9777-025020a846fb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output#033[00m
Oct  8 11:24:38 np0005476733 nova_compute[192580]: 2025-10-08 15:24:38.012 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:38 np0005476733 nova_compute[192580]: 2025-10-08 15:24:38.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.570 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.615 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Successfully updated port: 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.651 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.652 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.652 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.736 2 DEBUG nova.compute.manager [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-changed-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.736 2 DEBUG nova.compute.manager [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Refreshing instance network info cache due to event network-changed-36047ed0-015a-4d5e-8c0a-fc4d965a13b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.737 2 DEBUG oslo_concurrency.lockutils [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:39 np0005476733 nova_compute[192580]: 2025-10-08 15:24:39.849 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:24:40 np0005476733 nova_compute[192580]: 2025-10-08 15:24:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:40 np0005476733 nova_compute[192580]: 2025-10-08 15:24:40.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.090 2 DEBUG nova.network.neutron [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updating instance_info_cache with network_info: [{"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.113 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.114 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Instance network_info: |[{"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.114 2 DEBUG oslo_concurrency.lockutils [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.115 2 DEBUG nova.network.neutron [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Refreshing network info cache for port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.118 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Start _get_guest_xml network_info=[{"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.126 2 WARNING nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.132 2 DEBUG nova.virt.libvirt.host [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.133 2 DEBUG nova.virt.libvirt.host [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.138 2 DEBUG nova.virt.libvirt.host [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.139 2 DEBUG nova.virt.libvirt.host [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.140 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.140 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.140 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.141 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.141 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.141 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.142 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.142 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.142 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.143 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.143 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.143 2 DEBUG nova.virt.hardware [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.147 2 DEBUG nova.virt.libvirt.vif [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-155366011',display_name='tempest-test_multicast_after_idle_timeout-155366011',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-155366011',id=26,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-0ewp8wvp',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:24:34Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6efc9ea0-184c-46cc-aeb5-e2759e10e398,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.147 2 DEBUG nova.network.os_vif_util [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.148 2 DEBUG nova.network.os_vif_util [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.149 2 DEBUG nova.objects.instance [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid 6efc9ea0-184c-46cc-aeb5-e2759e10e398 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.165 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <uuid>6efc9ea0-184c-46cc-aeb5-e2759e10e398</uuid>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <name>instance-0000001a</name>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_multicast_after_idle_timeout-155366011</nova:name>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:24:41</nova:creationTime>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        <nova:port uuid="36047ed0-015a-4d5e-8c0a-fc4d965a13b7">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="serial">6efc9ea0-184c-46cc-aeb5-e2759e10e398</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="uuid">6efc9ea0-184c-46cc-aeb5-e2759e10e398</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.config"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:a3:d0:1a"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <target dev="tap36047ed0-01"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/console.log" append="off"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:24:41 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:24:41 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:24:41 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:24:41 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.171 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Preparing to wait for external event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.172 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.172 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.173 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.173 2 DEBUG nova.virt.libvirt.vif [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-155366011',display_name='tempest-test_multicast_after_idle_timeout-155366011',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-155366011',id=26,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-0ewp8wvp',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:24:34Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6efc9ea0-184c-46cc-aeb5-e2759e10e398,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.174 2 DEBUG nova.network.os_vif_util [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.175 2 DEBUG nova.network.os_vif_util [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.175 2 DEBUG os_vif [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36047ed0-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.182 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36047ed0-01, col_values=(('external_ids', {'iface-id': '36047ed0-015a-4d5e-8c0a-fc4d965a13b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:d0:1a', 'vm-uuid': '6efc9ea0-184c-46cc-aeb5-e2759e10e398'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:24:41 np0005476733 NetworkManager[51699]: <info>  [1759937081.1860] manager: (tap36047ed0-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.198 2 INFO os_vif [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01')#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.250 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.252 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.253 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:a3:d0:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.253 2 INFO nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Using config drive#033[00m
Oct  8 11:24:41 np0005476733 nova_compute[192580]: 2025-10-08 15:24:41.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.407 2 INFO nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Creating config drive at /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.config#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.413 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflxu0qkx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.549 2 DEBUG oslo_concurrency.processutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflxu0qkx" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:24:42 np0005476733 kernel: tap36047ed0-01: entered promiscuous mode
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:42Z|00207|binding|INFO|Claiming lport 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 for this chassis.
Oct  8 11:24:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:42Z|00208|binding|INFO|36047ed0-015a-4d5e-8c0a-fc4d965a13b7: Claiming fa:16:3e:a3:d0:1a 10.100.0.6
Oct  8 11:24:42 np0005476733 NetworkManager[51699]: <info>  [1759937082.6303] manager: (tap36047ed0-01): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.636 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:d0:1a 10.100.0.6'], port_security=['fa:16:3e:a3:d0:1a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=36047ed0-015a-4d5e-8c0a-fc4d965a13b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.638 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.641 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:42Z|00209|binding|INFO|Setting lport 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 ovn-installed in OVS
Oct  8 11:24:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:42Z|00210|binding|INFO|Setting lport 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 up in Southbound
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.659 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5582be8a-0bc5-4c39-b029-67b9ccf6a149]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 systemd-udevd[226828]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:24:42 np0005476733 systemd-machined[152624]: New machine qemu-16-instance-0000001a.
Oct  8 11:24:42 np0005476733 NetworkManager[51699]: <info>  [1759937082.6753] device (tap36047ed0-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:24:42 np0005476733 NetworkManager[51699]: <info>  [1759937082.6761] device (tap36047ed0-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:24:42 np0005476733 systemd[1]: Started Virtual Machine qemu-16-instance-0000001a.
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.705 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[df00ef19-b045-427c-b407-bd43a1904277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.710 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[38cf4cfb-ec17-49e6-9b5f-012852c9e882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.752 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8981ec24-a97e-4f06-920f-4ef99ff7758c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.772 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[036a08ec-4236-4bf5-a963-fa870893e6e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387053, 'reachable_time': 34516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226840, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.798 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc077c9-eb83-403e-8df9-fd6583cb4a72]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387066, 'tstamp': 387066}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226842, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387069, 'tstamp': 387069}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226842, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.801 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:42 np0005476733 nova_compute[192580]: 2025-10-08 15:24:42.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.805 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.805 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.806 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:24:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:24:42.806 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.277 2 INFO nova.compute.manager [None req-392685a6-ab1e-4ace-a355-693c29f98efc 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.290 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.536 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937083.5357924, 6efc9ea0-184c-46cc-aeb5-e2759e10e398 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.537 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] VM Started (Lifecycle Event)#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.569 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.573 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937083.5371115, 6efc9ea0-184c-46cc-aeb5-e2759e10e398 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.574 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.597 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.601 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.625 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:24:43 np0005476733 nova_compute[192580]: 2025-10-08 15:24:43.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:44 np0005476733 nova_compute[192580]: 2025-10-08 15:24:44.037 2 DEBUG nova.network.neutron [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updated VIF entry in instance network info cache for port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:24:44 np0005476733 nova_compute[192580]: 2025-10-08 15:24:44.037 2 DEBUG nova.network.neutron [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updating instance_info_cache with network_info: [{"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:24:44 np0005476733 nova_compute[192580]: 2025-10-08 15:24:44.059 2 DEBUG oslo_concurrency.lockutils [req-08d7368f-c6ab-4d05-b1ec-75a1eb63038b req-72d18054-fa4e-46ac-9aa2-a73da83024bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:24:44 np0005476733 nova_compute[192580]: 2025-10-08 15:24:44.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:45 np0005476733 nova_compute[192580]: 2025-10-08 15:24:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:45 np0005476733 nova_compute[192580]: 2025-10-08 15:24:45.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.639 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.951 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.952 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.952 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:24:46 np0005476733 nova_compute[192580]: 2025-10-08 15:24:46.952 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 656c0a96-03f3-4a70-baac-01de2a126a91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:24:47 np0005476733 podman[226850]: 2025-10-08 15:24:47.248338009 +0000 UTC m=+0.069472947 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:24:47 np0005476733 podman[226851]: 2025-10-08 15:24:47.250041623 +0000 UTC m=+0.066977187 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:24:48 np0005476733 nova_compute[192580]: 2025-10-08 15:24:48.688 2 INFO nova.compute.manager [None req-6e788362-fe15-40b0-95c5-2ae5271e4fa8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output
Oct  8 11:24:48 np0005476733 nova_compute[192580]: 2025-10-08 15:24:48.694 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:24:48 np0005476733 nova_compute[192580]: 2025-10-08 15:24:48.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.184 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updating instance_info_cache with network_info: [{"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.385 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-656c0a96-03f3-4a70-baac-01de2a126a91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.385 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.386 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.387 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.809 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct  8 11:24:49 np0005476733 nova_compute[192580]: 2025-10-08 15:24:49.810 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.164 2 DEBUG nova.compute.manager [req-435be8d3-f732-4843-afe5-9051b9c0da2f req-8e4b7584-ea90-40cd-bbc0-c0ffa6a95d51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.165 2 DEBUG oslo_concurrency.lockutils [req-435be8d3-f732-4843-afe5-9051b9c0da2f req-8e4b7584-ea90-40cd-bbc0-c0ffa6a95d51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.165 2 DEBUG oslo_concurrency.lockutils [req-435be8d3-f732-4843-afe5-9051b9c0da2f req-8e4b7584-ea90-40cd-bbc0-c0ffa6a95d51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.166 2 DEBUG oslo_concurrency.lockutils [req-435be8d3-f732-4843-afe5-9051b9c0da2f req-8e4b7584-ea90-40cd-bbc0-c0ffa6a95d51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.166 2 DEBUG nova.compute.manager [req-435be8d3-f732-4843-afe5-9051b9c0da2f req-8e4b7584-ea90-40cd-bbc0-c0ffa6a95d51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Processing event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.167 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.173 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937090.17338, 6efc9ea0-184c-46cc-aeb5-e2759e10e398 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.174 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] VM Resumed (Lifecycle Event)
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.176 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.180 2 INFO nova.virt.libvirt.driver [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Instance spawned successfully.
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.181 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.289 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.697 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.702 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.708 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.708 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.709 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.710 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.711 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.712 2 DEBUG nova.virt.libvirt.driver [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:24:50 np0005476733 nova_compute[192580]: 2025-10-08 15:24:50.945 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 11:24:51 np0005476733 nova_compute[192580]: 2025-10-08 15:24:51.073 2 INFO nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Took 16.81 seconds to spawn the instance on the hypervisor.
Oct  8 11:24:51 np0005476733 nova_compute[192580]: 2025-10-08 15:24:51.074 2 DEBUG nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:24:51 np0005476733 nova_compute[192580]: 2025-10-08 15:24:51.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:24:51 np0005476733 nova_compute[192580]: 2025-10-08 15:24:51.352 2 INFO nova.compute.manager [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Took 18.43 seconds to build instance.
Oct  8 11:24:51 np0005476733 nova_compute[192580]: 2025-10-08 15:24:51.480 2 DEBUG oslo_concurrency.lockutils [None req-a9b1b389-90ea-44fd-a0c3-534ca09b9c16 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:24:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:51Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:ea:c5 192.168.0.27
Oct  8 11:24:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:24:51Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:ea:c5 192.168.0.27
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.339 2 DEBUG nova.compute.manager [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.340 2 DEBUG oslo_concurrency.lockutils [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.340 2 DEBUG oslo_concurrency.lockutils [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.340 2 DEBUG oslo_concurrency.lockutils [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.340 2 DEBUG nova.compute.manager [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] No waiting events found dispatching network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:24:52 np0005476733 nova_compute[192580]: 2025-10-08 15:24:52.341 2 WARNING nova.compute.manager [req-33a3e4df-8c4d-4ad0-abe4-6e0c24f05b00 req-b89bddfa-34b1-46e0-8ba9-378fbf675659 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received unexpected event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 for instance with vm_state active and task_state None.
Oct  8 11:24:53 np0005476733 nova_compute[192580]: 2025-10-08 15:24:53.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:24:54 np0005476733 podman[226894]: 2025-10-08 15:24:54.240996141 +0000 UTC m=+0.056476711 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:24:54 np0005476733 nova_compute[192580]: 2025-10-08 15:24:54.576 2 INFO nova.compute.manager [None req-f93638ef-94fd-4edc-8e3c-ddce05800b4f 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output
Oct  8 11:24:54 np0005476733 nova_compute[192580]: 2025-10-08 15:24:54.583 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:24:54 np0005476733 nova_compute[192580]: 2025-10-08 15:24:54.714 2 INFO nova.compute.manager [None req-6ba9bfbd-7d5d-4ef1-85bf-3b11a9b12f84 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output
Oct  8 11:24:54 np0005476733 nova_compute[192580]: 2025-10-08 15:24:54.720 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:24:56 np0005476733 nova_compute[192580]: 2025-10-08 15:24:56.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:24:58 np0005476733 nova_compute[192580]: 2025-10-08 15:24:58.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:24:59 np0005476733 podman[226912]: 2025-10-08 15:24:59.258634784 +0000 UTC m=+0.084102597 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  8 11:24:59 np0005476733 podman[226911]: 2025-10-08 15:24:59.286577476 +0000 UTC m=+0.113959870 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:24:59 np0005476733 nova_compute[192580]: 2025-10-08 15:24:59.907 2 INFO nova.compute.manager [None req-c0d245fe-3ec6-4d9d-b701-3355d0e17561 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output
Oct  8 11:24:59 np0005476733 nova_compute[192580]: 2025-10-08 15:24:59.913 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:24:59 np0005476733 nova_compute[192580]: 2025-10-08 15:24:59.916 2 INFO nova.virt.libvirt.driver [None req-c0d245fe-3ec6-4d9d-b701-3355d0e17561 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Truncated console log returned, 3857 bytes ignored
Oct  8 11:25:00 np0005476733 nova_compute[192580]: 2025-10-08 15:25:00.495 2 INFO nova.compute.manager [None req-a4a50054-fc34-4c06-81dd-8f03c7bf12a3 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output
Oct  8 11:25:01 np0005476733 nova_compute[192580]: 2025-10-08 15:25:01.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:03 np0005476733 nova_compute[192580]: 2025-10-08 15:25:03.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:04 np0005476733 nova_compute[192580]: 2025-10-08 15:25:04.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:04.463 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 11:25:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:04.466 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  8 11:25:05 np0005476733 podman[226974]: 2025-10-08 15:25:05.254129928 +0000 UTC m=+0.074789430 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  8 11:25:05 np0005476733 nova_compute[192580]: 2025-10-08 15:25:05.329 2 INFO nova.compute.manager [None req-4c540b8e-89db-4ab7-80f3-44b8abb5eded 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Get console output#033[00m
Oct  8 11:25:05 np0005476733 nova_compute[192580]: 2025-10-08 15:25:05.335 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:05 np0005476733 nova_compute[192580]: 2025-10-08 15:25:05.338 2 INFO nova.virt.libvirt.driver [None req-4c540b8e-89db-4ab7-80f3-44b8abb5eded 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Truncated console log returned, 4083 bytes ignored#033[00m
Oct  8 11:25:05 np0005476733 nova_compute[192580]: 2025-10-08 15:25:05.674 2 INFO nova.compute.manager [None req-b016d53d-4e8e-411c-8df3-5aaea948e043 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output#033[00m
Oct  8 11:25:05 np0005476733 nova_compute[192580]: 2025-10-08 15:25:05.679 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:06 np0005476733 nova_compute[192580]: 2025-10-08 15:25:06.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:07 np0005476733 nova_compute[192580]: 2025-10-08 15:25:07.514 2 DEBUG nova.compute.manager [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-changed-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:07 np0005476733 nova_compute[192580]: 2025-10-08 15:25:07.515 2 DEBUG nova.compute.manager [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Refreshing instance network info cache due to event network-changed-bb8d6c3b-78f5-45eb-82d7-19d928374c3e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:25:07 np0005476733 nova_compute[192580]: 2025-10-08 15:25:07.515 2 DEBUG oslo_concurrency.lockutils [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:07 np0005476733 nova_compute[192580]: 2025-10-08 15:25:07.516 2 DEBUG oslo_concurrency.lockutils [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:07 np0005476733 nova_compute[192580]: 2025-10-08 15:25:07.516 2 DEBUG nova.network.neutron [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Refreshing network info cache for port bb8d6c3b-78f5-45eb-82d7-19d928374c3e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:25:08 np0005476733 podman[227003]: 2025-10-08 15:25:08.230060134 +0000 UTC m=+0.052731765 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:25:08 np0005476733 podman[227001]: 2025-10-08 15:25:08.230172437 +0000 UTC m=+0.059510902 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.557 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.559 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.560 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.560 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.561 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.563 2 INFO nova.compute.manager [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Terminating instance#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.564 2 DEBUG nova.compute.manager [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:25:08 np0005476733 kernel: tapbb8d6c3b-78 (unregistering): left promiscuous mode
Oct  8 11:25:08 np0005476733 NetworkManager[51699]: <info>  [1759937108.5991] device (tapbb8d6c3b-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:08Z|00211|binding|INFO|Releasing lport bb8d6c3b-78f5-45eb-82d7-19d928374c3e from this chassis (sb_readonly=0)
Oct  8 11:25:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:08Z|00212|binding|INFO|Setting lport bb8d6c3b-78f5-45eb-82d7-19d928374c3e down in Southbound
Oct  8 11:25:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:08Z|00213|binding|INFO|Removing iface tapbb8d6c3b-78 ovn-installed in OVS
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.623 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:ea:c5 192.168.0.27 2001::367'], port_security=['fa:16:3e:20:ea:c5 192.168.0.27 2001::367'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'neutron:cidrs': '192.168.0.27/24 2001::367/64', 'neutron:device_id': 'e11af4e6-28c2-48fa-affb-668a5e9f6972', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-784726bd-b1f4-4298-96ff-31e8b942933e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=adadf214-3c16-4fe3-8265-a811250258e1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=bb8d6c3b-78f5-45eb-82d7-19d928374c3e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.627 103739 INFO neutron.agent.ovn.metadata.agent [-] Port bb8d6c3b-78f5-45eb-82d7-19d928374c3e in datapath 784726bd-b1f4-4298-96ff-31e8b942933e unbound from our chassis#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.632 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 784726bd-b1f4-4298-96ff-31e8b942933e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.638 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9b55ee97-de32-41cf-90ed-77e3b3ec8a16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.640 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e namespace which is not needed anymore#033[00m
Oct  8 11:25:08 np0005476733 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct  8 11:25:08 np0005476733 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000019.scope: Consumed 42.883s CPU time.
Oct  8 11:25:08 np0005476733 systemd-machined[152624]: Machine qemu-15-instance-00000019 terminated.
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [NOTICE]   (226609) : haproxy version is 2.8.14-c23fe91
Oct  8 11:25:08 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [NOTICE]   (226609) : path to executable is /usr/sbin/haproxy
Oct  8 11:25:08 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [WARNING]  (226609) : Exiting Master process...
Oct  8 11:25:08 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [ALERT]    (226609) : Current worker (226611) exited with code 143 (Terminated)
Oct  8 11:25:08 np0005476733 neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e[226605]: [WARNING]  (226609) : All workers exited. Exiting... (0)
Oct  8 11:25:08 np0005476733 systemd[1]: libpod-69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138.scope: Deactivated successfully.
Oct  8 11:25:08 np0005476733 podman[227069]: 2025-10-08 15:25:08.819787209 +0000 UTC m=+0.067811427 container died 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.850 2 INFO nova.virt.libvirt.driver [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Instance destroyed successfully.#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.850 2 DEBUG nova.objects.instance [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'resources' on Instance uuid e11af4e6-28c2-48fa-affb-668a5e9f6972 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:08 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138-userdata-shm.mount: Deactivated successfully.
Oct  8 11:25:08 np0005476733 systemd[1]: var-lib-containers-storage-overlay-d13cb32a2d89e7c31f4a94b61dcbff8f2be287fd488d18c6732851bd2ffbb632-merged.mount: Deactivated successfully.
Oct  8 11:25:08 np0005476733 podman[227069]: 2025-10-08 15:25:08.868101582 +0000 UTC m=+0.116125780 container cleanup 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.870 2 DEBUG nova.virt.libvirt.vif [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:24:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=25,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:24:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-gb7f9gi1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:24:26Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=e11af4e6-28c2-48fa-affb-668a5e9f6972,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": 
"tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.871 2 DEBUG nova.network.os_vif_util [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.872 2 DEBUG nova.network.os_vif_util [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.873 2 DEBUG os_vif [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:25:08 np0005476733 systemd[1]: libpod-conmon-69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138.scope: Deactivated successfully.
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.876 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb8d6c3b-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.888 2 INFO os_vif [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ea:c5,bridge_name='br-int',has_traffic_filtering=True,id=bb8d6c3b-78f5-45eb-82d7-19d928374c3e,network=Network(784726bd-b1f4-4298-96ff-31e8b942933e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbb8d6c3b-78')#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.889 2 INFO nova.virt.libvirt.driver [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Deleting instance files /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972_del#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.890 2 INFO nova.virt.libvirt.driver [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Deletion of /var/lib/nova/instances/e11af4e6-28c2-48fa-affb-668a5e9f6972_del complete#033[00m
Oct  8 11:25:08 np0005476733 podman[227111]: 2025-10-08 15:25:08.941407763 +0000 UTC m=+0.045656839 container remove 69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.951 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.951 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.954 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1a1de7-501c-46f7-b06f-317043a0e035]: (4, ('Wed Oct  8 03:25:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e (69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138)\n69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138\nWed Oct  8 03:25:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e (69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138)\n69200f816f34e4071fdad1579e9a825898eb2f58baa005ba05ff5c01535a0138\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.956 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[11e2e559-1dcc-4b16-b05a-a784ce7236ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.957 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap784726bd-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.958 2 INFO nova.compute.manager [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.972 2 DEBUG oslo.service.loopingcall [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:08 np0005476733 kernel: tap784726bd-b0: left promiscuous mode
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.975 2 DEBUG nova.compute.manager [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.978 2 DEBUG nova.network.neutron [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:25:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:08.980 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49b11715-bb15-4ee2-934c-257ade36a983]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.982 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:25:08 np0005476733 nova_compute[192580]: 2025-10-08 15:25:08.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:09.009 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e73b1798-b98b-468b-ba58-d49079520305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:09.011 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[51531f32-f56b-45f7-abf0-e563c34a0589]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:09.025 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd005d5-6528-47bc-9a9e-4bf37f250e35]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396926, 'reachable_time': 42934, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227126, 'error': None, 'target': 'ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:09.028 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-784726bd-b1f4-4298-96ff-31e8b942933e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:25:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:09.028 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0513a7-1ecf-4ab0-9be1-f6ecd281f863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:09 np0005476733 systemd[1]: run-netns-ovnmeta\x2d784726bd\x2db1f4\x2d4298\x2d96ff\x2d31e8b942933e.mount: Deactivated successfully.
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.070 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.071 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.083 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.083 2 INFO nova.compute.claims [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.165 2 DEBUG nova.scheduler.client.report [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.183 2 DEBUG nova.scheduler.client.report [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.184 2 DEBUG nova.compute.provider_tree [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.208 2 DEBUG nova.scheduler.client.report [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.246 2 DEBUG nova.scheduler.client.report [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.388 2 DEBUG nova.compute.provider_tree [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.436 2 DEBUG nova.scheduler.client.report [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.596 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.597 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.633 2 DEBUG nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-vif-unplugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.633 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.634 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.634 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.634 2 DEBUG nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] No waiting events found dispatching network-vif-unplugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.635 2 DEBUG nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-vif-unplugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.635 2 DEBUG nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.635 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.636 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.636 2 DEBUG oslo_concurrency.lockutils [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.636 2 DEBUG nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] No waiting events found dispatching network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.637 2 WARNING nova.compute.manager [req-e01e38c7-0612-42ce-9004-ef4e7df98a61 req-4ce0167e-c555-44ac-8a5b-1e7bf6378d71 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Received unexpected event network-vif-plugged-bb8d6c3b-78f5-45eb-82d7-19d928374c3e for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.659 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.660 2 DEBUG nova.network.neutron [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.711 2 INFO nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.722 2 DEBUG nova.network.neutron [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updated VIF entry in instance network info cache for port bb8d6c3b-78f5-45eb-82d7-19d928374c3e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.723 2 DEBUG nova.network.neutron [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updating instance_info_cache with network_info: [{"id": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "address": "fa:16:3e:20:ea:c5", "network": {"id": "784726bd-b1f4-4298-96ff-31e8b942933e", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_dhcp6", "subnets": [{"cidr": "2001::/64", "dns": [], "gateway": {"address": "2001::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001::367", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}, {"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8d6c3b-78", "ovs_interfaceid": "bb8d6c3b-78f5-45eb-82d7-19d928374c3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.782 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:25:09 np0005476733 nova_compute[192580]: 2025-10-08 15:25:09.913 2 DEBUG oslo_concurrency.lockutils [req-9a45e8a9-eb3d-49d2-b43a-80ae479ab618 req-658b17db-e914-40b8-94dd-b62c67192f75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e11af4e6-28c2-48fa-affb-668a5e9f6972" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.150 2 DEBUG nova.policy [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.209 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.210 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.211 2 INFO nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Creating image(s)#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.211 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.212 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.212 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.229 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.297 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.298 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.298 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.313 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.368 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.369 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.413 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk 10737418240" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.414 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.414 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.468 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.470 2 DEBUG nova.objects.instance [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'migration_context' on Instance uuid 1cbc4434-d89a-483d-a1f2-299190262888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.659 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.659 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Ensure instance console log exists: /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.659 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.660 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.660 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.891 2 INFO nova.compute.manager [None req-719bbdcf-9b02-46a2-a43b-56e358d23bc4 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output#033[00m
Oct  8 11:25:10 np0005476733 nova_compute[192580]: 2025-10-08 15:25:10.896 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.643 2 DEBUG nova.network.neutron [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.668 2 DEBUG nova.network.neutron [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Successfully updated port: 020c7187-878e-4336-a49d-ac40eb956ef6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.685 2 INFO nova.compute.manager [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Took 2.71 seconds to deallocate network for instance.#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.707 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.708 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquired lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.708 2 DEBUG nova.network.neutron [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.791 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.792 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.837 2 DEBUG nova.compute.manager [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-changed-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.838 2 DEBUG nova.compute.manager [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Refreshing instance network info cache due to event network-changed-020c7187-878e-4336-a49d-ac40eb956ef6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.838 2 DEBUG oslo_concurrency.lockutils [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.982 2 DEBUG nova.compute.provider_tree [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:25:11 np0005476733 nova_compute[192580]: 2025-10-08 15:25:11.989 2 DEBUG nova.network.neutron [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.003 2 DEBUG nova.scheduler.client.report [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.033 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.076 2 INFO nova.scheduler.client.report [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Deleted allocations for instance e11af4e6-28c2-48fa-affb-668a5e9f6972#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.183 2 DEBUG oslo_concurrency.lockutils [None req-afd4b69d-4a80-4880-8e10-cb288940bdeb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "e11af4e6-28c2-48fa-affb-668a5e9f6972" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:12.469 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.766 2 DEBUG nova.network.neutron [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updating instance_info_cache with network_info: [{"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.798 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Releasing lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.799 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Instance network_info: |[{"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.799 2 DEBUG oslo_concurrency.lockutils [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.800 2 DEBUG nova.network.neutron [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Refreshing network info cache for port 020c7187-878e-4336-a49d-ac40eb956ef6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.804 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Start _get_guest_xml network_info=[{"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.809 2 WARNING nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.816 2 DEBUG nova.virt.libvirt.host [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.817 2 DEBUG nova.virt.libvirt.host [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.823 2 DEBUG nova.virt.libvirt.host [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.824 2 DEBUG nova.virt.libvirt.host [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.824 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.824 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.825 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.826 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.826 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.826 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.826 2 DEBUG nova.virt.hardware [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.829 2 DEBUG nova.virt.libvirt.vif [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:25:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-sender-124-598361755',display_name='tempest-broadcast-sender-124-598361755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-sender-124-598361755',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-yda0bge1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:09Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1cbc4434-d89a-483d-a1f2-299190262888,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.829 2 DEBUG nova.network.os_vif_util [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.829 2 DEBUG nova.network.os_vif_util [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.830 2 DEBUG nova.objects.instance [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'pci_devices' on Instance uuid 1cbc4434-d89a-483d-a1f2-299190262888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.849 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <uuid>1cbc4434-d89a-483d-a1f2-299190262888</uuid>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <name>instance-0000001d</name>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:name>tempest-broadcast-sender-124-598361755</nova:name>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:25:12</nova:creationTime>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:user uuid="843ea0278e174175a6f8e21731c1383e">tempest-BroadcastTestVlanTransparency-538458942-project-member</nova:user>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:project uuid="7e1086961263487db8a3c5190fdf1b2e">tempest-BroadcastTestVlanTransparency-538458942</nova:project>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        <nova:port uuid="020c7187-878e-4336-a49d-ac40eb956ef6">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="serial">1cbc4434-d89a-483d-a1f2-299190262888</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="uuid">1cbc4434-d89a-483d-a1f2-299190262888</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.config"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:8b:42:9f"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <target dev="tap020c7187-87"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/console.log" append="off"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:25:12 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:25:12 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:25:12 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:25:12 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.849 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Preparing to wait for external event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.850 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.850 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.850 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.851 2 DEBUG nova.virt.libvirt.vif [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:25:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-sender-124-598361755',display_name='tempest-broadcast-sender-124-598361755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-sender-124-598361755',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-yda0bge1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:09Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1cbc4434-d89a-483d-a1f2-299190262888,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.852 2 DEBUG nova.network.os_vif_util [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.853 2 DEBUG nova.network.os_vif_util [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.853 2 DEBUG os_vif [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap020c7187-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap020c7187-87, col_values=(('external_ids', {'iface-id': '020c7187-878e-4336-a49d-ac40eb956ef6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:42:9f', 'vm-uuid': '1cbc4434-d89a-483d-a1f2-299190262888'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:12 np0005476733 NetworkManager[51699]: <info>  [1759937112.9138] manager: (tap020c7187-87): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:12 np0005476733 nova_compute[192580]: 2025-10-08 15:25:12.919 2 INFO os_vif [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87')#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.071 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.072 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.072 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No VIF found with MAC fa:16:3e:8b:42:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.072 2 INFO nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Using config drive#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:13 np0005476733 nova_compute[192580]: 2025-10-08 15:25:13.996 2 INFO nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Creating config drive at /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.config#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.003 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa29bj0va execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.136 2 DEBUG oslo_concurrency.processutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa29bj0va" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:14 np0005476733 kernel: tap020c7187-87: entered promiscuous mode
Oct  8 11:25:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:14Z|00214|binding|INFO|Claiming lport 020c7187-878e-4336-a49d-ac40eb956ef6 for this chassis.
Oct  8 11:25:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:14Z|00215|binding|INFO|020c7187-878e-4336-a49d-ac40eb956ef6: Claiming fa:16:3e:8b:42:9f 10.100.0.9
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:14 np0005476733 NetworkManager[51699]: <info>  [1759937114.2282] manager: (tap020c7187-87): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Oct  8 11:25:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:14Z|00216|binding|INFO|Setting lport 020c7187-878e-4336-a49d-ac40eb956ef6 ovn-installed in OVS
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.249 2 DEBUG nova.network.neutron [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updated VIF entry in instance network info cache for port 020c7187-878e-4336-a49d-ac40eb956ef6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.251 2 DEBUG nova.network.neutron [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updating instance_info_cache with network_info: [{"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:14Z|00217|binding|INFO|Setting lport 020c7187-878e-4336-a49d-ac40eb956ef6 up in Southbound
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.259 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:42:9f 10.100.0.9'], port_security=['fa:16:3e:8b:42:9f 10.100.0.9 192.168.111.13/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=020c7187-878e-4336-a49d-ac40eb956ef6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:25:14 np0005476733 systemd-udevd[227159]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.261 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 020c7187-878e-4336-a49d-ac40eb956ef6 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a bound to our chassis#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.264 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a77f8cd-4394-4cb0-a8a1-33872549758a#033[00m
Oct  8 11:25:14 np0005476733 NetworkManager[51699]: <info>  [1759937114.2862] device (tap020c7187-87): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:25:14 np0005476733 NetworkManager[51699]: <info>  [1759937114.2871] device (tap020c7187-87): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:25:14 np0005476733 systemd-machined[152624]: New machine qemu-17-instance-0000001d.
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.291 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d9247c50-45ca-4758-bb5a-7d35979660c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 systemd[1]: Started Virtual Machine qemu-17-instance-0000001d.
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.327 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1143226b-d401-4037-8a79-8cc4c6fad6c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.333 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0b8564-57ed-4c37-a3e9-0875cd5b61dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.368 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb064e4-2911-4c76-902d-f70f416bfdfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.396 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f45dfb64-6f2b-43b5-a2f0-6586e15b21f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 1042, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 1042, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 23320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227173, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.418 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8b06e1-e8d1-439d-ba51-11866271e93b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389931, 'tstamp': 389931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227175, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389935, 'tstamp': 389935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227175, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.421 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.427 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a77f8cd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.427 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.427 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a77f8cd-40, col_values=(('external_ids', {'iface-id': 'b563ca05-c871-4f0e-9980-177237a3f88d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:14.428 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:25:14 np0005476733 nova_compute[192580]: 2025-10-08 15:25:14.736 2 DEBUG oslo_concurrency.lockutils [req-c0578314-5720-4adc-8bca-599a48c74b96 req-5a89cbf3-b453-4460-a8b7-5f0217b6ff4a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.464 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937115.4620016, 1cbc4434-d89a-483d-a1f2-299190262888 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.465 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] VM Started (Lifecycle Event)#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.505 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.510 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937115.4626517, 1cbc4434-d89a-483d-a1f2-299190262888 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.511 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.582 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.587 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.628 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.881 2 DEBUG nova.compute.manager [req-a0533a8a-ea8b-4297-acca-52b8a1aa520f req-4709f423-2ab3-4eb6-a0f5-177841b92f1f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.883 2 DEBUG oslo_concurrency.lockutils [req-a0533a8a-ea8b-4297-acca-52b8a1aa520f req-4709f423-2ab3-4eb6-a0f5-177841b92f1f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.884 2 DEBUG oslo_concurrency.lockutils [req-a0533a8a-ea8b-4297-acca-52b8a1aa520f req-4709f423-2ab3-4eb6-a0f5-177841b92f1f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.884 2 DEBUG oslo_concurrency.lockutils [req-a0533a8a-ea8b-4297-acca-52b8a1aa520f req-4709f423-2ab3-4eb6-a0f5-177841b92f1f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.884 2 DEBUG nova.compute.manager [req-a0533a8a-ea8b-4297-acca-52b8a1aa520f req-4709f423-2ab3-4eb6-a0f5-177841b92f1f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Processing event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.885 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.901 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.902 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937115.900588, 1cbc4434-d89a-483d-a1f2-299190262888 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.902 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.909 2 INFO nova.virt.libvirt.driver [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Instance spawned successfully.#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.910 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.951 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.955 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.956 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.957 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.957 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.958 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.958 2 DEBUG nova.virt.libvirt.driver [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:15 np0005476733 nova_compute[192580]: 2025-10-08 15:25:15.966 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.017 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.071 2 INFO nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Took 5.86 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.072 2 DEBUG nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.179 2 INFO nova.compute.manager [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Took 7.15 seconds to build instance.#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.257 2 DEBUG oslo_concurrency.lockutils [None req-e06e4ae9-5a43-4be3-b56f-d198924d8f81 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.298 2 INFO nova.compute.manager [None req-4a96fb19-bb01-4737-b265-432f22d7acee c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output#033[00m
Oct  8 11:25:16 np0005476733 nova_compute[192580]: 2025-10-08 15:25:16.306 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:16Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:d0:1a 10.100.0.6
Oct  8 11:25:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:16Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:d0:1a 10.100.0.6
Oct  8 11:25:17 np0005476733 nova_compute[192580]: 2025-10-08 15:25:17.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:17 np0005476733 nova_compute[192580]: 2025-10-08 15:25:17.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.033 2 DEBUG nova.compute.manager [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.034 2 DEBUG oslo_concurrency.lockutils [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.035 2 DEBUG oslo_concurrency.lockutils [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.036 2 DEBUG oslo_concurrency.lockutils [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.037 2 DEBUG nova.compute.manager [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] No waiting events found dispatching network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.038 2 WARNING nova.compute.manager [req-f4ed30f5-b085-4e43-9995-b95f291d7a17 req-dc1c8ced-b731-4770-82c7-7b1af59d0aca 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received unexpected event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:25:18 np0005476733 podman[227184]: 2025-10-08 15:25:18.234058874 +0000 UTC m=+0.058417627 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:25:18 np0005476733 podman[227183]: 2025-10-08 15:25:18.242361069 +0000 UTC m=+0.066602108 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:25:18 np0005476733 nova_compute[192580]: 2025-10-08 15:25:18.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:19 np0005476733 nova_compute[192580]: 2025-10-08 15:25:19.171 2 INFO nova.compute.manager [None req-44926e8a-ad17-4f6d-9a91-af01d56d83f4 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:19 np0005476733 nova_compute[192580]: 2025-10-08 15:25:19.177 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:21Z|00218|pinctrl|WARN|Dropped 4785 log messages in last 57 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 11:25:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:21Z|00219|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:25:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:21.015 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:46:6b 192.168.7.2 2001:7::f816:3eff:fea4:466b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.2/24 2001:7::f816:3eff:fea4:466b/64', 'neutron:device_id': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3a3e561-49a1-4407-8153-f33a25aca48b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=67a74961-69a4-4a58-9c66-716dd0490fac) old=Port_Binding(mac=['fa:16:3e:a4:46:6b 192.168.7.2'], external_ids={'neutron:cidrs': '192.168.7.2/24', 'neutron:device_id': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:25:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:21.017 103739 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 67a74961-69a4-4a58-9c66-716dd0490fac in datapath ba711270-eff4-4485-a453-6e6d5887d038 updated#033[00m
Oct  8 11:25:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:21.023 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba711270-eff4-4485-a453-6e6d5887d038, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:25:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:21.024 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49cc004d-29bc-47e6-95d9-df813f519350]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:21 np0005476733 nova_compute[192580]: 2025-10-08 15:25:21.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:21 np0005476733 nova_compute[192580]: 2025-10-08 15:25:21.778 2 INFO nova.compute.manager [None req-baf9513a-8156-44df-820e-fd7d5c91d70c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output#033[00m
Oct  8 11:25:21 np0005476733 nova_compute[192580]: 2025-10-08 15:25:21.783 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:21 np0005476733 nova_compute[192580]: 2025-10-08 15:25:21.786 2 INFO nova.virt.libvirt.driver [None req-baf9513a-8156-44df-820e-fd7d5c91d70c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Truncated console log returned, 138 bytes ignored#033[00m
Oct  8 11:25:22 np0005476733 nova_compute[192580]: 2025-10-08 15:25:22.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:23 np0005476733 nova_compute[192580]: 2025-10-08 15:25:23.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:23 np0005476733 nova_compute[192580]: 2025-10-08 15:25:23.848 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937108.8467422, e11af4e6-28c2-48fa-affb-668a5e9f6972 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:23 np0005476733 nova_compute[192580]: 2025-10-08 15:25:23.849 2 INFO nova.compute.manager [-] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:25:23 np0005476733 nova_compute[192580]: 2025-10-08 15:25:23.884 2 DEBUG nova.compute.manager [None req-5108ea96-dc3f-4cad-9074-df70765cb20b - - - - - -] [instance: e11af4e6-28c2-48fa-affb-668a5e9f6972] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:24 np0005476733 nova_compute[192580]: 2025-10-08 15:25:24.661 2 INFO nova.compute.manager [None req-b3e1315e-8d5b-4229-8b95-cad0e5750d38 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:25 np0005476733 podman[227230]: 2025-10-08 15:25:25.22805825 +0000 UTC m=+0.059484291 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:26.307 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:26.307 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:26.308 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:26 np0005476733 nova_compute[192580]: 2025-10-08 15:25:26.984 2 INFO nova.compute.manager [None req-e8b4f49d-650a-4a22-87a4-8d6c318af52e c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Get console output#033[00m
Oct  8 11:25:26 np0005476733 nova_compute[192580]: 2025-10-08 15:25:26.993 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:26 np0005476733 nova_compute[192580]: 2025-10-08 15:25:26.996 2 INFO nova.virt.libvirt.driver [None req-e8b4f49d-650a-4a22-87a4-8d6c318af52e c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Truncated console log returned, 3441 bytes ignored#033[00m
Oct  8 11:25:27 np0005476733 nova_compute[192580]: 2025-10-08 15:25:27.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:28 np0005476733 nova_compute[192580]: 2025-10-08 15:25:28.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:29 np0005476733 nova_compute[192580]: 2025-10-08 15:25:29.867 2 INFO nova.compute.manager [None req-5e048dbd-d566-474d-ac8c-e21311b24403 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:30 np0005476733 podman[227270]: 2025-10-08 15:25:30.229274709 +0000 UTC m=+0.056347291 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:25:30 np0005476733 podman[227269]: 2025-10-08 15:25:30.257701567 +0000 UTC m=+0.084146098 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:25:31 np0005476733 nova_compute[192580]: 2025-10-08 15:25:31.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:32 np0005476733 nova_compute[192580]: 2025-10-08 15:25:32.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:33 np0005476733 nova_compute[192580]: 2025-10-08 15:25:33.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.169 2 DEBUG nova.compute.manager [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-changed-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.170 2 DEBUG nova.compute.manager [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Refreshing instance network info cache due to event network-changed-36047ed0-015a-4d5e-8c0a-fc4d965a13b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.170 2 DEBUG oslo_concurrency.lockutils [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.170 2 DEBUG oslo_concurrency.lockutils [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.171 2 DEBUG nova.network.neutron [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Refreshing network info cache for port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.638 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.639 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.878 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.949 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:34 np0005476733 nova_compute[192580]: 2025-10-08 15:25:34.957 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.025 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.032 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.105 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.106 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.158 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.164 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.227 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.229 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.293 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.300 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.363 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.365 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.438 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.447 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.519 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.520 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.579 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.861 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.865 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=10227MB free_disk=110.82934188842773GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.865 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.866 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.905 2 INFO nova.compute.manager [None req-279d7a1a-5a21-4bf0-85bf-933521964f01 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:35 np0005476733 nova_compute[192580]: 2025-10-08 15:25:35.912 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.088 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 656c0a96-03f3-4a70-baac-01de2a126a91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.088 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.089 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e36dd986-15d5-466e-93d6-dc7b4483c8e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.089 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 6efc9ea0-184c-46cc-aeb5-e2759e10e398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.089 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1cbc4434-d89a-483d-a1f2-299190262888 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=5632MB phys_disk=119GB used_disk=50GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.194 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:25:36 np0005476733 podman[227357]: 2025-10-08 15:25:36.233648548 +0000 UTC m=+0.060960718 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.273 2 DEBUG nova.network.neutron [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updated VIF entry in instance network info cache for port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.274 2 DEBUG nova.network.neutron [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updating instance_info_cache with network_info: [{"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.303 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.503 2 DEBUG oslo_concurrency.lockutils [req-62b34931-8983-45b7-accb-b86484d27255 req-bd385c16-6c73-4216-aaef-2e98356b3eea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-6efc9ea0-184c-46cc-aeb5-e2759e10e398" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.550 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:25:36 np0005476733 nova_compute[192580]: 2025-10-08 15:25:36.551 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:37 np0005476733 nova_compute[192580]: 2025-10-08 15:25:37.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:38 np0005476733 nova_compute[192580]: 2025-10-08 15:25:38.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:39 np0005476733 podman[227395]: 2025-10-08 15:25:39.245996966 +0000 UTC m=+0.070888064 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:25:39 np0005476733 podman[227394]: 2025-10-08 15:25:39.2476594 +0000 UTC m=+0.072859278 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Oct  8 11:25:40 np0005476733 nova_compute[192580]: 2025-10-08 15:25:40.552 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:40 np0005476733 nova_compute[192580]: 2025-10-08 15:25:40.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:40 np0005476733 nova_compute[192580]: 2025-10-08 15:25:40.851 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:40 np0005476733 nova_compute[192580]: 2025-10-08 15:25:40.854 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:40 np0005476733 nova_compute[192580]: 2025-10-08 15:25:40.937 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.114 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.115 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.123 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.123 2 INFO nova.compute.claims [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.197 2 INFO nova.compute.manager [None req-d53308c6-2868-4558-88b0-0aa8361ef592 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.201 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.336 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.336 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.361 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.439 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.446 2 DEBUG nova.compute.provider_tree [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.462 2 DEBUG nova.scheduler.client.report [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.486 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.487 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.490 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.497 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.498 2 INFO nova.compute.claims [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.569 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.570 2 DEBUG nova.network.neutron [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.605 2 INFO nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.631 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.736 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.739 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.740 2 INFO nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Creating image(s)#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.741 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.741 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.742 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.754 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.776 2 DEBUG nova.compute.provider_tree [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.792 2 DEBUG nova.scheduler.client.report [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.810 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.811 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.812 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.825 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.844 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.845 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.881 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.883 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.908 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.909 2 DEBUG nova.network.neutron [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.929 2 INFO nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.945 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk 10737418240" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.947 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.947 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:41 np0005476733 nova_compute[192580]: 2025-10-08 15:25:41.970 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.009 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.014 2 DEBUG nova.objects.instance [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'migration_context' on Instance uuid cefc7b22-5a31-4d0c-bb25-462153dfc427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.029 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.030 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Ensure instance console log exists: /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.031 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.031 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.032 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.085 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.087 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.088 2 INFO nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Creating image(s)#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.089 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.090 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.092 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.121 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.192 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.193 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.194 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.206 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.264 2 DEBUG nova.policy [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.273 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.274 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.313 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk 10737418240" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.315 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.315 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.347 2 DEBUG nova.policy [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cff4e262a7054de9b32c7b3c504c757f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c1390632da384309b358a3f3728ab5d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.369 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.370 2 DEBUG nova.objects.instance [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 4f1d2adc-1ecb-45dc-83a0-c2369028e487 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.393 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.394 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Ensure instance console log exists: /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.395 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.396 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.397 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:42 np0005476733 nova_compute[192580]: 2025-10-08 15:25:42.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:42Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:42:9f 10.100.0.9
Oct  8 11:25:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:42Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:42:9f 10.100.0.9
Oct  8 11:25:43 np0005476733 nova_compute[192580]: 2025-10-08 15:25:43.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:43 np0005476733 nova_compute[192580]: 2025-10-08 15:25:43.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.318 2 DEBUG nova.network.neutron [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Successfully updated port: cae08d04-f9a8-46ee-ba57-0a0db94ae186 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.365 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.366 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquired lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.366 2 DEBUG nova.network.neutron [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.409 2 DEBUG nova.network.neutron [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Successfully updated port: 9dbcb8e0-b6cb-47f5-b89d-290794905306 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.531 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.532 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquired lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.532 2 DEBUG nova.network.neutron [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:44 np0005476733 nova_compute[192580]: 2025-10-08 15:25:44.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:25:45 np0005476733 nova_compute[192580]: 2025-10-08 15:25:45.000 2 DEBUG nova.network.neutron [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:25:45 np0005476733 nova_compute[192580]: 2025-10-08 15:25:45.031 2 DEBUG nova.network.neutron [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.442 2 DEBUG nova.network.neutron [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updating instance_info_cache with network_info: [{"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.652 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Releasing lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.653 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Instance network_info: |[{"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.656 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Start _get_guest_xml network_info=[{"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 
'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.663 2 WARNING nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.669 2 DEBUG nova.virt.libvirt.host [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.670 2 DEBUG nova.virt.libvirt.host [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.674 2 DEBUG nova.virt.libvirt.host [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.675 2 DEBUG nova.virt.libvirt.host [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.675 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.676 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.676 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.677 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.677 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.677 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.678 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.678 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.678 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.679 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.679 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.679 2 DEBUG nova.virt.hardware [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.683 2 DEBUG nova.virt.libvirt.vif [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:25:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-1890118187-test-extra-dhcp-opts',id=31,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEd+u8f5VceRZeyfE0Yrf01yg7yEMN2pLOr84m9oNqDsynJlsd7nDIcghdcO/L5YEUqPbydKTzc34ECm0UQvZ9Ra2/TmyKbJN8Sbt6K51Zbo5VjtVEJw/1DdJeHYynadkg==',key_name='tempest-OvnExtraDhcpOptionsTest-1890118187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c1390632da384309b358a3f3728ab5d8',ramdisk_id='',reservation_id='r-dw007kmk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1189559672',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1189559672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:42Z,user_data=None,user_id='cff4e262a7054de9b32c7b3c504c757f',uuid=4f1d2adc-1ecb-45dc-83a0-c2369028e487,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], 
"gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.684 2 DEBUG nova.network.os_vif_util [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converting VIF {"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.685 2 DEBUG nova.network.os_vif_util [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.686 2 DEBUG nova.objects.instance [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4f1d2adc-1ecb-45dc-83a0-c2369028e487 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.835 2 DEBUG nova.compute.manager [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-changed-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.839 2 DEBUG nova.compute.manager [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Refreshing instance network info cache due to event network-changed-9dbcb8e0-b6cb-47f5-b89d-290794905306. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.839 2 DEBUG oslo_concurrency.lockutils [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.840 2 DEBUG oslo_concurrency.lockutils [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.840 2 DEBUG nova.network.neutron [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Refreshing network info cache for port 9dbcb8e0-b6cb-47f5-b89d-290794905306 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.844 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <uuid>4f1d2adc-1ecb-45dc-83a0-c2369028e487</uuid>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <name>instance-0000001f</name>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:name>tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless</nova:name>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:25:46</nova:creationTime>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:user uuid="cff4e262a7054de9b32c7b3c504c757f">tempest-OvnExtraDhcpOptionsTest-1189559672-project-member</nova:user>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:project uuid="c1390632da384309b358a3f3728ab5d8">tempest-OvnExtraDhcpOptionsTest-1189559672</nova:project>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        <nova:port uuid="9dbcb8e0-b6cb-47f5-b89d-290794905306">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.7.196" ipVersion="4"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001:7::f816:3eff:fe2a:db6c" ipVersion="6"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="serial">4f1d2adc-1ecb-45dc-83a0-c2369028e487</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="uuid">4f1d2adc-1ecb-45dc-83a0-c2369028e487</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.config"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:2a:db:6c"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <target dev="tap9dbcb8e0-b6"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/console.log" append="off"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:25:46 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:25:46 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:25:46 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:25:46 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.849 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Preparing to wait for external event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.850 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.850 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.850 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.851 2 DEBUG nova.virt.libvirt.vif [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:25:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-1890118187-test-extra-dhcp-opts',id=31,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEd+u8f5VceRZeyfE0Yrf01yg7yEMN2pLOr84m9oNqDsynJlsd7nDIcghdcO/L5YEUqPbydKTzc34ECm0UQvZ9Ra2/TmyKbJN8Sbt6K51Zbo5VjtVEJw/1DdJeHYynadkg==',key_name='tempest-OvnExtraDhcpOptionsTest-1890118187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c1390632da384309b358a3f3728ab5d8',ramdisk_id='',reservation_id='r-dw007kmk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1189559672',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1189559672-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:42Z,user_data=None,user_id='cff4e262a7054de9b32c7b3c504c757f',uuid=4f1d2adc-1ecb-45dc-83a0-c2369028e487,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", 
"dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.852 2 DEBUG nova.network.os_vif_util [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converting VIF {"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.853 2 DEBUG nova.network.os_vif_util [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.853 2 DEBUG os_vif [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dbcb8e0-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9dbcb8e0-b6, col_values=(('external_ids', {'iface-id': '9dbcb8e0-b6cb-47f5-b89d-290794905306', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:db:6c', 'vm-uuid': '4f1d2adc-1ecb-45dc-83a0-c2369028e487'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:46 np0005476733 NetworkManager[51699]: <info>  [1759937146.8622] manager: (tap9dbcb8e0-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.870 2 INFO os_vif [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6')
Oct  8 11:25:46 np0005476733 nova_compute[192580]: 2025-10-08 15:25:46.984 2 DEBUG nova.network.neutron [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updating instance_info_cache with network_info: [{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.053 2 DEBUG nova.compute.manager [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-changed-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.053 2 DEBUG nova.compute.manager [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Refreshing instance network info cache due to event network-changed-cae08d04-f9a8-46ee-ba57-0a0db94ae186. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.053 2 DEBUG oslo_concurrency.lockutils [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.088 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Releasing lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.089 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance network_info: |[{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.090 2 DEBUG oslo_concurrency.lockutils [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.091 2 DEBUG nova.network.neutron [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Refreshing network info cache for port cae08d04-f9a8-46ee-ba57-0a0db94ae186 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.094 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Start _get_guest_xml network_info=[{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.102 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.102 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.102 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] No VIF found with MAC fa:16:3e:2a:db:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.103 2 INFO nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Using config drive
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.107 2 WARNING nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.118 2 DEBUG nova.virt.libvirt.host [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.119 2 DEBUG nova.virt.libvirt.host [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.125 2 DEBUG nova.virt.libvirt.host [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.126 2 DEBUG nova.virt.libvirt.host [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.126 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.126 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.127 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.127 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.128 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.128 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.128 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.128 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.128 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.129 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.129 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.129 2 DEBUG nova.virt.hardware [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.133 2 DEBUG nova.virt.libvirt.vif [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:25:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=30,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-qkxis15o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:41Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=cefc7b22-5a31-4d0c-bb25-462153dfc427,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": 
{"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.134 2 DEBUG nova.network.os_vif_util [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.135 2 DEBUG nova.network.os_vif_util [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.136 2 DEBUG nova.objects.instance [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'pci_devices' on Instance uuid cefc7b22-5a31-4d0c-bb25-462153dfc427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.185 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <uuid>cefc7b22-5a31-4d0c-bb25-462153dfc427</uuid>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <name>instance-0000001e</name>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:name>tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4</nova:name>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:25:47</nova:creationTime>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:user uuid="048380879c82439f920961e33c8fc34c">tempest-ExtraDhcpOptionsTest-522093769-project-member</nova:user>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:project uuid="93e68db931464f0282500c84d398d8af">tempest-ExtraDhcpOptionsTest-522093769</nova:project>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        <nova:port uuid="cae08d04-f9a8-46ee-ba57-0a0db94ae186">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.2.168" ipVersion="4"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="serial">cefc7b22-5a31-4d0c-bb25-462153dfc427</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="uuid">cefc7b22-5a31-4d0c-bb25-462153dfc427</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.config"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:16:82:23"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <target dev="tapcae08d04-f9"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/console.log" append="off"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:25:47 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:25:47 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:25:47 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:25:47 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.186 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Preparing to wait for external event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.186 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.186 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.187 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.187 2 DEBUG nova.virt.libvirt.vif [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:25:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=30,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-qkxis15o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:25:41Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=cefc7b22-5a31-4d0c-bb25-462153dfc427,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.188 2 DEBUG nova.network.os_vif_util [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.188 2 DEBUG nova.network.os_vif_util [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.189 2 DEBUG os_vif [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcae08d04-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcae08d04-f9, col_values=(('external_ids', {'iface-id': 'cae08d04-f9a8-46ee-ba57-0a0db94ae186', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:82:23', 'vm-uuid': 'cefc7b22-5a31-4d0c-bb25-462153dfc427'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:47 np0005476733 NetworkManager[51699]: <info>  [1759937147.2056] manager: (tapcae08d04-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.214 2 INFO os_vif [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9')#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.284 2 INFO nova.compute.manager [None req-5e165779-f0c1-455a-a059-2e09571ead94 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.290 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.453 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.454 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.454 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No VIF found with MAC fa:16:3e:16:82:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.455 2 INFO nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Using config drive#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.889 2 INFO nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Creating config drive at /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.config#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.898 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexxafo1w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.960 2 INFO nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Creating config drive at /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.config#033[00m
Oct  8 11:25:47 np0005476733 nova_compute[192580]: 2025-10-08 15:25:47.968 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdy2eni6z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.026 2 DEBUG oslo_concurrency.processutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpexxafo1w" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.0889] manager: (tap9dbcb8e0-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Oct  8 11:25:48 np0005476733 kernel: tap9dbcb8e0-b6: entered promiscuous mode
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00220|binding|INFO|Claiming lport 9dbcb8e0-b6cb-47f5-b89d-290794905306 for this chassis.
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00221|binding|INFO|9dbcb8e0-b6cb-47f5-b89d-290794905306: Claiming fa:16:3e:2a:db:6c 192.168.7.196 2001:7::f816:3eff:fe2a:db6c
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.097 2 DEBUG oslo_concurrency.processutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdy2eni6z" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00222|binding|INFO|Setting lport 9dbcb8e0-b6cb-47f5-b89d-290794905306 ovn-installed in OVS
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00223|binding|INFO|Setting lport 9dbcb8e0-b6cb-47f5-b89d-290794905306 up in Southbound
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.127 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:db:6c 192.168.7.196 2001:7::f816:3eff:fe2a:db6c'], port_security=['fa:16:3e:2a:db:6c 192.168.7.196 2001:7::f816:3eff:fe2a:db6c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:cidrs': '192.168.7.196/24 2001:7::f816:3eff:fe2a:db6c/64', 'neutron:device_id': '4f1d2adc-1ecb-45dc-83a0-c2369028e487', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '42921251-1598-4dc8-8cc9-ac707d0ab44b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3a3e561-49a1-4407-8153-f33a25aca48b, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=9dbcb8e0-b6cb-47f5-b89d-290794905306) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.128 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 9dbcb8e0-b6cb-47f5-b89d-290794905306 in datapath ba711270-eff4-4485-a453-6e6d5887d038 bound to our chassis#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.133 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba711270-eff4-4485-a453-6e6d5887d038#033[00m
Oct  8 11:25:48 np0005476733 systemd-udevd[227519]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1af3b137-5659-48dd-aeb0-7393a2c49049]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.147 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba711270-e1 in ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:25:48 np0005476733 systemd-machined[152624]: New machine qemu-18-instance-0000001f.
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.1499] device (tap9dbcb8e0-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.1509] device (tap9dbcb8e0-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.155 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba711270-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.156 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0716d0-54f4-4188-9eba-51159046cf97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.157 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2f71570b-1eb3-46ea-b966-7a66fbc32455]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 systemd[1]: Started Virtual Machine qemu-18-instance-0000001f.
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.172 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[30452b00-8f23-4fb5-9d2e-3fa4ebf4070e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.199 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6669262f-5915-4a97-acaf-b96f88bd1063]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 kernel: tapcae08d04-f9: entered promiscuous mode
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.2101] manager: (tapcae08d04-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00224|binding|INFO|Claiming lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 for this chassis.
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00225|binding|INFO|cae08d04-f9a8-46ee-ba57-0a0db94ae186: Claiming fa:16:3e:16:82:23 192.168.2.168
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.2250] device (tapcae08d04-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.2257] device (tapcae08d04-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00226|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 ovn-installed in OVS
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.245 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:82:23 192.168.2.168'], port_security=['fa:16:3e:16:82:23 192.168.2.168'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:cidrs': '192.168.2.168/24', 'neutron:device_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c022ba9-08a2-40a7-896d-13c1538d7064', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ea236cb-dec7-48d3-a1ef-7ce9f1bd90ad, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cae08d04-f9a8-46ee-ba57-0a0db94ae186) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00227|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 up in Southbound
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.254 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[85e89396-1dc0-48e2-9fc1-6cc140ae2081]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.2603] manager: (tapba711270-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.258 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[78949b1a-34ad-4e09-9250-f031db504618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 systemd-machined[152624]: New machine qemu-19-instance-0000001e.
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.307 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e6af2e44-dd56-4bc3-b469-3c907ff23862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.310 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[dce37520-611a-4d62-8520-e52d88d06456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 systemd[1]: Started Virtual Machine qemu-19-instance-0000001e.
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.3355] device (tapba711270-e0): carrier: link connected
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.342 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6fe6cc-1813-4aa9-8af1-9f156c9691ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 podman[227545]: 2025-10-08 15:25:48.355591021 +0000 UTC m=+0.074963596 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.362 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91b379d6-0862-476f-b864-8fbf4e6df0c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba711270-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:46:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405200, 'reachable_time': 32590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227602, 'error': None, 'target': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.381 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0a60c0d4-a6f2-4b0a-b259-14e18f918e6e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:466b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 405200, 'tstamp': 405200}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227615, 'error': None, 'target': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 podman[227549]: 2025-10-08 15:25:48.39786504 +0000 UTC m=+0.114286450 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.400 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c93655d8-bda7-42d8-ad85-8d3a9523e77e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba711270-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:46:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405200, 'reachable_time': 32590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227616, 'error': None, 'target': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.435 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a19c109-bb34-45df-a7d5-4ecb27b3e9fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.501 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6c650f-e424-4bd2-9646-0c09cbf3d4ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.503 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba711270-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.503 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.503 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba711270-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:48 np0005476733 NetworkManager[51699]: <info>  [1759937148.5063] manager: (tapba711270-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 kernel: tapba711270-e0: entered promiscuous mode
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.511 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba711270-e0, col_values=(('external_ids', {'iface-id': '67a74961-69a4-4a58-9c66-716dd0490fac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:25:48Z|00228|binding|INFO|Releasing lport 67a74961-69a4-4a58-9c66-716dd0490fac from this chassis (sb_readonly=0)
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.517 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba711270-eff4-4485-a453-6e6d5887d038.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba711270-eff4-4485-a453-6e6d5887d038.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.526 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5dfdb2-ba29-4941-92aa-db39d7017d10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.527 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-ba711270-eff4-4485-a453-6e6d5887d038
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/ba711270-eff4-4485-a453-6e6d5887d038.pid.haproxy
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID ba711270-eff4-4485-a453-6e6d5887d038
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:48.529 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'env', 'PROCESS_TAG=haproxy-ba711270-eff4-4485-a453-6e6d5887d038', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba711270-eff4-4485-a453-6e6d5887d038.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.733 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.734 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:48 np0005476733 podman[227663]: 2025-10-08 15:25:48.904263925 +0000 UTC m=+0.044041479 container create 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:25:48 np0005476733 systemd[1]: Started libpod-conmon-38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288.scope.
Oct  8 11:25:48 np0005476733 podman[227663]: 2025-10-08 15:25:48.880480145 +0000 UTC m=+0.020257719 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.992 2 DEBUG nova.compute.manager [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.992 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.993 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.993 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.993 2 DEBUG nova.compute.manager [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Processing event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.994 2 DEBUG nova.compute.manager [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.994 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.994 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:48 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.995 2 DEBUG oslo_concurrency.lockutils [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.996 2 DEBUG nova.compute.manager [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] No waiting events found dispatching network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:25:48 np0005476733 nova_compute[192580]: 2025-10-08 15:25:48.996 2 WARNING nova.compute.manager [req-43c60192-a8db-43e8-b9e1-f85e02faaf5f req-35a8c04c-8113-4bed-847c-912f944f64e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received unexpected event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:25:49 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2919dc10e236115070f511a90cff8b4ab281ac08beb20ad1236f9f3f66cef31b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.025 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937149.0254292, 4f1d2adc-1ecb-45dc-83a0-c2369028e487 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.026 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] VM Started (Lifecycle Event)#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.029 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.039 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.047 2 INFO nova.virt.libvirt.driver [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Instance spawned successfully.#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.047 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:25:49 np0005476733 podman[227663]: 2025-10-08 15:25:49.088377815 +0000 UTC m=+0.228155399 container init 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:25:49 np0005476733 podman[227663]: 2025-10-08 15:25:49.095402209 +0000 UTC m=+0.235179763 container start 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.113 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.121 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.122 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.123 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.124 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.124 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.125 2 DEBUG nova.virt.libvirt.driver [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:49 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [NOTICE]   (227681) : New worker (227683) forked
Oct  8 11:25:49 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [NOTICE]   (227681) : Loading success.
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.130 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.156 2 DEBUG nova.network.neutron [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updated VIF entry in instance network info cache for port 9dbcb8e0-b6cb-47f5-b89d-290794905306. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.157 2 DEBUG nova.network.neutron [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updating instance_info_cache with network_info: [{"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:49.176 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cae08d04-f9a8-46ee-ba57-0a0db94ae186 in datapath 9c022ba9-08a2-40a7-896d-13c1538d7064 unbound from our chassis#033[00m
Oct  8 11:25:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:49.178 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9c022ba9-08a2-40a7-896d-13c1538d7064 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:25:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:25:49.179 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0c58876f-e55f-4547-a7cd-2f5544083ae8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.695 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.698 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937149.0264359, 4f1d2adc-1ecb-45dc-83a0-c2369028e487 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.699 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.701 2 DEBUG oslo_concurrency.lockutils [req-be47de31-ee99-4f50-97b7-549d8e7a4ed8 req-1b09a428-e02c-4422-8b65-6f44b5bce35a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.712 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.712 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.712 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.761 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.765 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937149.0388093, 4f1d2adc-1ecb-45dc-83a0-c2369028e487 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.765 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.825 2 INFO nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Took 7.74 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:25:49 np0005476733 nova_compute[192580]: 2025-10-08 15:25:49.826 2 DEBUG nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.062 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.072 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.272 2 INFO nova.compute.manager [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Took 8.86 seconds to build instance.#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.352 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937149.0909157, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.353 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Started (Lifecycle Event)#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.517 2 DEBUG nova.compute.manager [req-dba24d76-5312-4d2d-a26c-825f60bd72c1 req-4e3964b7-32c6-4bf6-acce-cb65c5feb674 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.518 2 DEBUG oslo_concurrency.lockutils [req-dba24d76-5312-4d2d-a26c-825f60bd72c1 req-4e3964b7-32c6-4bf6-acce-cb65c5feb674 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.518 2 DEBUG oslo_concurrency.lockutils [req-dba24d76-5312-4d2d-a26c-825f60bd72c1 req-4e3964b7-32c6-4bf6-acce-cb65c5feb674 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.518 2 DEBUG oslo_concurrency.lockutils [req-dba24d76-5312-4d2d-a26c-825f60bd72c1 req-4e3964b7-32c6-4bf6-acce-cb65c5feb674 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.519 2 DEBUG nova.compute.manager [req-dba24d76-5312-4d2d-a26c-825f60bd72c1 req-4e3964b7-32c6-4bf6-acce-cb65c5feb674 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Processing event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.520 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.542 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.547 2 INFO nova.virt.libvirt.driver [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance spawned successfully.#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.548 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.668 2 DEBUG oslo_concurrency.lockutils [None req-9cd7bd2a-df2e-49f3-8116-3198462b50ef cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.783 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:50 np0005476733 nova_compute[192580]: 2025-10-08 15:25:50.789 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.219 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.220 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.221 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.222 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.223 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.224 2 DEBUG nova.virt.libvirt.driver [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.231 2 DEBUG nova.network.neutron [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updated VIF entry in instance network info cache for port cae08d04-f9a8-46ee-ba57-0a0db94ae186. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.231 2 DEBUG nova.network.neutron [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updating instance_info_cache with network_info: [{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.484 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.485 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937149.0911593, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.485 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.603 2 DEBUG oslo_concurrency.lockutils [req-0c929ff3-1889-49dc-8d69-1ce203884066 req-3e3d8c7b-f13a-4ef6-b990-31d61f745df0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.958 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.964 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937150.5254211, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:25:51 np0005476733 nova_compute[192580]: 2025-10-08 15:25:51.965 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.067 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.071 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.283 2 INFO nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Took 10.54 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.284 2 DEBUG nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.525 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:25:52 np0005476733 nova_compute[192580]: 2025-10-08 15:25:52.799 2 INFO nova.compute.manager [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Took 11.71 seconds to build instance.#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.032 2 DEBUG oslo_concurrency.lockutils [None req-f1eba431-b61a-4d6c-b514-86dd13dd1bf7 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.376 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.555 2 INFO nova.compute.manager [None req-b01fb1d3-4c6f-46b7-9cee-be6fc99ddb86 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Get console output#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.560 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.563 2 INFO nova.virt.libvirt.driver [None req-b01fb1d3-4c6f-46b7-9cee-be6fc99ddb86 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Truncated console log returned, 3174 bytes ignored#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.576 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.577 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.578 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:25:53 np0005476733 nova_compute[192580]: 2025-10-08 15:25:53.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:54 np0005476733 nova_compute[192580]: 2025-10-08 15:25:54.084 2 INFO nova.compute.manager [None req-7c1801eb-c1eb-48f3-bf3e-e17477a0b4ea cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:25:54 np0005476733 nova_compute[192580]: 2025-10-08 15:25:54.088 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.500 2 INFO nova.compute.manager [None req-6fe6f9bb-b5c1-44e9-a175-d29f3b658571 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.505 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.845 2 DEBUG nova.compute.manager [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.845 2 DEBUG oslo_concurrency.lockutils [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.846 2 DEBUG oslo_concurrency.lockutils [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.846 2 DEBUG oslo_concurrency.lockutils [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.846 2 DEBUG nova.compute.manager [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:25:55 np0005476733 nova_compute[192580]: 2025-10-08 15:25:55.847 2 WARNING nova.compute.manager [req-8192773d-1191-42b0-8b5d-cff3168477b3 req-6294c3c5-3359-4247-aa72-ef1579a1d696 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:25:56 np0005476733 podman[227696]: 2025-10-08 15:25:56.220066217 +0000 UTC m=+0.052126275 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 11:25:57 np0005476733 nova_compute[192580]: 2025-10-08 15:25:57.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:58 np0005476733 nova_compute[192580]: 2025-10-08 15:25:58.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.265 2 INFO nova.compute.manager [None req-b813c10b-610d-4b11-a124-b23ac24928bf cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.350 2 DEBUG nova.compute.manager [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-changed-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.351 2 DEBUG nova.compute.manager [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Refreshing instance network info cache due to event network-changed-020c7187-878e-4336-a49d-ac40eb956ef6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.352 2 DEBUG oslo_concurrency.lockutils [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.352 2 DEBUG oslo_concurrency.lockutils [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:25:59 np0005476733 nova_compute[192580]: 2025-10-08 15:25:59.353 2 DEBUG nova.network.neutron [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Refreshing network info cache for port 020c7187-878e-4336-a49d-ac40eb956ef6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:26:00 np0005476733 nova_compute[192580]: 2025-10-08 15:26:00.671 2 INFO nova.compute.manager [None req-087cc189-c55c-466c-9f82-79e33db382ed 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:01 np0005476733 podman[227718]: 2025-10-08 15:26:01.28834494 +0000 UTC m=+0.110631085 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:26:01 np0005476733 podman[227717]: 2025-10-08 15:26:01.317996656 +0000 UTC m=+0.140062634 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:26:01 np0005476733 nova_compute[192580]: 2025-10-08 15:26:01.348 2 DEBUG nova.network.neutron [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updated VIF entry in instance network info cache for port 020c7187-878e-4336-a49d-ac40eb956ef6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:26:01 np0005476733 nova_compute[192580]: 2025-10-08 15:26:01.348 2 DEBUG nova.network.neutron [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updating instance_info_cache with network_info: [{"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:01 np0005476733 nova_compute[192580]: 2025-10-08 15:26:01.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:01 np0005476733 nova_compute[192580]: 2025-10-08 15:26:01.372 2 DEBUG oslo_concurrency.lockutils [req-5325765d-325b-495d-be36-4932333fd85e req-197f1d69-6431-4b32-8487-4bb20d89d858 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-1cbc4434-d89a-483d-a1f2-299190262888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:02 np0005476733 nova_compute[192580]: 2025-10-08 15:26:02.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:03 np0005476733 nova_compute[192580]: 2025-10-08 15:26:03.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:05 np0005476733 nova_compute[192580]: 2025-10-08 15:26:05.458 2 INFO nova.compute.manager [None req-8930cea6-0139-4091-8ec2-d47ad42c3add cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:26:06 np0005476733 nova_compute[192580]: 2025-10-08 15:26:06.143 2 INFO nova.compute.manager [None req-562e6f22-65cc-4adb-81f8-10a837b9f86d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:06 np0005476733 nova_compute[192580]: 2025-10-08 15:26:06.149 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:07 np0005476733 podman[227770]: 2025-10-08 15:26:07.223978433 +0000 UTC m=+0.057380264 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64)
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.371 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.372 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.453 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.634 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.637 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.645 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:26:07 np0005476733 nova_compute[192580]: 2025-10-08 15:26:07.645 2 INFO nova.compute.claims [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.141 2 DEBUG nova.compute.provider_tree [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.336 2 DEBUG nova.scheduler.client.report [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.531 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.532 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.898 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:26:08 np0005476733 nova_compute[192580]: 2025-10-08 15:26:08.899 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.058 2 INFO nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.109 2 DEBUG nova.policy [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.288 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.556 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.558 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.559 2 INFO nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Creating image(s)#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.560 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.560 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.561 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.574 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.633 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.634 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.635 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.647 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.705 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.706 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.739 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk 10737418240" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.740 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.741 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.797 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.798 2 DEBUG nova.objects.instance [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f1808f3-5a79-4149-84d1-7bc21eefa497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.818 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.822 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Ensure instance console log exists: /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.822 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.823 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:09 np0005476733 nova_compute[192580]: 2025-10-08 15:26:09.823 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:10 np0005476733 podman[227841]: 2025-10-08 15:26:10.234508733 +0000 UTC m=+0.058248301 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:26:10 np0005476733 podman[227842]: 2025-10-08 15:26:10.251848837 +0000 UTC m=+0.078080995 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:26:10 np0005476733 nova_compute[192580]: 2025-10-08 15:26:10.592 2 INFO nova.compute.manager [None req-5b3fc114-41f7-4def-9dce-a2afd830cc51 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:26:10 np0005476733 nova_compute[192580]: 2025-10-08 15:26:10.597 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:11 np0005476733 nova_compute[192580]: 2025-10-08 15:26:11.403 2 INFO nova.compute.manager [None req-a1f786c9-d8b0-4572-aeb7-9a325459366f 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:11 np0005476733 nova_compute[192580]: 2025-10-08 15:26:11.406 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:11 np0005476733 nova_compute[192580]: 2025-10-08 15:26:11.543 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Successfully created port: b5af459f-569f-4ca4-86fe-d2d018227a96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:26:12 np0005476733 nova_compute[192580]: 2025-10-08 15:26:12.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:12 np0005476733 nova_compute[192580]: 2025-10-08 15:26:12.889 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Successfully updated port: b5af459f-569f-4ca4-86fe-d2d018227a96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:26:12 np0005476733 nova_compute[192580]: 2025-10-08 15:26:12.906 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:12 np0005476733 nova_compute[192580]: 2025-10-08 15:26:12.907 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:12 np0005476733 nova_compute[192580]: 2025-10-08 15:26:12.907 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:26:13 np0005476733 nova_compute[192580]: 2025-10-08 15:26:13.140 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:26:13 np0005476733 nova_compute[192580]: 2025-10-08 15:26:13.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:15 np0005476733 nova_compute[192580]: 2025-10-08 15:26:15.164 2 DEBUG nova.compute.manager [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-changed-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:15 np0005476733 nova_compute[192580]: 2025-10-08 15:26:15.164 2 DEBUG nova.compute.manager [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing instance network info cache due to event network-changed-b5af459f-569f-4ca4-86fe-d2d018227a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:26:15 np0005476733 nova_compute[192580]: 2025-10-08 15:26:15.164 2 DEBUG oslo_concurrency.lockutils [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:15 np0005476733 nova_compute[192580]: 2025-10-08 15:26:15.830 2 INFO nova.compute.manager [None req-1c94e687-1699-42bf-8097-19fb87159c67 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:26:15 np0005476733 nova_compute[192580]: 2025-10-08 15:26:15.836 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.052 2 DEBUG nova.network.neutron [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.093 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.094 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Instance network_info: |[{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.094 2 DEBUG oslo_concurrency.lockutils [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.094 2 DEBUG nova.network.neutron [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing network info cache for port b5af459f-569f-4ca4-86fe-d2d018227a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.098 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Start _get_guest_xml network_info=[{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.104 2 WARNING nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.111 2 DEBUG nova.virt.libvirt.host [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.112 2 DEBUG nova.virt.libvirt.host [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.120 2 DEBUG nova.virt.libvirt.host [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.121 2 DEBUG nova.virt.libvirt.host [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.121 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.122 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.122 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.123 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.123 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.123 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.124 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.124 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.124 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.124 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.125 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.125 2 DEBUG nova.virt.hardware [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.129 2 DEBUG nova.virt.libvirt.vif [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:26:09Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.130 2 DEBUG nova.network.os_vif_util [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.131 2 DEBUG nova.network.os_vif_util [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.132 2 DEBUG nova.objects.instance [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f1808f3-5a79-4149-84d1-7bc21eefa497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.210 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <uuid>7f1808f3-5a79-4149-84d1-7bc21eefa497</uuid>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <name>instance-00000021</name>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_bw_limit_east_west-350327070</nova:name>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:26:16</nova:creationTime>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        <nova:port uuid="b5af459f-569f-4ca4-86fe-d2d018227a96">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.2.175" ipVersion="4"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="serial">7f1808f3-5a79-4149-84d1-7bc21eefa497</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="uuid">7f1808f3-5a79-4149-84d1-7bc21eefa497</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.config"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:4e:90:51"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <target dev="tapb5af459f-56"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/console.log" append="off"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:26:16 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:26:16 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:26:16 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:26:16 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.217 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Preparing to wait for external event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.218 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.218 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.218 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.219 2 DEBUG nova.virt.libvirt.vif [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:26:09Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.220 2 DEBUG nova.network.os_vif_util [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.220 2 DEBUG nova.network.os_vif_util [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.221 2 DEBUG os_vif [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5af459f-56, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5af459f-56, col_values=(('external_ids', {'iface-id': 'b5af459f-569f-4ca4-86fe-d2d018227a96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:90:51', 'vm-uuid': '7f1808f3-5a79-4149-84d1-7bc21eefa497'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:16 np0005476733 NetworkManager[51699]: <info>  [1759937176.2296] manager: (tapb5af459f-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:26:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:16Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2a:db:6c 192.168.7.196
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.239 2 INFO os_vif [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56')#033[00m
Oct  8 11:26:16 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:16Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2a:db:6c 192.168.7.196
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.370 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.371 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.371 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:4e:90:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:26:16 np0005476733 nova_compute[192580]: 2025-10-08 15:26:16.372 2 INFO nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Using config drive#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.285 2 INFO nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Creating config drive at /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.config#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.291 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprn7hxx7m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.416 2 DEBUG oslo_concurrency.processutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprn7hxx7m" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.449 2 INFO nova.compute.manager [None req-2a8c6a05-32e3-44ca-9245-c492d4ff297c 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.460 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:17 np0005476733 kernel: tapb5af459f-56: entered promiscuous mode
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.4989] manager: (tapb5af459f-56): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:17Z|00229|binding|INFO|Claiming lport b5af459f-569f-4ca4-86fe-d2d018227a96 for this chassis.
Oct  8 11:26:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:17Z|00230|binding|INFO|b5af459f-569f-4ca4-86fe-d2d018227a96: Claiming fa:16:3e:4e:90:51 192.168.2.175
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.511 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:90:51 192.168.2.175'], port_security=['fa:16:3e:4e:90:51 192.168.2.175'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.2.175/24', 'neutron:device_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e26c823e-4eb7-44c0-a2be-0739b8d56851, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b5af459f-569f-4ca4-86fe-d2d018227a96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.513 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b5af459f-569f-4ca4-86fe-d2d018227a96 in datapath f7929135-b0f8-4022-8ac4-4734ecb47f0b bound to our chassis#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.519 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7929135-b0f8-4022-8ac4-4734ecb47f0b#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:17Z|00231|binding|INFO|Setting lport b5af459f-569f-4ca4-86fe-d2d018227a96 ovn-installed in OVS
Oct  8 11:26:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:17Z|00232|binding|INFO|Setting lport b5af459f-569f-4ca4-86fe-d2d018227a96 up in Southbound
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.533 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbff565-7b30-476a-9227-20ff147668a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.534 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7929135-b1 in ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.538 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7929135-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.538 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[030959d3-d88d-438a-b37a-864327c8628f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.540 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9325aa-415d-4c49-87cc-90d7ac230e9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.552 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f49aa1d5-1dcf-47af-8eac-b572df699e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 systemd-machined[152624]: New machine qemu-20-instance-00000021.
Oct  8 11:26:17 np0005476733 systemd-udevd[227913]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:26:17 np0005476733 systemd[1]: Started Virtual Machine qemu-20-instance-00000021.
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.570 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[02c45126-ce94-4132-95fe-330e521d961e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.5868] device (tapb5af459f-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.5875] device (tapb5af459f-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.600 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ac38a9-2e8b-414e-b2f6-d3df953dd79e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 systemd-udevd[227916]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.607 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[917b55f6-01c2-4492-b970-77088de6a65a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.6130] manager: (tapf7929135-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.639 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bb114f70-c1e9-436b-89da-478988fbfb03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.646 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b280ab32-5fc1-48d5-bfac-c9789dd442af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.6675] device (tapf7929135-b0): carrier: link connected
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.672 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[92e820ac-0022-4d9d-84b2-ca2892e67fc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.690 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[93c15b59-8991-4583-895c-590882185968]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7929135-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:90:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408133, 'reachable_time': 35104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227943, 'error': None, 'target': 'ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.707 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[56fe4209-c712-49ea-b0a8-a37307a02dbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:90e8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408133, 'tstamp': 408133}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227944, 'error': None, 'target': 'ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.724 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0690d9b5-2a96-45d3-9ce9-0a1945891aac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7929135-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:90:e8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408133, 'reachable_time': 35104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227945, 'error': None, 'target': 'ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.757 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[50a02af1-c3ee-41c0-9c18-73076691bee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.823 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6511adaf-e4f2-41a9-b964-8a4b9fdd5587]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.825 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7929135-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.826 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.826 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7929135-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 NetworkManager[51699]: <info>  [1759937177.8296] manager: (tapf7929135-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Oct  8 11:26:17 np0005476733 kernel: tapf7929135-b0: entered promiscuous mode
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.833 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7929135-b0, col_values=(('external_ids', {'iface-id': '9e6f9f1a-9b45-47d5-b171-40ef2fcda78c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:17Z|00233|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.839 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7929135-b0f8-4022-8ac4-4734ecb47f0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7929135-b0f8-4022-8ac4-4734ecb47f0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.840 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee4b349-dedf-495a-9978-ecbfa8f6f4b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.841 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f7929135-b0f8-4022-8ac4-4734ecb47f0b
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f7929135-b0f8-4022-8ac4-4734ecb47f0b.pid.haproxy
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f7929135-b0f8-4022-8ac4-4734ecb47f0b
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:26:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:17.844 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'env', 'PROCESS_TAG=haproxy-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7929135-b0f8-4022-8ac4-4734ecb47f0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:26:17 np0005476733 nova_compute[192580]: 2025-10-08 15:26:17.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.201 2 DEBUG nova.compute.manager [req-dc4676ff-f4a0-4a45-95c6-c6d27f17aab5 req-33842529-00ff-4164-8f28-e5c9976f1220 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.201 2 DEBUG oslo_concurrency.lockutils [req-dc4676ff-f4a0-4a45-95c6-c6d27f17aab5 req-33842529-00ff-4164-8f28-e5c9976f1220 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.202 2 DEBUG oslo_concurrency.lockutils [req-dc4676ff-f4a0-4a45-95c6-c6d27f17aab5 req-33842529-00ff-4164-8f28-e5c9976f1220 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.202 2 DEBUG oslo_concurrency.lockutils [req-dc4676ff-f4a0-4a45-95c6-c6d27f17aab5 req-33842529-00ff-4164-8f28-e5c9976f1220 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.202 2 DEBUG nova.compute.manager [req-dc4676ff-f4a0-4a45-95c6-c6d27f17aab5 req-33842529-00ff-4164-8f28-e5c9976f1220 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Processing event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.299 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937178.2984753, 7f1808f3-5a79-4149-84d1-7bc21eefa497 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.299 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] VM Started (Lifecycle Event)#033[00m
Oct  8 11:26:18 np0005476733 podman[227984]: 2025-10-08 15:26:18.300651932 +0000 UTC m=+0.066034000 container create c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.302 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.311 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.315 2 INFO nova.virt.libvirt.driver [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Instance spawned successfully.#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.316 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.326 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.331 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.341 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.341 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.342 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 systemd[1]: Started libpod-conmon-c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438.scope.
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.343 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.344 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.344 2 DEBUG nova.virt.libvirt.driver [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:26:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:18.350 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.356 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.356 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937178.298725, 7f1808f3-5a79-4149-84d1-7bc21eefa497 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.356 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:26:18 np0005476733 podman[227984]: 2025-10-08 15:26:18.263776304 +0000 UTC m=+0.029158402 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:26:18 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:26:18 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc0cd2daef3b5cfbb38b77508b7e01431db41f485f4a67a1c025088b7f64ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:26:18 np0005476733 podman[227984]: 2025-10-08 15:26:18.393935511 +0000 UTC m=+0.159317589 container init c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.392 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.399 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937178.3106291, 7f1808f3-5a79-4149-84d1-7bc21eefa497 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:26:18 np0005476733 podman[227984]: 2025-10-08 15:26:18.399825759 +0000 UTC m=+0.165207827 container start c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.399 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:26:18 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [NOTICE]   (228010) : New worker (228016) forked
Oct  8 11:26:18 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [NOTICE]   (228010) : Loading success.
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.427 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.433 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:26:18 np0005476733 podman[228003]: 2025-10-08 15:26:18.452374487 +0000 UTC m=+0.059087018 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.453 2 INFO nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Took 8.90 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.453 2 DEBUG nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.469 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:26:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:18.485 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:26:18 np0005476733 podman[228032]: 2025-10-08 15:26:18.53824056 +0000 UTC m=+0.064216692 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.547 2 INFO nova.compute.manager [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Took 10.94 seconds to build instance.#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.579 2 DEBUG oslo_concurrency.lockutils [None req-19efd215-45e7-4f93-b3a9-ef787b02e5de d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.691 2 DEBUG nova.network.neutron [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updated VIF entry in instance network info cache for port b5af459f-569f-4ca4-86fe-d2d018227a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.692 2 DEBUG nova.network.neutron [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.711 2 DEBUG oslo_concurrency.lockutils [req-b5218f06-8f0e-40de-b012-6af3993cb2b9 req-b88f9b20-cab3-4a90-a0e1-26895c28a5c7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:18 np0005476733 nova_compute[192580]: 2025-10-08 15:26:18.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.283 2 DEBUG nova.compute.manager [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.284 2 DEBUG oslo_concurrency.lockutils [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.285 2 DEBUG oslo_concurrency.lockutils [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.285 2 DEBUG oslo_concurrency.lockutils [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.286 2 DEBUG nova.compute.manager [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:26:20 np0005476733 nova_compute[192580]: 2025-10-08 15:26:20.286 2 WARNING nova.compute.manager [req-da4fb164-2917-49ce-b562-e4fb420fcce3 req-56310c02-bdab-4ff7-af6e-c2db59481824 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received unexpected event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.064 2 INFO nova.compute.manager [None req-4bbceb96-df4f-452d-9339-a7cfa91249e2 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.068 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.072 2 INFO nova.virt.libvirt.driver [None req-4bbceb96-df4f-452d-9339-a7cfa91249e2 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Truncated console log returned, 1256 bytes ignored#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.251 2 INFO nova.compute.manager [None req-407eff8e-7101-46e3-bfa2-bf06e51856d2 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:21 np0005476733 nova_compute[192580]: 2025-10-08 15:26:21.257 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:22 np0005476733 nova_compute[192580]: 2025-10-08 15:26:22.671 2 INFO nova.compute.manager [None req-c454834e-8fcd-45b1-a857-bc34582518f2 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:22 np0005476733 nova_compute[192580]: 2025-10-08 15:26:22.675 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:22 np0005476733 nova_compute[192580]: 2025-10-08 15:26:22.678 2 INFO nova.virt.libvirt.driver [None req-c454834e-8fcd-45b1-a857-bc34582518f2 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Truncated console log returned, 3450 bytes ignored#033[00m
Oct  8 11:26:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:23Z|00234|pinctrl|WARN|Dropped 4321 log messages in last 63 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 11:26:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:23Z|00235|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:26:23 np0005476733 nova_compute[192580]: 2025-10-08 15:26:23.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:26 np0005476733 nova_compute[192580]: 2025-10-08 15:26:26.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:26 np0005476733 nova_compute[192580]: 2025-10-08 15:26:26.252 2 INFO nova.compute.manager [None req-9639b99f-3829-4781-9f4d-351dbf479647 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Get console output#033[00m
Oct  8 11:26:26 np0005476733 nova_compute[192580]: 2025-10-08 15:26:26.256 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:26 np0005476733 nova_compute[192580]: 2025-10-08 15:26:26.259 2 INFO nova.virt.libvirt.driver [None req-9639b99f-3829-4781-9f4d-351dbf479647 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Truncated console log returned, 3731 bytes ignored#033[00m
Oct  8 11:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:26.308 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:26.309 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:26.310 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:26 np0005476733 nova_compute[192580]: 2025-10-08 15:26:26.398 2 INFO nova.compute.manager [None req-5ef40539-940f-46c9-927b-ee8c8b18eb3a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:27 np0005476733 podman[228078]: 2025-10-08 15:26:27.233537721 +0000 UTC m=+0.065113599 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:26:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:27.486 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:27 np0005476733 nova_compute[192580]: 2025-10-08 15:26:27.844 2 INFO nova.compute.manager [None req-f307e833-61ab-4728-9320-b2d8707611e6 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Get console output#033[00m
Oct  8 11:26:27 np0005476733 nova_compute[192580]: 2025-10-08 15:26:27.849 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:27 np0005476733 nova_compute[192580]: 2025-10-08 15:26:27.853 2 INFO nova.virt.libvirt.driver [None req-f307e833-61ab-4728-9320-b2d8707611e6 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Truncated console log returned, 3749 bytes ignored#033[00m
Oct  8 11:26:28 np0005476733 nova_compute[192580]: 2025-10-08 15:26:28.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:29 np0005476733 nova_compute[192580]: 2025-10-08 15:26:29.672 2 DEBUG nova.compute.manager [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-changed-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:29 np0005476733 nova_compute[192580]: 2025-10-08 15:26:29.672 2 DEBUG nova.compute.manager [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Refreshing instance network info cache due to event network-changed-9dbcb8e0-b6cb-47f5-b89d-290794905306. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:26:29 np0005476733 nova_compute[192580]: 2025-10-08 15:26:29.673 2 DEBUG oslo_concurrency.lockutils [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:29 np0005476733 nova_compute[192580]: 2025-10-08 15:26:29.673 2 DEBUG oslo_concurrency.lockutils [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:29 np0005476733 nova_compute[192580]: 2025-10-08 15:26:29.673 2 DEBUG nova.network.neutron [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Refreshing network info cache for port 9dbcb8e0-b6cb-47f5-b89d-290794905306 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.990 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.991 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.991 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.992 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.992 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.993 2 INFO nova.compute.manager [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Terminating instance#033[00m
Oct  8 11:26:30 np0005476733 nova_compute[192580]: 2025-10-08 15:26:30.995 2 DEBUG nova.compute.manager [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:26:31 np0005476733 kernel: tap9dbcb8e0-b6 (unregistering): left promiscuous mode
Oct  8 11:26:31 np0005476733 NetworkManager[51699]: <info>  [1759937191.0316] device (tap9dbcb8e0-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:26:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:31Z|00236|binding|INFO|Releasing lport 9dbcb8e0-b6cb-47f5-b89d-290794905306 from this chassis (sb_readonly=0)
Oct  8 11:26:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:31Z|00237|binding|INFO|Setting lport 9dbcb8e0-b6cb-47f5-b89d-290794905306 down in Southbound
Oct  8 11:26:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:31Z|00238|binding|INFO|Removing iface tap9dbcb8e0-b6 ovn-installed in OVS
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.099 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:db:6c 192.168.7.196 2001:7::f816:3eff:fe2a:db6c'], port_security=['fa:16:3e:2a:db:6c 192.168.7.196 2001:7::f816:3eff:fe2a:db6c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:cidrs': '192.168.7.196/24 2001:7::f816:3eff:fe2a:db6c/64', 'neutron:device_id': '4f1d2adc-1ecb-45dc-83a0-c2369028e487', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba711270-eff4-4485-a453-6e6d5887d038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:project_id': 'c1390632da384309b358a3f3728ab5d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '42921251-1598-4dc8-8cc9-ac707d0ab44b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3a3e561-49a1-4407-8153-f33a25aca48b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=9dbcb8e0-b6cb-47f5-b89d-290794905306) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.103 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 9dbcb8e0-b6cb-47f5-b89d-290794905306 in datapath ba711270-eff4-4485-a453-6e6d5887d038 unbound from our chassis#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.106 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba711270-eff4-4485-a453-6e6d5887d038, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.108 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dc12d520-30e0-42a8-b8fd-c6d3c9088092]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.109 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038 namespace which is not needed anymore#033[00m
Oct  8 11:26:31 np0005476733 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Oct  8 11:26:31 np0005476733 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001f.scope: Consumed 40.311s CPU time.
Oct  8 11:26:31 np0005476733 systemd-machined[152624]: Machine qemu-18-instance-0000001f terminated.
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.259 2 INFO nova.virt.libvirt.driver [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Instance destroyed successfully.#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.261 2 DEBUG nova.objects.instance [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lazy-loading 'resources' on Instance uuid 4f1d2adc-1ecb-45dc-83a0-c2369028e487 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.284 2 DEBUG nova.virt.libvirt.vif [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:25:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-ovnextradhcpoptionstest-1890118187-test-extra-dhcp-opts',id=31,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEd+u8f5VceRZeyfE0Yrf01yg7yEMN2pLOr84m9oNqDsynJlsd7nDIcghdcO/L5YEUqPbydKTzc34ECm0UQvZ9Ra2/TmyKbJN8Sbt6K51Zbo5VjtVEJw/1DdJeHYynadkg==',key_name='tempest-OvnExtraDhcpOptionsTest-1890118187',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:25:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c1390632da384309b358a3f3728ab5d8',ramdisk_id='',reservation_id='r-dw007kmk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-OvnExtraDhcpOptionsTest-1189559672',owner_user_name='tempest-OvnExtraDhcpOptionsTest-1189559672-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:25:50Z,user_data=None,user_id='cff4e262a7054de9b32c7b3c504c757f',uuid=4f1d2adc-1ecb-45dc-83a0-c2369028e487,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": 
"tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.285 2 DEBUG nova.network.os_vif_util [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converting VIF {"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.286 2 DEBUG nova.network.os_vif_util [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.286 2 DEBUG os_vif [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dbcb8e0-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.295 2 INFO os_vif [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:db:6c,bridge_name='br-int',has_traffic_filtering=True,id=9dbcb8e0-b6cb-47f5-b89d-290794905306,network=Network(ba711270-eff4-4485-a453-6e6d5887d038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap9dbcb8e0-b6')#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.296 2 INFO nova.virt.libvirt.driver [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Deleting instance files /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487_del#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.297 2 INFO nova.virt.libvirt.driver [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Deletion of /var/lib/nova/instances/4f1d2adc-1ecb-45dc-83a0-c2369028e487_del complete#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.367 2 INFO nova.compute.manager [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.367 2 DEBUG oslo.service.loopingcall [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.368 2 DEBUG nova.compute.manager [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.368 2 DEBUG nova.network.neutron [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.389 2 DEBUG nova.compute.manager [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-vif-unplugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.390 2 DEBUG oslo_concurrency.lockutils [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.390 2 DEBUG oslo_concurrency.lockutils [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.391 2 DEBUG oslo_concurrency.lockutils [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.391 2 DEBUG nova.compute.manager [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] No waiting events found dispatching network-vif-unplugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.392 2 DEBUG nova.compute.manager [req-c7ee8451-3a3b-4224-af9e-f7c1d0ed2667 req-5ec72bfd-04ec-4997-afa4-1a40b6b35d01 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-vif-unplugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:26:31 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [NOTICE]   (227681) : haproxy version is 2.8.14-c23fe91
Oct  8 11:26:31 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [NOTICE]   (227681) : path to executable is /usr/sbin/haproxy
Oct  8 11:26:31 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [WARNING]  (227681) : Exiting Master process...
Oct  8 11:26:31 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [ALERT]    (227681) : Current worker (227683) exited with code 143 (Terminated)
Oct  8 11:26:31 np0005476733 neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038[227675]: [WARNING]  (227681) : All workers exited. Exiting... (0)
Oct  8 11:26:31 np0005476733 systemd[1]: libpod-38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288.scope: Deactivated successfully.
Oct  8 11:26:31 np0005476733 conmon[227675]: conmon 38eaa77941a104ea5851 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288.scope/container/memory.events
Oct  8 11:26:31 np0005476733 podman[228144]: 2025-10-08 15:26:31.515418748 +0000 UTC m=+0.312741960 container died 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.564 2 INFO nova.compute.manager [None req-36ddae25-1009-4419-bfac-97b3f11dd6e1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:31 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288-userdata-shm.mount: Deactivated successfully.
Oct  8 11:26:31 np0005476733 systemd[1]: var-lib-containers-storage-overlay-2919dc10e236115070f511a90cff8b4ab281ac08beb20ad1236f9f3f66cef31b-merged.mount: Deactivated successfully.
Oct  8 11:26:31 np0005476733 podman[228144]: 2025-10-08 15:26:31.738346917 +0000 UTC m=+0.535670119 container cleanup 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:26:31 np0005476733 systemd[1]: libpod-conmon-38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288.scope: Deactivated successfully.
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.772 2 DEBUG nova.compute.manager [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-changed-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.773 2 DEBUG nova.compute.manager [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Refreshing instance network info cache due to event network-changed-cae08d04-f9a8-46ee-ba57-0a0db94ae186. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.773 2 DEBUG oslo_concurrency.lockutils [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.774 2 DEBUG oslo_concurrency.lockutils [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.774 2 DEBUG nova.network.neutron [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Refreshing network info cache for port cae08d04-f9a8-46ee-ba57-0a0db94ae186 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:26:31 np0005476733 podman[228179]: 2025-10-08 15:26:31.828462985 +0000 UTC m=+0.293945659 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:26:31 np0005476733 podman[228207]: 2025-10-08 15:26:31.842751202 +0000 UTC m=+0.076228206 container remove 38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.854 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d80adf-73b8-4b37-933e-8a4215881162]: (4, ('Wed Oct  8 03:26:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038 (38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288)\n38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288\nWed Oct  8 03:26:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038 (38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288)\n38eaa77941a104ea58512118b3a40463b033b6715f3ad7362e0bcca48ca2e288\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.856 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[34102861-f080-4c95-999f-8de970b34d4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.857 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba711270-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 kernel: tapba711270-e0: left promiscuous mode
Oct  8 11:26:31 np0005476733 podman[228171]: 2025-10-08 15:26:31.868589257 +0000 UTC m=+0.336317332 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.877 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[31d698b4-be38-4fac-b8e3-bd954ba0c127]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 nova_compute[192580]: 2025-10-08 15:26:31.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.900 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2e6836-393c-43de-b382-2e878bef371b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.902 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd5adbc-7856-4e09-ab38-8189df17fdbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.920 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f6111821-ee37-4b62-9cad-66f63c4e0b41]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405191, 'reachable_time': 23850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228243, 'error': None, 'target': 'ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.922 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba711270-eff4-4485-a453-6e6d5887d038 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:26:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:26:31.923 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[b6aa9f55-0197-4916-bb47-d625c3f5d4e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:26:31 np0005476733 systemd[1]: run-netns-ovnmeta\x2dba711270\x2deff4\x2d4485\x2da453\x2d6e6d5887d038.mount: Deactivated successfully.
Oct  8 11:26:32 np0005476733 nova_compute[192580]: 2025-10-08 15:26:32.098 2 DEBUG nova.network.neutron [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updated VIF entry in instance network info cache for port 9dbcb8e0-b6cb-47f5-b89d-290794905306. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:26:32 np0005476733 nova_compute[192580]: 2025-10-08 15:26:32.099 2 DEBUG nova.network.neutron [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updating instance_info_cache with network_info: [{"id": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "address": "fa:16:3e:2a:db:6c", "network": {"id": "ba711270-eff4-4485-a453-6e6d5887d038", "bridge": "br-int", "label": "tempest-OvnExtraDhcpOptionsTest-1890118187-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.196", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:7::/64", "dns": [], "gateway": {"address": "2001:7::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:7::f816:3eff:fe2a:db6c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "c1390632da384309b358a3f3728ab5d8", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dbcb8e0-b6", "ovs_interfaceid": "9dbcb8e0-b6cb-47f5-b89d-290794905306", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:32 np0005476733 nova_compute[192580]: 2025-10-08 15:26:32.180 2 DEBUG oslo_concurrency.lockutils [req-361e1b7d-38bf-4237-89ad-e0c0523d0550 req-51864918-078c-4954-a35a-42351fe2e94f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-4f1d2adc-1ecb-45dc-83a0-c2369028e487" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.232 2 DEBUG nova.network.neutron [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.288 2 INFO nova.compute.manager [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Took 1.92 seconds to deallocate network for instance.#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.341 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.341 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.398 2 DEBUG nova.network.neutron [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updated VIF entry in instance network info cache for port cae08d04-f9a8-46ee-ba57-0a0db94ae186. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.399 2 DEBUG nova.network.neutron [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updating instance_info_cache with network_info: [{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.492 2 DEBUG nova.compute.manager [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.492 2 DEBUG oslo_concurrency.lockutils [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.492 2 DEBUG oslo_concurrency.lockutils [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.493 2 DEBUG oslo_concurrency.lockutils [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.493 2 DEBUG nova.compute.manager [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] No waiting events found dispatching network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.493 2 WARNING nova.compute.manager [req-8c06e0fd-d2f4-45cd-88af-d11f356fd138 req-d6734034-e085-4580-a4ee-db912810f1e0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Received unexpected event network-vif-plugged-9dbcb8e0-b6cb-47f5-b89d-290794905306 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.494 2 DEBUG oslo_concurrency.lockutils [req-2d88d9b6-b4a6-48ce-966e-9c93f0e5e353 req-66b256c3-6c97-4581-a6ab-5040bf83571c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.577 2 DEBUG nova.compute.provider_tree [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.644 2 DEBUG nova.scheduler.client.report [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.669 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.708 2 INFO nova.scheduler.client.report [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Deleted allocations for instance 4f1d2adc-1ecb-45dc-83a0-c2369028e487#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.803 2 DEBUG oslo_concurrency.lockutils [None req-2ea89e2a-75d1-4e2d-b338-1cb87b384515 cff4e262a7054de9b32c7b3c504c757f c1390632da384309b358a3f3728ab5d8 - - default default] Lock "4f1d2adc-1ecb-45dc-83a0-c2369028e487" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:33 np0005476733 nova_compute[192580]: 2025-10-08 15:26:33.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.003 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'name': 'tempest-test_multicast_after_idle_timeout-155366011', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.005 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000016', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'hostId': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.007 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'name': 'tempest-test_multicast_after_idle_timeout-135618235', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.009 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '93e68db931464f0282500c84d398d8af', 'user_id': '048380879c82439f920961e33c8fc34c', 'hostId': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.011 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'name': 'tempest-test_bw_limit_east_west-350327070', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.013 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'name': 'tempest-broadcast-receiver-123-1908290520', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.015 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1cbc4434-d89a-483d-a1f2-299190262888', 'name': 'tempest-broadcast-sender-124-598361755', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.026 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.027 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.040 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.041 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.052 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.053 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.063 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.063 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.073 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.074 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.084 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.084 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.094 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.095 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94a01e0f-c3ad-4a4e-82f5-de6521b9a669', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d778a34-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': '34c04198a19f33b08a18fdda645ed426cb69e0f059bbb500f37c36d8774991ad'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d7799ca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': 'f6173825c98b93c51abcf60c1be4345e1891540d4999c36033bb0f073ad85b09'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d7998f6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': 'c0c97168ec5bf2138fb5440b96b5eae7afeafc857b73fb3396aaf22931b770c1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d79a83c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': '11eb6129b728816c08a0dca6ff6989365326dc7310bd0e2c5d714a822296746a'}, {'source': 'openstack', 'counter_name': 
'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d7b775c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': 'eb77dd5a6920fc2ec2a59b21bf031b23e03ffc696510ef67eed1edc975f3eb18'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 11111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d7b8bc0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': '0bab3fa0654d7a9b249179087b75e7b14bd15edc2ba52bfc60aae2e8b87177b6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d7d0928-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '868aa8ed4a1c38b5fe41b9a5a7e0198ed1f6fcbb6070ef9ae28aa9644bfc7036'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 
'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d7d16ca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '115bb76d53d2108faeb380a6462b2ea55ec15ee13fec877734c1cc94e2f928bc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d7ea594-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 'message_signature': '79e9d1c4ba0546438b99a05d248668bfa72c95e47babeb1b4a935d5053d8823c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d7eb48a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 
'message_signature': '5bf219276360f4d654852da1f889166f4395277420afce31464b56615faf8267'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d804084-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': 'fd187b114277caf58c049a424d9240122c4ce4add0879b5ad0714a1b5d199254'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': 
'1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'fl
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: , 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d804f02-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': '2e9fa96c31f4a7e7cf7ba0203c49e58578b0abc5f213dac9b4f39208e6f5ad31'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d81e72c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': 
'836f4d4648168e33fc42ca34d5536e766b19ead313b27976713caef9b8b16ed4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.015850', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d81f744-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': 'f7f4c7a55bc9163b3f66f5bd50039a00a620a8b4468bd89b0c280de8911efcd9'}]}, 'timestamp': '2025-10-08 15:26:36.095847', '_unique_id': '3111765f612e4ec5811716923206c8e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.099 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.099 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>]
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.102 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6efc9ea0-184c-46cc-aeb5-e2759e10e398 / tap36047ed0-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.102 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.outgoing.packets volume: 153 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.105 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets volume: 49 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.108 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets volume: 105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.111 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cefc7b22-5a31-4d0c-bb25-462153dfc427 / tapcae08d04-f9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.111 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.114 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7f1808f3-5a79-4149-84d1-7bc21eefa497 / tapb5af459f-56 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.114 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.116 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets volume: 218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.119 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1cbc4434-d89a-483d-a1f2-299190262888 / tap020c7187-87 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.120 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets volume: 201 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04ab5f35-893a-4058-bf5c-9709e9633698', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 153, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2d831570-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': 'dbf48ecfb5d477652eada9e82f049b1471e5e2fd70def763ed1bb217efa5e251'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 49, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2d838ce4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '9bc4e49fbc6e594202d6d78f98fba46ac99a3c7e91f2167e99597c137937eb7e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 105, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2d83f6f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '6e3e5074242569235310b02dc3b22f2b51a4189908f3ef427c3184a0dd92e3c9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2d8465ec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'd335d63076ad44fd3a0aa2a1e08a7826d8542cf5b9409267cbac8f10ba439473'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2d84dd7e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'c44b28683e84c78d043bd1107fedae31c1b6260320d1f3aba6527b08121dddc3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 218,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 7db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2d853ef4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '041791f695625ff6a450e50681b93fb5441d07d6ba580c0daf703d891660beb4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 201, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.099934', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2d85df94-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'f157f304a11f3a37c2103d687136122a7026cde305c00dd6154a928093abf8f7'}]}, 'timestamp': '2025-10-08 15:26:36.121532', '_unique_id': '7de862736850490898576a6453ad84bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.124 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>]
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.146 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.requests volume: 11693 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.147 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.173 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.requests volume: 11522 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.174 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.199 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.requests volume: 11646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.200 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.220 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.requests volume: 11712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.221 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.245 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.requests volume: 5925 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.245 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.requests volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.269 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 11685 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.270 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 11111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-11111 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.309 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.requests volume: 11670 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.309 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87c8a20d-43f1-417e-908a-899eff9f351e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11693, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d89d478-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '55fda55289e7b79c70dc6f04f86fd7b7f4185ee3970607d6ef08975eb92ef444'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d89dfd6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '48013b5aff69b311ff33e637ed65a1aa7542bcff5e469dd27fc4f98ce7ebfae2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11522, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d8df206-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': 'e8dd0cb64293d5c6fa6d285ad89794ddb183d68b6978cbdc79f2ecb86851b8ee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d8dfecc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '154475b6474e82e211ce5f53a7f62983b368582e331fc57be7f10f0a755828b0'}, {'source': 
'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11646, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d91e4ce-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': 'f9709e918cf6678f186e1deb1552da9f21bccbd2bde1e5a70f3d1c969ee3729c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'c
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d91f52c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '39d64f11c1803f127b29f48b213139acaa8abab0bdb80acae2ef615ce866db76'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11712, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d951b8a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'b8ba59af139d2030e629659386d0713626e82dda06cc897d01f0e9442f0e9bd0'}, 
{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d952b3e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'dc894fab2936b846cc45b64c4b240187bebb36bb830ce97437df8c29c8d7e04a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5925, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 
'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d98d7ca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': 'fbfb9d44a63909d606a66416c7294a23675ce76e4113ab69709c658459435668'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 22, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 
'disk_name': 'sda'}, 'message_id': '2d98eea4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': 'ae53e137947ae64ab70b091768b42aad64e4f6b0c3af18689cfdd883ac0c91a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11685, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2d9c88d4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': 'c2099bdfc2f2772322821327e9946f73a417e156ec1bb1a9f29bd6c0e031a084'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': 
{'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_ne
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2d9c9a90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '3fe8950431e97dfc83ed6ba5d2cf78dc49ca01b808abe99b961624f49d46b067'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11670, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 
'2da29a3a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '46a835d64cc92675273a5e331a202c5f4465ada77a48368fa044ffa9027b0028'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.125505', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da2a610-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': 'b4a3e79af9f7b801ca725da9d6a79082b358ef11212e9922c7bb8753386ac9b6'}]}, 'timestamp': '2025-10-08 15:26:36.309996', '_unique_id': '5979382560404fab806c01d04c6be070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.312 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.bytes volume: 136035840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.312 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.312 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.bytes volume: 135857152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.312 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.313 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.bytes volume: 136877568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.313 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.313 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.bytes volume: 128268288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.313 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.314 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.314 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.314 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 144776192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.314 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.314 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.bytes volume: 152835072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bb7ac0a-e725-4e49-8c9c-bdc6229c5d0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136035840, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da30b32-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '4336acb2be7eba8d56184eaeab9e7a912085742a82cacbecf4dc3c154d90498e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da313fc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': 'a8ec704bc6f97ef3929174ade6afe8c50bf9d0b43a9b6523b9b415a4693463d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135857152, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da31b90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '302f77a2da36e39d64dd23c4a0a4f154a2ada41b9b7f4032f97bf5430135e138'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da3257c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': 'ce225f4ac64a3a150f12fee7c55ab64bb1e6a3c12c740eee9cc12f85d2973e74'}, {'source': 
'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136877568, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da32da6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '0820531e1dcbe278ce0c7fb279a67964de3dcb46c7159ef3818cff93c6baa516'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da33512-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '21b79489ff2c0dc1b4898cb4c3e6484cafbb353e9b5c8640ee8237817bd566af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 128268288, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da33c88-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'dcacf9bfb236cff7a4d488946354b83d4b071f677f524ad749a62cbb50c463b7'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da3450c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'ddadcb6cd5784cc2782ad8d135e086fd6a657461665c8ee9c8f1fdd90f17eac8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da34f20-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': 'f8261de0ba1c94b54940b2d809f7b6cab98a82345ea85457eab29c7e701fde4c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 
'2da356c8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '11884ba7d4b5fcf4e95713e6c3311578cf34d4b397b643ea54da71819aa0134c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 144776192, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da35e16-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '053d25dc488865f4d3619293e06e36ee840d079be84d237bdb8e65ca45ad6c2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 
'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'inst
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da36546-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': 'c3c870d4f2b58c8ed6de1cf1e19324141556ade7bb2b9b537143d77c5f91d6e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 152835072, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da36cc6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': 
'b6a6449351e0b47994c45070b73d07707e0d6b69245a063353aace7e9f62d8a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.312290', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da375f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '393d6511d274ad234cebaef1ad100efa6fd43c913808cf6d02c3fccec1a5e5fd'}]}, 'timestamp': '2025-10-08 15:26:36.315280', '_unique_id': '8361a7848ab14a47a33091f710da6667'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.316 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.incoming.packets volume: 111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.316 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.317 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.317 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.incoming.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.317 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.318 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets volume: 155 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.318 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets volume: 148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47b01129-6b6f-4d47-b853-9f7cefcc9b13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 111, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2da3b6cc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '7afa387f09ffe8aeca6b08bf5ee98b08100ab0ad4b1c102a37c343221a6ef020'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 22, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2da3c0f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '1bf70bf864e4a35770499e9bf9609a1a56b78410cc5b7817a2341dedbad0e50e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2da3cbee-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '4aac41564370586022a42c4d75c977da3fd4995a017f9a478caa3be85266759e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2da3d724-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'ea101dd46891cc086d6553c486babac5e67325d0a856dd710b15ab24aed4b5e6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2da3e2dc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'42460b8d1953c165abee8929e9cce6a21bd5ab835feb95c585e87b5057c81d5e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 155, 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2da3eca0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '65fe1e2cec55cb7a9f84e55625d52cf41b2f4e89654da7e4167dd0cba2387590'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 148, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.316714', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2da3f470-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': '61b316d30d39ce9e9cf27635098fbe890e9666167a04d1fe50305cb981e8c4f8'}]}, 'timestamp': '2025-10-08 15:26:36.318522', '_unique_id': 'af285d722d26424b9a4bb6b4a821797e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.319 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.320 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.320 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.320 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.320 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.321 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.321 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '290db51b-2613-4126-810e-580c1151335a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2da42e40-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '74e4dd0995139f93e5715e739f218d4a4827ac5c6ee7e92cb15eaf2ae6249dc1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2da438ea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '4a129062c23ce2be22fa32b3aea45402426e7cfe5ad3b760a227ba44579c68b4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2da44240-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '9b1f755f49817727c8c00f41d344acdc1a8d0841d997f17dc19caec945323870'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2da44cf4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': '35b0d87717d5540914a0dfa7870c819037855c43e7e02caf070e8e7a785dd8cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2da456cc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'ca10010f84b9d9e50eb51775fd4d917cf54a2de37797bb720abbca185777e7dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet'
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: t_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2da46108-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '29f9e554268307d6ae0c01898b64dfef699e415a4e9850ef67dffd0a7ea8f77c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.319782', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2da4690a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'e2e052d0b363bac0623ef08c8eb9c49ee222069c1ec6ca964f44c5f062296319'}]}, 'timestamp': '2025-10-08 15:26:36.321509', '_unique_id': 'b6b9458329dd4f20a55c1036ea183ee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.322 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.bytes volume: 330991104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.323 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.323 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.bytes volume: 326477312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.323 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.323 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.bytes volume: 328435200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.324 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.324 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.bytes volume: 331892224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.324 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.324 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.bytes volume: 99755008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.325 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.bytes volume: 55476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.325 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 329394176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.325 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.325 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.bytes volume: 329324032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.325 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd9e58f3-9dc3-42b3-8dad-3ae7d99c4937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330991104, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da4a370-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': 'e24da4672f2867201adb413c84bb53c84d441f8bfdc3ed0c14a79d262fcc5b95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da4ae88-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '3df52cd3f1db016f70cdd5c708f785a975e6b6f9d87fa95c5412eaa2deb54474'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326477312, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da4b888-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '058a2b2ad83236cfbd4162af61023d6ae0ed58d84115931083d28d4669564959'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da4c256-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '631fc0638b5db54f0b9ccbb97ca7e9164ca85e02ee95d2235a0d768b0f234e6d'}, {'source': 
'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328435200, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da4cc92-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': 'a059a06997b9ee11801c0296202bf9f04ab8eadb3c64a11651bedfe3092ea260'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ask_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da4d822-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '61ce37e504d43871d2a7661b5d9e7cd68192490da5c47bab3da8cc8dffb3914b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331892224, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da4e272-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'b287ac6ee6f1294de055c48964a318f34168c3f96b14ade3ceda298cb7b0032f'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da4ec36-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': '2bc0273736e1208ccca385324f57058c8ba8bb85b2884b10379ae6836302536e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 99755008, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da4f46a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '08a8b2ee9cf22489c4f297c87565f607e1a16940ef4562c3a099a1c09d7f8159'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55476, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 
'2da4fdca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '892c6fcfe8e45c5b8f81ccd505442f411debed4308ec043fe8441f3bbd039673'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329394176, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da50536-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': 'cb0d513edc440af99f3eca8f6d58689ab929562714387fe059537daf4b18322f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 
'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b7543
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 22-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da50c98-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '5c2033bd528af5e3a94b9353a8d3983e9c463bba8055c417564d579c7f805013'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329324032, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da514ea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 
'message_signature': 'cf6f0c356823010a4cae60b446e1b7908ded3f09cc20a6d64f87ec5fb12860a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.322763', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da51d3c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': 'b443bac698b4cfaa0feb6ee587a05efd669dace83d4cda0051fef6573c496978'}]}, 'timestamp': '2025-10-08 15:26:36.326150', '_unique_id': '614c9327001647919bdfa284a3bf3285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.327 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.usage volume: 152502272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.327 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.328 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.usage volume: 152305664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.328 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.328 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.usage volume: 152305664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.328 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.328 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.usage volume: 135987200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.329 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.329 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.329 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.329 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 160956416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.330 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.330 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.usage volume: 169345024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.330 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40d2fa30-7d9c-4a0c-9d9d-1ae0bc4a4a73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152502272, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da562f6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': '61364c896fbc73a183febd7cc9680eab3af410499267256cac92cd38f64c84a5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da56b98-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': '32367e075b27c21539a5d3c92fddf35e10e2f673438aeaf3e62583e7574205a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152305664, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da575de-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': '042898d3af931dfb266395b3b05abb5b194878ef4c18a0ab9f28fae5f4e1c0a2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da57d18-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': 'ec3c0f40c2eedea92a73eac5a1eac2814d6c3ab891574215056ee48befbbf640'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 
'gauge', 'counter_unit': 'B', 'counter_volume': 152305664, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da58420-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': 'd3b9519d2c4981b21fb04e0a60e8471687594c1118c6716c2364dbbe01a00c08'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'statu
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da58b82-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': 'd96a229bbad17b2ec54ce5e746a433dedfddcff5b51c48a6ff224e32a1bd6304'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 135987200, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da5941a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '69fbec01a323628a5866489572bdcff435422a54b30f037478964b1f2d086474'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 485376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da59e60-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '07e4261d826911ab074d348f56bb3d3fe7a45f52c923837b597974e0b20ffff0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da5a752-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 'message_signature': 'cecf3d558f4dcf19307001f4b8e33003f9f5924839d02a0ddd5f01eb16bbf969'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da5b288-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 
'message_signature': 'd3245fb9edc07919a28c65613de0343ef9e768a95a289b8fde00445c67941a5c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 160956416, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da5bd96-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': '8ca7fd4fafdfad655ef79b7d8696783a5755936c33039ecd444879937d976776'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': 
'1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da5c9d0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': '6a0bd1849c47636b083ba0c720c0874d5e224fc5c2a8479db29b651a43d95f9a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169345024, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da5d2ea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': '3e31174548a765c69423091da7eb104d18a0cd3ee70e7c10d928d6b2f70f2ebd'}, {'source': 'openstack', 'counter_name': 
'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.327683', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da5dc90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': 'b6049a5e41913ec3449df3802e29f2a1300faea6f33469f54a6e614322d2fed6'}]}, 'timestamp': '2025-10-08 15:26:36.331050', '_unique_id': '0cd8f26008a749ad939bb1cca4849e63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.333 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.333 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>]
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.333 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.333 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.334 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.334 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.334 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.334 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.335 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28707b72-6847-4737-b910-cb3f24148231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2da646b2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '993c64b19c5d5b0e4bda469a11bc5c9ef15c79badc52e96f3b36d6a0c248189b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2da6535a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '839c6dee7f4e4c77cb6eba144beca54d678e2598cb831943891877d741d821b4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2da65ddc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '4def7c20d2314e92b399805109b70c2552373d42caa622574a6e69426ee35f7d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2da66ade-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'f0339c98d982c83b78d6cd8ee13e4b723294908f3c867b8f5c19b6ad5e7abf9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2da673f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'0a586b11efe401ca498f3c88c9762c03f6d401dfb8803f6609d946df8a30da98'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet'
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: t_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2da67dd0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '52fd5cd1f716bbc4dd5dce3549763a218876b8e97fcf6b8d4f7b578d2e317860'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.333480', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2da686f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'd81cc4eeeb9bab0ea69f6e69d697dc9026db60e38760a4411e40aa85f911d728'}]}, 'timestamp': '2025-10-08 15:26:36.335469', '_unique_id': '7f1997dacc7c4d649e6af18fce294e81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_after_idle_timeout-155366011>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4>, <NovaLikeServer: tempest-test_bw_limit_east_west-350327070>, <NovaLikeServer: tempest-broadcast-sender-124-598361755>]
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.latency volume: 17594172623 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.337 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.338 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.latency volume: 43777668432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.338 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.338 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.latency volume: 19377015046 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.339 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.339 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.latency volume: 17428298969 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.339 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.340 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.latency volume: 8125669 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.340 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.340 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 12356275106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.340 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.341 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.latency volume: 18531207722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.341 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21610334-c227-4af9-adf0-b969e486a632', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17594172623, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da6e3d8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '7c7161543dca250ccd988b3117046c10f6f2dd416ed012c29afa2709e0e9fb81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da6f6a2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '0b2c188a2a150640d670735a56a1e703d151ea17f924e6c96bfac351b43489f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43777668432, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da70232-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': 'bcc180bdbba80fc1b048e5c858a96a0dfe69b4c2cb92a5fa4fffbe3fb7043e6a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da70caa-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '22c07c3f0b4e3771d649fd90c64553ca2779d115d816cdad90e5f277bba93050'}, {'source': 
'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19377015046, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da71664-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '46bd8557628de9044e6562ac30e622104204b2355075cf2ce49ce19dacaeefb5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_gues
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: : 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da72294-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': 'f0e5c4417eb424ae2a24760bffafe982333ad2ca088956073a866bc827bb6e41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17428298969, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da72f50-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': '67fa4c2d8328dc137e8b626b408748de2fee03eb9ba4b23f1492c93cf25b3a84'}, {'source': 
'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da73982-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'fa793218b6c9354221be2d63cfeb5749e1ef40cf2c350283cfa44dee79b0f49f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8125669, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 
'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da74a6c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '2c66acd4ce4180c48f7a0d2c889026498a2e0f8ec2a42ab19d8bee3ce56302a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 
'disk_name': 'sda'}, 'message_id': '2da75674-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': 'c612c3fd309df4fac79c8dabd72cd74bfd2398f555669a79fa620431d213e0f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12356275106, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da76100-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': 'f0a76e97af5c28d6e0119d55b96186e3006f9e1e48ae1eafee35a73714365dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': 
{'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd72757880
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da76ca4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': 'b0b551d816b0865b1dc95c6e7d0e4cb90109a09702746efb8aaa6484e47f08cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18531207722, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da776b8-a45b-11f0-9274-fa163ef67048', 
'monotonic_time': 4099.993384654, 'message_signature': '97a3772a5f2d796834d01abe1b25ab57c57b94788d05b21659bec7f66dca5399'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.337544', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da78072-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '812a88cab345142ac99ddef554847cffcffb6c523a2fd4bd3bf8b9cd563b45c5'}]}, 'timestamp': '2025-10-08 15:26:36.341810', '_unique_id': 'e6462903ea154405befd02eec698e4e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.343 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.latency volume: 9496632090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.343 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.read.latency volume: 87519853 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.344 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.latency volume: 8103152554 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.344 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.read.latency volume: 79148645 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.344 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.latency volume: 8801372264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.344 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.read.latency volume: 70590649 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.345 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.latency volume: 9883688387 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.345 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.read.latency volume: 179357156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.345 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.latency volume: 3229717154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.345 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.latency volume: 8491552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.346 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 9525388275 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.346 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 62858344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.346 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.latency volume: 9450975799 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.346 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.latency volume: 54327544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80768081-0535-4192-bc00-da1e77908f57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9496632090, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da7d40a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '9a40b0fb75231ea04fefefd74cc82420650af6dafdea7184d5293617637d6299'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 87519853, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da7dd4c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': '8416da1cd658922e302dc0e8cf97785f530047dd1dde3e1a2bef94f565b3d4d9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8103152554, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da7e56c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': 'c36408e312704caa23a3de8ea45c78e37ca63dbb7a369cc3da1c7939062ac504'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 79148645, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da7ec88-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '79874be5c1b57cfb46c160b3b26bebfbb8a95dc8c2996f090058a52512fb9d28'}, {'source': 
'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8801372264, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da7f692-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '16d81490b914b9db32cf5170f4121c96a12cb7dedf5b4bd8801ba69670b41e8c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 70590649, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ve', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da8004c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '1bb96387dff3e31edfb48b02a0926c2f8347beaaea74e62ffaaeb15b2e854cc3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9883688387, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da80b6e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'f4397a397cd38642a2cb2f49f7f06caff2cb18007eef923174e1af60c63978dc'}, {'source': 
'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 179357156, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da8156e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': '2875260ad6a84ff11106d803f261085d3deda487447c83f54af307167106cb0f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3229717154, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 
'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da82266-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '198760abb08a367ed0ee7388859d32dc99270ba0d32c9ddd24c9e986e9e3c6a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8491552, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 
'disk_name': 'sda'}, 'message_id': '2da82bee-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '0893ee8a1e3bb79d98c91a2529a87acd06d701f5a3d38eb4f1b4bf816553c8cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9525388275, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2da83436-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '102d5e9d89dbcff4e04f120d1206d88ffbc6307265fa8591acb287779a6723fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62858344, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': 
{'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_gues
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: e-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da83b84-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '10b773e4560bdecd0baa5d2689c228a33731d771b4cc1454a4a21a7807eee2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9450975799, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 
'2da845de-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '6371939c5f13f31bd20d908f902e887f9a00f7d29acb5a95e26145393d478d13'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54327544, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.343664', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2da8504c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '6c8c454de74e28bc9f99f8b70c6898c4d7160be4ffa1fd6d41cd8f8f397a47ff'}]}, 'timestamp': '2025-10-08 15:26:36.347140', '_unique_id': 'ed261dd7a1174131a81ba7c505322038'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.349 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.incoming.bytes volume: 19745 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.349 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.bytes volume: 2666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.349 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.bytes volume: 12878 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.350 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.incoming.bytes volume: 3799 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.350 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.350 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes volume: 25431 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.350 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.bytes volume: 25191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76456039-6220-428d-a066-cbbd89277fb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19745, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2da8a7a4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '528f0ca7e725506ee1b15ff1ae1d6ff3421dadf4c3cc6bd59888f99681fce0ba'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 2666, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2da8b262-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': 'ddf6c59acb72459975e3b4ea20be1f32cf95c776d002d307bfc7046132273b85'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12878, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2da8c108-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '4e9061694801f4d7fbb3edb1a6387d90294c597de26de64966f1cf4f4141a2fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3799, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2da8ccde-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'a083bfb097257355c2e15a23da53793493dfe08cae4cd7a3d21a5ee42a6e060d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2da8d6f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'795b4bcafd63575d82359707999c8e6d909d5a0c86f65f35585042416eb5cfa9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25431, 'user_id': '843ea0278e174175a
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2da8e0f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '8666a6e9446b7e1388ff33ec90e92420f9058aec0300a3dae4598fc2cdeeacfb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25191, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.349020', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2da8ed90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'aff124d154ea94a59ec82552a971cc6a8efe70bc208550546199689822a8e202'}]}, 'timestamp': '2025-10-08 15:26:36.351166', '_unique_id': '2088c89acdd04be5917a367891470886'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.352 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.370 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/memory.usage volume: 249.1484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.385 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/memory.usage volume: 230.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.401 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/memory.usage volume: 237.515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.421 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/memory.usage volume: 308.37109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.446 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.447 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 7f1808f3-5a79-4149-84d1-7bc21eefa497: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.459 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/memory.usage volume: 237.86328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.474 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/memory.usage volume: 244.09375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5e1d56c-79c9-43d5-bf6f-e344b6aa0aa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 249.1484375, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2dabf026-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.093272545, 'message_signature': '064f3f0c39e08fc8f6143b1b2e82443692979fbb1176a1da854ca2d229848411'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 230.63671875, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 
'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2dae4948-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.108625345, 'message_signature': 'f305a5ad0cc33c642f82b6328b7e3b6c5d1edb277fd2225bd5b8fd54c52d66f6'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 237.515625, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2db0a2a6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.124084369, 'message_signature': '818d476a831a12b17489a5b52fe482c620b5edbde3f3ae5e036ea038385527bf'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 308.37109375, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2db3bfea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.144358317, 'message_signature': 'af517208df4f7ff192690eef675eef656af6a2fe28a8db274c76e5ae46879ad3'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 237.86328125, 'user_id': 
'843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2db975fc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.181993418, 'message_signature': 'd50b5b217ee12b3be52c44bfb2c721f3e0f683c65e0363cbe6f055275e429663'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 244.09375, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'timestamp': '2025-10-08T15:26:36.352880', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '2dbbdf86-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.197665579, 'message_signature': '535f8b71c37e387dcdd5019c8ba672d71105fec4dc350101a086440be050abb5'}]}, 'timestamp': '2025-10-08 15:26:36.475386', '_unique_id': 'fac8687b69404a2192c84a8f3c9bbdca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.477 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.477 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/cpu volume: 44140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.478 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/cpu volume: 39850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.478 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/cpu volume: 41630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.478 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/cpu volume: 40090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.478 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/cpu volume: 17350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.479 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/cpu volume: 44440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.479 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/cpu volume: 44140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5127c2fd-4893-4e72-b577-f339e29254f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44140000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc4d9a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.093272545, 'message_signature': '79a2251c52d49b61bea9a98f41f6f0f2230b14f471ed8dc6d56e860e45c4ed26'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39850000000, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 
'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc58a8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.108625345, 'message_signature': '65c72262d7adadd0fc5ac8adb85319fe929766cc23f3113abc118af2c7012b15'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41630000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc6280-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.124084369, 'message_signature': 'fbaf7cca77a468b5bad404551f41fd0b889de0e9f0913301f105832e6cc16200'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40090000000, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc6c80-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.144358317, 'message_signature': '8104acd1666958ec6f1f86bc6b90dfc9b566d0a341ad1239071a3ebe6a50d5c6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 
'counter_volume': 17350000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc7644-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.1698326, 'message_signature': '05b1c87e5591b7b645be2683bd2aa57c43d27ffad2dd61e2086dd88d5d5fe65e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44440000000, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: e_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc8300-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.181993418, 'message_signature': '0c7be3aa4a73b08472e9cf691d8e8b0b87156070b63065a8a7cf3354daf18b31'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44140000000, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'timestamp': '2025-10-08T15:26:36.477792', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '2dbc8dbe-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4100.197665579, 'message_signature': '3b12fdf83b69401e8bd012b33a5b695ba6f23e7cd93c2afee1404b0ef6747bda'}]}, 'timestamp': '2025-10-08 15:26:36.479778', '_unique_id': 'c5dd7559874843d1ba46d2ed484c5e23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.481 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.481 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.482 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.bytes.delta volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.482 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.bytes.delta volume: 10452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.482 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.482 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.483 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes.delta volume: 504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.483 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57c1fcf5-ec05-4b12-85bc-09b2203eaf03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2dbce46c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': 'd737bdb67efca69721bf34f5f15d76de814fded6e7664e1803dbc72d0c3d7f52'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 336, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2dbcf196-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '99b55a6bc5138af1bcee9e0d59032223505eb8721d72022ce9b683a2344ea344'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10452, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2dbcfea2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '83d5f9aec18bd831ee82c78bcb6ccb5cd8e06355c7d9216a157d17296fcf9fe5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2dbd09d8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': '862338de501c13f304dc4b1f3cb884c9f867ce2097994ace1fe4e8dfb6a6b364'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2dbd159a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'a6d12319c022dc3a49870e7a0aa8bcdfcd29a64a52cdf372fafa4a3898b45a1c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 504, 'user_id': '843ea0278e174175a6f8e21
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2dbd2102-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '1a002fe9ad7cfa4dfaec6ec3b93afcc0226869a35071c0dcd4b856860359ac3a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.481663', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2dbd2bfc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': '7b481b384f624bc11eceab29311d02dc005f1428c11f0bcf219620dc62768f37'}]}, 'timestamp': '2025-10-08 15:26:36.483837', '_unique_id': '3db1cc27e01040179a4b517ab6c85a4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.485 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.485 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.486 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.486 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.486 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.486 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.487 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bad1cdcd-dc5d-456d-a115-067a8e958128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2dbd7a3a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': 'b04edcea55cd247d1e38cb550d02b9552af7d8310ec67799fe6fb10d2c1e5098'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2dbd864c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '40629e2d10ecf17e5fcd3d8d3bba777e083f87284b3d78bbcde268b01c94fd34'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2dbd9218-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': 'd633569cdfde8d3b85c65130327d04834e9f2b47737b12c600740cbcde82d7ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2dbd9cea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': '4aca0c256f604420fd215ef28aaf571e56f1fa60ec00f1ad119e61024a25ff4a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2dbda762-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'89ee40227e7a20e5fb264155a54b8606d8d1526146fafc8d220ff4bd5e2bfa15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'p
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2dbdb2d4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '59b82be9663b8273b74f60b9c6c5ee9c2c3a87b0c575144d5b317569f5389d93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.485528', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2dbdbda6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'e430c36ca5b97ade9eb1dcabb3db67a16e0fad2cd9b78a8c3e1980e9cf58954e'}]}, 'timestamp': '2025-10-08 15:26:36.487557', '_unique_id': '383ffd5b008f4e848a0e6c33853490df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.489 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.outgoing.bytes volume: 26907 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.489 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.bytes volume: 4914 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.489 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.bytes volume: 16036 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.490 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.outgoing.bytes volume: 4919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.490 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.490 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes volume: 39375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.490 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.bytes volume: 37869 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9ea8203-0a3d-4cf7-b821-b057d687db52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26907, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2dbe09aa-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '947631e62a3ab93dbdafa64caf30d1452b343965c5da5dbde4daaaebf51023d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 4914, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2dbe1508-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '924e441c268f50b798374d4472256479860f78bbc6146a70550ee3c3a55ff9b7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16036, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2dbe20b6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': 'd8ddde5f678b5739ddc00603ee6ae025608fc5be6602c8002caaa8d3e4f8cb8c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4919, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2dbe2c6e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'dbb1206e58e215ca2848898cff1ff9780ba00233f69d03455da16897fcca3c4c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2dbe3754-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'4c52c46928c226138d135ae2703d8f4e4d5efe6c3d5cad6acdb0566433b4a7ca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 39375, 'user_id': '843ea0278e174175a6f
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: me': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2dbe41c2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': 'f0e8cc56280b36eb97966bba9354d985087790acc3dce89c519df52a59a5fdb0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 37869, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.489209', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2dbe4c30-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': '828a76c4ea658f2354012fc17534da599f88b9d8fbb7fac6b47896d142f23ee7'}]}, 'timestamp': '2025-10-08 15:26:36.491224', '_unique_id': '884151c5ffab450eafba38463d5c0334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.492 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.requests volume: 731 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.493 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.493 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.requests volume: 741 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.493 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.494 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.requests volume: 808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.494 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.494 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.requests volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.494 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.495 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.495 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.495 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 777 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.496 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.496 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.requests volume: 744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.496 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a944c77-54b2-44bb-b010-51a58677165e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 731, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbe9726-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': 'e309172f3b48092fa99d611080a58a811a3190383f7558d269ab2a4f2e33b74a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': 
'496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbea676-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.848516798, 'message_signature': 'faf026b4f1ae2d6fa5ce8bee4a81ae4de62839953de1ab1ae7969cd5422ae562'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 741, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbeb0d0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': 'df9cba773da27c32d9364162065a1f44683cead59de2f1eb97096ad41c1b8345'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbebc10-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.870615443, 'message_signature': '8ce15910b37d229e10fb13ebea290e78ff1330cfb821ab105e1f0e8116648748'}, {'source': 
'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 808, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbec8f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '85e81a841c6c9f3bb6182dc4b46de9be8952b589b26cda4d2273874d671e9a93'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ve', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbed358-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.897588025, 'message_signature': '6e13086e7d9ea861f8742635195369aaba9496c0055b3a8d33d8ef43e3ed6c4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 446, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbedd62-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': 'd275da451f800a83b03179ac46deab7df24ac121eb0148982044d0990f839d58'}, {'source': 
'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbee730-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.923604795, 'message_signature': '66492bebb81710a2d70e606673de6deb22310ffd6c37111dbc04a398e96ffeef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 
'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbef41e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '22793270ad56490640a4f37096fb3147f69f287a056afddff5ff4fda6e20d5b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 
'disk_name': 'sda'}, 'message_id': '2dbefe5a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.944692609, 'message_signature': '6c1a87556c479750dab32d129cc8fe0406849d22ce051466ddcd4560a0794036'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 777, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbf083c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '9051022fa5664b836965b9cec2da59f5fc8b08c47e15f85e24f9c7abbbf4c523'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': 
{'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest'
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbf130e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.969376798, 'message_signature': '77a6de5419fd01a16e7f0709fdb73b06ac8725340e6e13ec07e126c9b0d452d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 744, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 
'2dbf1fa2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '15c61e8806f99d11b5bf7fd5ed742de898f91efff7604c586c75cba748e3b762'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.492829', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbf298e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.993384654, 'message_signature': '344308a87ad55c47171b985a5ba3fb3ad0ac2bae79d4d75f6e76625cec278ed0'}]}, 'timestamp': '2025-10-08 15:26:36.496864', '_unique_id': '862d2ffee6ff4341b2366ccba31191a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.498 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.499 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.499 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.allocation volume: 153100288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.499 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.499 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.500 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.500 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.allocation volume: 136318976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.500 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.501 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.501 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.501 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 161484800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.501 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.502 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.allocation volume: 169873408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.502 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acd2f4d4-b5c5-4178-984b-56bb8b796d3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbf8078-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': 'eb83a4c82e8d531312b3af923eae60aa4d3bf0975a63f95e69d3020d602789c9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 
'project_name': None, 'resource_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'instance-0000001a', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbf8bc2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.738826384, 'message_signature': '00e7c6713b35f70389c8c69afdd2838aa8bbce1f2744823d05e2d9025dd1f9b0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153100288, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbf9842-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': '95d41edfdee4a67cee17ab58c2ee54823405ef653d8eb70b2c044b3ed049c637'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'instance-00000016', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbfa288-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.750870989, 'message_signature': '26574141ac54991ee7f7ca049ffc4c9f68b82835546e7b8ee399d696f56fe3db'}, {'source': 'openstack', 'counter_name': 
'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbfada0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': '62c83a9fed7d4bdc92799a1c9a1afe127b97b842073de1595e0a4ec879a580d3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '656c0a96-03f3-4a70-baac-01de2a126a91-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'instance-00000013', 'instance_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'eph
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: : '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbfb9f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.764340289, 'message_signature': '6749fe1bdee9baa09dbf92e8e70aa45f521699ea261a5ae9d8f99b34c9197661'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 136318976, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbfc43e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '771580e2220e65ada9ebd7fe10ea486329d839b33dc08d574fbe871971df5383'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'instance-0000001e', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbfce0c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.776822858, 'message_signature': '7e8a55ca73d2fab76211549273007a3456e98547e0983a77127309a791e4b371'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbfd8d4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 'message_signature': '66636fe0958a2f26d6b35c4713b851b9007bbee08fbc291cf6f0b1dd72763c2f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 
'2dbfe52c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.786824717, 'message_signature': 'd2bd2f8c781af7d58f2582e24f614a6aa94dbcb4e6722bd2b77f43d232350721'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 161484800, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dbfef36-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': '2ad9a70dc9d56672a18ec6b4e379078c79b9773f17123b11a778e6c3aa953a51'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 
'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dbff9d6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.797424806, 'message_signature': '64457cfd8fc266ac57f27597009f3bee7115c2ebcf3c94ab391542f2ac1c5f0e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169873408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2dc0049e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': 
'3f85c2e5eb265485014e1a00eafc5842c5c71ec5554c630ab7a151a51f5cf71f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:26:36.498803', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2dc010d8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.807938732, 'message_signature': '8344f36dc9b74ee303b78818421328c57259c7c61f1034a46ca387e32277d86d'}]}, 'timestamp': '2025-10-08 15:26:36.502788', '_unique_id': '13a8a4c77f444411a772e0c74075c17a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.504 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.505 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.505 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.505 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.506 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.506 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.506 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63269fe3-0e03-411a-be94-725d235ec462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2dc06ae2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '00302aaefdb6e9c30fce16375a02a59ef5cd503aa7cc9a4dd5e6549da74e77b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2dc0778a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '8a1eb33ec2998f6fac0c8e67003076932df60da494c220fa85a27d1342fcbc12'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2dc084f0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '8db49ef62a48fa97e3f58dbd72a060aaa0eb92c441c41b46d646795c7dc270b3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2dc090c6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': '6f3597dabd2d8b184277b34fce445a074c696adebc2013f98753a7f93ab22f6c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2dc09cba-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'5b7145928a2c1b070c5031d8650c02050829731e69388626a2da695bf1e358b4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'p
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2dc0a9b2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '5aa74ccb05f19dd2243ba873715fc19f1f6b46cf1703332d7c05d761bd28336b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.504795', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2dc0b4c0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': '507116cc6b5ffb098702b7a2fa072f4faec3e588c43fa5395b6c204e0e058396'}]}, 'timestamp': '2025-10-08 15:26:36.506992', '_unique_id': 'dc8fb411f8dc4aa6bb9f4df30f6497a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.508 12 DEBUG ceilometer.compute.pollsters [-] 6efc9ea0-184c-46cc-aeb5-e2759e10e398/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.509 12 DEBUG ceilometer.compute.pollsters [-] e36dd986-15d5-466e-93d6-dc7b4483c8e9/network.outgoing.bytes.delta volume: 1734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.509 12 DEBUG ceilometer.compute.pollsters [-] 656c0a96-03f3-4a70-baac-01de2a126a91/network.outgoing.bytes.delta volume: 11330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.509 12 DEBUG ceilometer.compute.pollsters [-] cefc7b22-5a31-4d0c-bb25-462153dfc427/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.510 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.510 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes.delta volume: 1812 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.510 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb8e3b67-7be2-44d2-9f99-ccc48e699f1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-0000001a-6efc9ea0-184c-46cc-aeb5-e2759e10e398-tap36047ed0-01', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-155366011', 'name': 'tap36047ed0-01', 'instance_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a3:d0:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap36047ed0-01'}, 'message_id': '2dc10628-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.822940091, 'message_signature': '7e6d7a6286f1c17c0ab847c01ab13ef2b13557f9d86ecbae2ab17ef6e9a3cc87'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 1734, 'user_id': 'f03335a379bd4afdbbd7b9cc7cae27e0', 'user_name': None, 'project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'project_name': None, 'resource_id': 'instance-00000016-e36dd986-15d5-466e-93d6-dc7b4483c8e9-tap27016abf-08', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986', 'name': 'tap27016abf-08', 'instance_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'instance_type': 'custom_neutron_guest', 'host': 'f55b0420c3b664921c79e89c6ffeebd134f6f367da341cd2595242f0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:fe:38:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27016abf-08'}, 'message_id': '2dc11294-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.826226446, 'message_signature': '3180fbfab9374a11386ffce6fecfafd099134bfb86574493a7d8692b0907f76e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 11330, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000013-656c0a96-03f3-4a70-baac-01de2a126a91-tap59f58b79-91', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-test_multicast_after_idle_timeout-135618235', 'name': 'tap59f58b79-91', 'instance_id': 
'656c0a96-03f3-4a70-baac-01de2a126a91', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:5f:94:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap59f58b79-91'}, 'message_id': '2dc11d8e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.829207021, 'message_signature': '85f47690e9768f27e6afa9454897f89f7b457e0938fe12c4d3c54bdb18de819f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000001e-cefc7b22-5a31-4d0c-bb25-462153dfc427-tapcae08d04-f9', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'name': 'tapcae08d04-f9', 'instance_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:16:82:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcae08d04-f9'}, 'message_id': '2dc12ba8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.832067682, 'message_signature': 'ec353e4598967386a6c52eaceb9a416f314595cd0bc6792a8f19c17bc8da74ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '2dc13760-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.834697456, 'message_signature': 
'a4b8e9ddda937c33dbd178668e553cf5d2795121646bae4c29656eddc21556fc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1812, 'user_id': '843ea0278e174175a6f8e
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: ': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '2dc14250-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.837753404, 'message_signature': '9644eea5b0ded21d5b6498cd9199e26c684dc9c4b2068a5dd3070680618192b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:26:36.508775', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '2dc14f52-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4099.840222643, 'message_signature': 'e60c91357ccf4b70dbdc6e5d484f0a946c7e40f288acc5f04dd951f38003f4f9'}]}, 'timestamp': '2025-10-08 15:26:36.510948', '_unique_id': '4ef6524e792942fc84762a62efffb168'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.311 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is:  'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111- [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.315 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: ': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.319 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.322 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.326 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: ask_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.331 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.336 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.342 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: : 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-11111111 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.347 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: ve', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.351 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.476 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.677 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.678 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.679 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.680 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.480 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.484 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.488 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.491 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.497 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: ve', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.503 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: : '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111 [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.507 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:26:36.511 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.777 2 INFO nova.compute.manager [None req-07e25945-40fd-4606-94d0-9469b994c424 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.782 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.854 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.931 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:36 np0005476733 nova_compute[192580]: 2025-10-08 15:26:36.933 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.002 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.007 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.059 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.061 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.114 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.122 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.178 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.179 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.234 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.246 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.308 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.309 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.360 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.368 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.425 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.427 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.486 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.497 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.559 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.560 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.623 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.630 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.691 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.692 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.745 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.955 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.957 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=8430MB free_disk=110.46952056884766GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.957 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:26:37 np0005476733 nova_compute[192580]: 2025-10-08 15:26:37.958 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.075 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 656c0a96-03f3-4a70-baac-01de2a126a91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.076 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 6efc9ea0-184c-46cc-aeb5-e2759e10e398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.076 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.076 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1cbc4434-d89a-483d-a1f2-299190262888 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.077 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e36dd986-15d5-466e-93d6-dc7b4483c8e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.077 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance cefc7b22-5a31-4d0c-bb25-462153dfc427 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.077 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7f1808f3-5a79-4149-84d1-7bc21eefa497 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.077 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.078 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=7680MB phys_disk=119GB used_disk=70GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.229 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.252 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:26:38 np0005476733 podman[228294]: 2025-10-08 15:26:38.277044571 +0000 UTC m=+0.097340990 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.309 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.310 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:26:38 np0005476733 nova_compute[192580]: 2025-10-08 15:26:38.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:41 np0005476733 podman[228356]: 2025-10-08 15:26:41.279740951 +0000 UTC m=+0.083748896 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:26:41 np0005476733 podman[228355]: 2025-10-08 15:26:41.282676695 +0000 UTC m=+0.097258657 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:26:41 np0005476733 nova_compute[192580]: 2025-10-08 15:26:41.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:41 np0005476733 nova_compute[192580]: 2025-10-08 15:26:41.310 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:41 np0005476733 nova_compute[192580]: 2025-10-08 15:26:41.311 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:41 np0005476733 nova_compute[192580]: 2025-10-08 15:26:41.933 2 INFO nova.compute.manager [None req-dd12c989-e815-4e67-ace7-2be381113f37 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:41 np0005476733 nova_compute[192580]: 2025-10-08 15:26:41.939 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:42 np0005476733 nova_compute[192580]: 2025-10-08 15:26:42.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:42 np0005476733 nova_compute[192580]: 2025-10-08 15:26:42.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:42Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:90:51 192.168.2.175
Oct  8 11:26:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:42Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:90:51 192.168.2.175
Oct  8 11:26:43 np0005476733 nova_compute[192580]: 2025-10-08 15:26:43.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:44 np0005476733 nova_compute[192580]: 2025-10-08 15:26:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:44 np0005476733 nova_compute[192580]: 2025-10-08 15:26:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:26:45 np0005476733 nova_compute[192580]: 2025-10-08 15:26:45.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:46 np0005476733 nova_compute[192580]: 2025-10-08 15:26:46.256 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937191.2558494, 4f1d2adc-1ecb-45dc-83a0-c2369028e487 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:26:46 np0005476733 nova_compute[192580]: 2025-10-08 15:26:46.257 2 INFO nova.compute.manager [-] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:26:46 np0005476733 nova_compute[192580]: 2025-10-08 15:26:46.297 2 DEBUG nova.compute.manager [None req-039ed941-2aa7-409e-8eab-ffc415b53a72 - - - - - -] [instance: 4f1d2adc-1ecb-45dc-83a0-c2369028e487] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:26:46 np0005476733 nova_compute[192580]: 2025-10-08 15:26:46.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:47 np0005476733 nova_compute[192580]: 2025-10-08 15:26:47.128 2 INFO nova.compute.manager [None req-ed38c84d-d349-420e-8f52-90e2aff42a2b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:47 np0005476733 nova_compute[192580]: 2025-10-08 15:26:47.133 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:47 np0005476733 nova_compute[192580]: 2025-10-08 15:26:47.137 2 INFO nova.virt.libvirt.driver [None req-ed38c84d-d349-420e-8f52-90e2aff42a2b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Truncated console log returned, 21 bytes ignored#033[00m
Oct  8 11:26:48 np0005476733 nova_compute[192580]: 2025-10-08 15:26:48.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:49 np0005476733 podman[228408]: 2025-10-08 15:26:49.219868655 +0000 UTC m=+0.047403695 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:26:49 np0005476733 podman[228407]: 2025-10-08 15:26:49.223607004 +0000 UTC m=+0.052216379 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:26:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:49Z|00239|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:26:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:49Z|00240|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:26:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:49Z|00241|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:26:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:49Z|00242|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:26:49 np0005476733 nova_compute[192580]: 2025-10-08 15:26:49.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:50 np0005476733 nova_compute[192580]: 2025-10-08 15:26:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:50 np0005476733 nova_compute[192580]: 2025-10-08 15:26:50.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:26:51 np0005476733 nova_compute[192580]: 2025-10-08 15:26:51.238 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:51 np0005476733 nova_compute[192580]: 2025-10-08 15:26:51.239 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:51 np0005476733 nova_compute[192580]: 2025-10-08 15:26:51.239 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:26:51 np0005476733 nova_compute[192580]: 2025-10-08 15:26:51.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:52 np0005476733 nova_compute[192580]: 2025-10-08 15:26:52.283 2 INFO nova.compute.manager [None req-4f85d427-7286-461d-9336-8b265edce06e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Get console output#033[00m
Oct  8 11:26:52 np0005476733 nova_compute[192580]: 2025-10-08 15:26:52.289 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:26:52 np0005476733 nova_compute[192580]: 2025-10-08 15:26:52.295 2 INFO nova.virt.libvirt.driver [None req-4f85d427-7286-461d-9336-8b265edce06e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Truncated console log returned, 3254 bytes ignored#033[00m
Oct  8 11:26:53 np0005476733 nova_compute[192580]: 2025-10-08 15:26:53.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:54 np0005476733 nova_compute[192580]: 2025-10-08 15:26:54.270 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updating instance_info_cache with network_info: [{"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:54 np0005476733 nova_compute[192580]: 2025-10-08 15:26:54.393 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e36dd986-15d5-466e-93d6-dc7b4483c8e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:26:54 np0005476733 nova_compute[192580]: 2025-10-08 15:26:54.394 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:26:54 np0005476733 nova_compute[192580]: 2025-10-08 15:26:54.394 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:26:56 np0005476733 nova_compute[192580]: 2025-10-08 15:26:56.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:57 np0005476733 nova_compute[192580]: 2025-10-08 15:26:57.144 2 DEBUG nova.compute.manager [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-changed-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:26:57 np0005476733 nova_compute[192580]: 2025-10-08 15:26:57.145 2 DEBUG nova.compute.manager [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing instance network info cache due to event network-changed-b5af459f-569f-4ca4-86fe-d2d018227a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:26:57 np0005476733 nova_compute[192580]: 2025-10-08 15:26:57.145 2 DEBUG oslo_concurrency.lockutils [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:26:57 np0005476733 nova_compute[192580]: 2025-10-08 15:26:57.146 2 DEBUG oslo_concurrency.lockutils [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:26:57 np0005476733 nova_compute[192580]: 2025-10-08 15:26:57.146 2 DEBUG nova.network.neutron [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing network info cache for port b5af459f-569f-4ca4-86fe-d2d018227a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:26:58 np0005476733 podman[228464]: 2025-10-08 15:26:58.254514595 +0000 UTC m=+0.073243001 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:26:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:58Z|00243|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:26:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:58Z|00244|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:26:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:58Z|00245|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:26:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:26:58Z|00246|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:26:58 np0005476733 nova_compute[192580]: 2025-10-08 15:26:58.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:58 np0005476733 nova_compute[192580]: 2025-10-08 15:26:58.870 2 DEBUG nova.network.neutron [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updated VIF entry in instance network info cache for port b5af459f-569f-4ca4-86fe-d2d018227a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:26:58 np0005476733 nova_compute[192580]: 2025-10-08 15:26:58.871 2 DEBUG nova.network.neutron [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:26:58 np0005476733 nova_compute[192580]: 2025-10-08 15:26:58.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:26:58 np0005476733 nova_compute[192580]: 2025-10-08 15:26:58.941 2 DEBUG oslo_concurrency.lockutils [req-f71d7e9e-298a-4665-ad2c-471474a375d3 req-c51fed45-935d-4af2-9956-6f28482dffe8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:01 np0005476733 nova_compute[192580]: 2025-10-08 15:27:01.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:02 np0005476733 podman[228486]: 2025-10-08 15:27:02.268935848 +0000 UTC m=+0.088782026 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 11:27:02 np0005476733 podman[228485]: 2025-10-08 15:27:02.279401752 +0000 UTC m=+0.107677959 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:27:03 np0005476733 nova_compute[192580]: 2025-10-08 15:27:03.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.694 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.694 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.695 2 INFO nova.compute.manager [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Rebooting instance#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.733 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.733 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquired lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:04 np0005476733 nova_compute[192580]: 2025-10-08 15:27:04.733 2 DEBUG nova.network.neutron [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00247|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00248|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00249|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00250|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00251|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00252|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00253|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:05Z|00254|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:27:05 np0005476733 nova_compute[192580]: 2025-10-08 15:27:05.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:06 np0005476733 nova_compute[192580]: 2025-10-08 15:27:06.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:08 np0005476733 nova_compute[192580]: 2025-10-08 15:27:08.745 2 DEBUG nova.network.neutron [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updating instance_info_cache with network_info: [{"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:08 np0005476733 nova_compute[192580]: 2025-10-08 15:27:08.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:08 np0005476733 nova_compute[192580]: 2025-10-08 15:27:08.957 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Releasing lock "refresh_cache-cefc7b22-5a31-4d0c-bb25-462153dfc427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:08 np0005476733 nova_compute[192580]: 2025-10-08 15:27:08.958 2 DEBUG nova.compute.manager [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:09 np0005476733 podman[228557]: 2025-10-08 15:27:09.221187311 +0000 UTC m=+0.050920615 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, 
container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:27:10 np0005476733 kernel: tapcae08d04-f9 (unregistering): left promiscuous mode
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.1754] device (tapcae08d04-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00255|binding|INFO|Releasing lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 from this chassis (sb_readonly=0)
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00256|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 down in Southbound
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00257|binding|INFO|Removing iface tapcae08d04-f9 ovn-installed in OVS
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.229 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:82:23 192.168.2.168'], port_security=['fa:16:3e:16:82:23 192.168.2.168'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:cidrs': '192.168.2.168/24', 'neutron:device_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c022ba9-08a2-40a7-896d-13c1538d7064', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ea236cb-dec7-48d3-a1ef-7ce9f1bd90ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cae08d04-f9a8-46ee-ba57-0a0db94ae186) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.230 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cae08d04-f9a8-46ee-ba57-0a0db94ae186 in datapath 9c022ba9-08a2-40a7-896d-13c1538d7064 unbound from our chassis#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.234 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c022ba9-08a2-40a7-896d-13c1538d7064, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.235 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[01fc680c-0afe-4aed-855c-fad11416246b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct  8 11:27:10 np0005476733 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001e.scope: Consumed 44.654s CPU time.
Oct  8 11:27:10 np0005476733 systemd-machined[152624]: Machine qemu-19-instance-0000001e terminated.
Oct  8 11:27:10 np0005476733 kernel: tapcae08d04-f9: entered promiscuous mode
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.4192] manager: (tapcae08d04-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Oct  8 11:27:10 np0005476733 kernel: tapcae08d04-f9 (unregistering): left promiscuous mode
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00258|binding|INFO|Claiming lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 for this chassis.
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00259|binding|INFO|cae08d04-f9a8-46ee-ba57-0a0db94ae186: Claiming fa:16:3e:16:82:23 192.168.2.168
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00260|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 ovn-installed in OVS
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00261|if_status|INFO|Dropped 2 log messages in last 231 seconds (most recently, 231 seconds ago) due to excessive rate
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00262|if_status|INFO|Not setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 down as sb is readonly
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.491 2 INFO nova.virt.libvirt.driver [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance shutdown successfully.#033[00m
Oct  8 11:27:10 np0005476733 kernel: tapcae08d04-f9: entered promiscuous mode
Oct  8 11:27:10 np0005476733 systemd-udevd[228588]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.5601] manager: (tapcae08d04-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00263|if_status|INFO|Not updating pb chassis for cae08d04-f9a8-46ee-ba57-0a0db94ae186 now as sb is readonly
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.5711] device (tapcae08d04-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.5718] device (tapcae08d04-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00264|binding|INFO|Removing lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 ovn-installed in OVS
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00265|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 ovn-installed in OVS
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00266|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 up in Southbound
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.592 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:82:23 192.168.2.168'], port_security=['fa:16:3e:16:82:23 192.168.2.168'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:cidrs': '192.168.2.168/24', 'neutron:device_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c022ba9-08a2-40a7-896d-13c1538d7064', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ea236cb-dec7-48d3-a1ef-7ce9f1bd90ad, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cae08d04-f9a8-46ee-ba57-0a0db94ae186) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.594 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cae08d04-f9a8-46ee-ba57-0a0db94ae186 in datapath 9c022ba9-08a2-40a7-896d-13c1538d7064 bound to our chassis#033[00m
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.598 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c022ba9-08a2-40a7-896d-13c1538d7064#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.608 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[80b34c85-8ff4-4a73-b45a-6c5f3a3d391e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.609 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c022ba9-01 in ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.612 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c022ba9-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.612 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55134525-6b84-4a61-b2bf-1e578f0fd672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.613 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d47d94-f6a6-40f9-aa36-d1d2da85ca6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 systemd-machined[152624]: New machine qemu-21-instance-0000001e.
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.625 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[9914853a-a132-4795-8096-f552ff070f16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 systemd[1]: Started Virtual Machine qemu-21-instance-0000001e.
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.651 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6229b2d4-716c-4ea9-905d-c3033953c0b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.691 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a38a8488-1153-4aa3-af6d-9148a458471f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.697 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49a2e88b-39df-48b3-aa18-92c6df922aff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.6986] manager: (tap9c022ba9-00): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.735 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e037b69c-1230-4807-80c2-dc585ae07b88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.738 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e6946198-af2f-4ae6-99a7-789d73cf3aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.7635] device (tap9c022ba9-00): carrier: link connected
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.773 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f33a25d6-a0ea-4757-8c7d-e57fbd47fbb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.790 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd26907-d238-4ca2-8de0-2f5c3879372b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c022ba9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:0f:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413442, 'reachable_time': 16322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228649, 'error': None, 'target': 'ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.808 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed2002d-d0cd-4028-879a-a2641d20011f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:ff1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413442, 'tstamp': 413442}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228650, 'error': None, 'target': 'ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.824 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[22164001-9f6f-420a-bc36-68ec08a3100b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c022ba9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:0f:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413442, 'reachable_time': 16322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228651, 'error': None, 'target': 'ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.864 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33fb0d59-87a3-46cc-a012-61a08c6c0c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.925 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f5738cf6-0c43-4fe1-9fec-c99d29145d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.926 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c022ba9-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.926 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.927 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c022ba9-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:10 np0005476733 NetworkManager[51699]: <info>  [1759937230.9661] manager: (tap9c022ba9-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 kernel: tap9c022ba9-00: entered promiscuous mode
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.971 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c022ba9-00, col_values=(('external_ids', {'iface-id': 'a798b21b-d37f-4eaa-be10-bf865d0421dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:10Z|00267|binding|INFO|Releasing lport a798b21b-d37f-4eaa-be10-bf865d0421dd from this chassis (sb_readonly=0)
Oct  8 11:27:10 np0005476733 nova_compute[192580]: 2025-10-08 15:27:10.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.991 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c022ba9-08a2-40a7-896d-13c1538d7064.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c022ba9-08a2-40a7-896d-13c1538d7064.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.993 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bed699-f3da-4ca2-85ee-093e99e641fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.994 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-9c022ba9-08a2-40a7-896d-13c1538d7064
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/9c022ba9-08a2-40a7-896d-13c1538d7064.pid.haproxy
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 9c022ba9-08a2-40a7-896d-13c1538d7064
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:27:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:10.995 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064', 'env', 'PROCESS_TAG=haproxy-9c022ba9-08a2-40a7-896d-13c1538d7064', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c022ba9-08a2-40a7-896d-13c1538d7064.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.343 2 DEBUG nova.compute.manager [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.344 2 DEBUG oslo_concurrency.lockutils [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.344 2 DEBUG oslo_concurrency.lockutils [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.344 2 DEBUG oslo_concurrency.lockutils [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.345 2 DEBUG nova.compute.manager [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.345 2 WARNING nova.compute.manager [req-1f977bcd-60ad-4bc0-9226-77327786c3c8 req-87a5acfd-9358-4e79-9c34-52664d78bdc0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state reboot_started.#033[00m
Oct  8 11:27:11 np0005476733 podman[228690]: 2025-10-08 15:27:11.356118606 +0000 UTC m=+0.051602567 container create f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:27:11 np0005476733 systemd[1]: Started libpod-conmon-f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c.scope.
Oct  8 11:27:11 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:27:11 np0005476733 podman[228690]: 2025-10-08 15:27:11.327668234 +0000 UTC m=+0.023152195 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:27:11 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db5a2fca3107ec5c1b5c64dbd2fcbe9d16d8a9b35b42ce6742c5f8833258229/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:27:11 np0005476733 podman[228690]: 2025-10-08 15:27:11.442654728 +0000 UTC m=+0.138138679 container init f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:27:11 np0005476733 podman[228690]: 2025-10-08 15:27:11.449674681 +0000 UTC m=+0.145158632 container start f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:27:11 np0005476733 podman[228704]: 2025-10-08 15:27:11.465917086 +0000 UTC m=+0.071266990 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 11:27:11 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [NOTICE]   (228751) : New worker (228754) forked
Oct  8 11:27:11 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [NOTICE]   (228751) : Loading success.
Oct  8 11:27:11 np0005476733 podman[228707]: 2025-10-08 15:27:11.475019903 +0000 UTC m=+0.080572484 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.523 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Removed pending event for cefc7b22-5a31-4d0c-bb25-462153dfc427 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.524 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937231.5236168, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.524 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.530 2 INFO nova.virt.libvirt.driver [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance running successfully.#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.530 2 INFO nova.virt.libvirt.driver [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance soft rebooted successfully.#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.531 2 DEBUG nova.compute.manager [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.748 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.752 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.824 2 DEBUG oslo_concurrency.lockutils [None req-5ac29c08-c55b-4272-9f30-8f15712a6657 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.870 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937231.527627, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:11 np0005476733 nova_compute[192580]: 2025-10-08 15:27:11.870 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Started (Lifecycle Event)#033[00m
Oct  8 11:27:12 np0005476733 nova_compute[192580]: 2025-10-08 15:27:12.062 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:12 np0005476733 nova_compute[192580]: 2025-10-08 15:27:12.068 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.509 2 DEBUG nova.compute.manager [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.510 2 DEBUG oslo_concurrency.lockutils [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.511 2 DEBUG oslo_concurrency.lockutils [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.511 2 DEBUG oslo_concurrency.lockutils [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.512 2 DEBUG nova.compute.manager [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.512 2 WARNING nova.compute.manager [req-ced04f11-334f-4295-b732-2a50d3ee5ff3 req-619faf20-ac67-4e1b-afab-938a2c459045 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:13 np0005476733 nova_compute[192580]: 2025-10-08 15:27:13.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.744 2 DEBUG nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.744 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.744 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.745 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.745 2 DEBUG nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.745 2 WARNING nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.745 2 DEBUG nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.746 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.746 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.746 2 DEBUG oslo_concurrency.lockutils [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.746 2 DEBUG nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:15 np0005476733 nova_compute[192580]: 2025-10-08 15:27:15.746 2 WARNING nova.compute.manager [req-271d63c8-8b1d-4d47-8454-bd3084b7af51 req-d3e37150-e121-429c-bbe8-4c3296b96bdd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:16 np0005476733 nova_compute[192580]: 2025-10-08 15:27:16.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:18Z|00268|binding|INFO|Releasing lport a798b21b-d37f-4eaa-be10-bf865d0421dd from this chassis (sb_readonly=0)
Oct  8 11:27:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:18Z|00269|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:27:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:18Z|00270|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:27:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:18Z|00271|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:27:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:18Z|00272|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:27:18 np0005476733 nova_compute[192580]: 2025-10-08 15:27:18.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:18 np0005476733 nova_compute[192580]: 2025-10-08 15:27:18.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:20.027 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:20.028 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:27:20 np0005476733 nova_compute[192580]: 2025-10-08 15:27:20.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:20 np0005476733 podman[228782]: 2025-10-08 15:27:20.217173031 +0000 UTC m=+0.049151310 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 11:27:20 np0005476733 podman[228783]: 2025-10-08 15:27:20.240047316 +0000 UTC m=+0.064269208 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:27:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:20Z|00273|pinctrl|WARN|Dropped 5459 log messages in last 57 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:27:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:20Z|00274|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:27:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:21.031 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:21 np0005476733 nova_compute[192580]: 2025-10-08 15:27:21.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:23 np0005476733 nova_compute[192580]: 2025-10-08 15:27:23.787 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "interface-7f1808f3-5a79-4149-84d1-7bc21eefa497-7ad20ed3-8502-40cd-84e3-773d77da33ae" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:23 np0005476733 nova_compute[192580]: 2025-10-08 15:27:23.788 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-7f1808f3-5a79-4149-84d1-7bc21eefa497-7ad20ed3-8502-40cd-84e3-773d77da33ae" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:23 np0005476733 nova_compute[192580]: 2025-10-08 15:27:23.789 2 DEBUG nova.objects.instance [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'flavor' on Instance uuid 7f1808f3-5a79-4149-84d1-7bc21eefa497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:23 np0005476733 nova_compute[192580]: 2025-10-08 15:27:23.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:25 np0005476733 nova_compute[192580]: 2025-10-08 15:27:25.427 2 DEBUG nova.objects.instance [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7f1808f3-5a79-4149-84d1-7bc21eefa497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:25 np0005476733 nova_compute[192580]: 2025-10-08 15:27:25.510 2 DEBUG nova.network.neutron [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:27:26 np0005476733 nova_compute[192580]: 2025-10-08 15:27:26.141 2 DEBUG nova.policy [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:26.309 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:26.310 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:26.311 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:26 np0005476733 nova_compute[192580]: 2025-10-08 15:27:26.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.043 2 DEBUG nova.network.neutron [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Successfully updated port: 7ad20ed3-8502-40cd-84e3-773d77da33ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.059 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.060 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.060 2 DEBUG nova.network.neutron [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.393 2 DEBUG nova.compute.manager [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-changed-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.393 2 DEBUG nova.compute.manager [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing instance network info cache due to event network-changed-7ad20ed3-8502-40cd-84e3-773d77da33ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.394 2 DEBUG oslo_concurrency.lockutils [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:28Z|00275|binding|INFO|Releasing lport a798b21b-d37f-4eaa-be10-bf865d0421dd from this chassis (sb_readonly=0)
Oct  8 11:27:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:28Z|00276|binding|INFO|Releasing lport 1e0c4d29-d963-4fdf-8ca6-0153967de16b from this chassis (sb_readonly=0)
Oct  8 11:27:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:28Z|00277|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:27:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:28Z|00278|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:27:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:28Z|00279|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:28 np0005476733 nova_compute[192580]: 2025-10-08 15:27:28.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.196 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.196 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.221 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:27:29 np0005476733 podman[228829]: 2025-10-08 15:27:29.241853991 +0000 UTC m=+0.070208396 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.329 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.330 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.340 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.341 2 INFO nova.compute.claims [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.601 2 DEBUG nova.compute.provider_tree [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.624 2 DEBUG nova.scheduler.client.report [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.653 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.654 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.709 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.710 2 DEBUG nova.network.neutron [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.737 2 INFO nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.760 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.863 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.864 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.864 2 INFO nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Creating image(s)#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.865 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.865 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.866 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.883 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.958 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.959 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.960 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:29 np0005476733 nova_compute[192580]: 2025-10-08 15:27:29.978 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.046 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.047 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.084 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk 10737418240" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.085 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.086 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.123 2 DEBUG nova.network.neutron [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.143 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.144 2 DEBUG nova.objects.instance [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'migration_context' on Instance uuid 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.147 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.148 2 DEBUG oslo_concurrency.lockutils [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.148 2 DEBUG nova.network.neutron [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing network info cache for port 7ad20ed3-8502-40cd-84e3-773d77da33ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.151 2 DEBUG nova.virt.libvirt.vif [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:26:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:26:18Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.152 2 DEBUG nova.network.os_vif_util [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.153 2 DEBUG nova.network.os_vif_util [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.153 2 DEBUG os_vif [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.154 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.154 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ad20ed3-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ad20ed3-85, col_values=(('external_ids', {'iface-id': '7ad20ed3-8502-40cd-84e3-773d77da33ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:ec:86', 'vm-uuid': '7f1808f3-5a79-4149-84d1-7bc21eefa497'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.1610] manager: (tap7ad20ed3-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.163 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.163 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Ensure instance console log exists: /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.163 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.164 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.164 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.170 2 INFO os_vif [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85')#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.170 2 DEBUG nova.virt.libvirt.vif [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:26:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:26:18Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.171 2 DEBUG nova.network.os_vif_util [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.171 2 DEBUG nova.network.os_vif_util [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.174 2 DEBUG nova.virt.libvirt.guest [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] attach device xml: <interface type="ethernet">
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:62:ec:86"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <target dev="tap7ad20ed3-85"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:27:30 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 11:27:30 np0005476733 kernel: tap7ad20ed3-85: entered promiscuous mode
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00280|binding|INFO|Claiming lport 7ad20ed3-8502-40cd-84e3-773d77da33ae for this chassis.
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00281|binding|INFO|7ad20ed3-8502-40cd-84e3-773d77da33ae: Claiming fa:16:3e:62:ec:86 10.100.0.3
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.1917] manager: (tap7ad20ed3-85): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.190 2 DEBUG nova.policy [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.199 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:ec:86 10.100.0.3'], port_security=['fa:16:3e:62:ec:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=7ad20ed3-8502-40cd-84e3-773d77da33ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.201 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad20ed3-8502-40cd-84e3-773d77da33ae in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 bound to our chassis#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.207 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00282|binding|INFO|Setting lport 7ad20ed3-8502-40cd-84e3-773d77da33ae ovn-installed in OVS
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00283|binding|INFO|Setting lport 7ad20ed3-8502-40cd-84e3-773d77da33ae up in Southbound
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.222 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[346d95b9-e6d4-4f9b-9635-9b62fb4ba72a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.223 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.225 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.225 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc14547-d36a-4dac-ada5-35aa43f03781]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.227 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6208cf64-cf73-4d97-8e4c-5f9c4ac7a358]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 systemd-udevd[228867]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.245 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f891c1-9b04-4a99-ab2a-c0a6e67d739b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.2546] device (tap7ad20ed3-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.2554] device (tap7ad20ed3-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.259 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ed8d2d-8883-4d23-8ae7-b5c3e8a167bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.288 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4c918d-723d-48ad-956d-0272cfcb9280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.2991] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.300 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33c1c901-58ff-43b0-953b-b88098831d25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 systemd-udevd[228871]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.308 2 DEBUG nova.virt.libvirt.driver [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.309 2 DEBUG nova.virt.libvirt.driver [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.309 2 DEBUG nova.virt.libvirt.driver [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:4e:90:51, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.310 2 DEBUG nova.virt.libvirt.driver [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:62:ec:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.337 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[997cba5b-e377-4464-a8b2-cf09f12590d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.341 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[98a833c0-56c1-40d8-a278-a6cce6908554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:ec:86 10.100.0.3
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:ec:86 10.100.0.3
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.3659] device (tap58a69152-b0): carrier: link connected
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.371 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7cc3bf-c47f-41a8-90dd-ec9900187f14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.388 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a64c7c6c-0e4a-48d9-8ab1-bbc4becd55fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415403, 'reachable_time': 37843, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228895, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.399 2 DEBUG nova.virt.libvirt.guest [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:name>tempest-test_bw_limit_east_west-350327070</nova:name>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 15:27:30</nova:creationTime>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:port uuid="b5af459f-569f-4ca4-86fe-d2d018227a96">
Oct  8 11:27:30 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="192.168.2.175" ipVersion="4"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    <nova:port uuid="7ad20ed3-8502-40cd-84e3-773d77da33ae">
Oct  8 11:27:30 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:27:30 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 11:27:30 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 11:27:30 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.405 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a73bed60-2b59-4450-8489-824bd0b1b63e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415403, 'tstamp': 415403}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228896, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.421 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[08b2bfde-5975-4078-9398-5b21eb377d4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415403, 'reachable_time': 37843, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228897, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.428 2 DEBUG oslo_concurrency.lockutils [None req-3979450d-5241-47bb-9253-d2a9f496df7a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-7f1808f3-5a79-4149-84d1-7bc21eefa497-7ad20ed3-8502-40cd-84e3-773d77da33ae" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.449 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[28d838d4-fbdc-44fd-a0da-a5bf5efeb9d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.512 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b61a72fc-2a3c-4ade-940e-8a758bc6fc9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.513 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.514 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.514 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 NetworkManager[51699]: <info>  [1759937250.5172] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.520 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:30Z|00284|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:27:30 np0005476733 nova_compute[192580]: 2025-10-08 15:27:30.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.534 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.535 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[60a57274-afb3-4e59-9d33-910e234b0d50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.536 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:27:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:30.537 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:27:31 np0005476733 podman[228929]: 2025-10-08 15:27:31.006979916 +0000 UTC m=+0.065691803 container create 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:27:31 np0005476733 systemd[1]: Started libpod-conmon-888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6.scope.
Oct  8 11:27:31 np0005476733 podman[228929]: 2025-10-08 15:27:30.978672789 +0000 UTC m=+0.037384716 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:27:31 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:27:31 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cd9da7bb5746506a062b8c51415ce13f87e4ac881c7d8dbc0e78502a9256580/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:27:31 np0005476733 podman[228929]: 2025-10-08 15:27:31.108419221 +0000 UTC m=+0.167131198 container init 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:27:31 np0005476733 podman[228929]: 2025-10-08 15:27:31.114028229 +0000 UTC m=+0.172740156 container start 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:27:31 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [NOTICE]   (228949) : New worker (228951) forked
Oct  8 11:27:31 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [NOTICE]   (228949) : Loading success.
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.376 2 DEBUG nova.network.neutron [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Successfully updated port: 92e6817e-732a-4e42-973e-2d26e62163f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.395 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.395 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquired lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.395 2 DEBUG nova.network.neutron [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.527 2 DEBUG nova.compute.manager [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-changed-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.527 2 DEBUG nova.compute.manager [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Refreshing instance network info cache due to event network-changed-92e6817e-732a-4e42-973e-2d26e62163f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.527 2 DEBUG oslo_concurrency.lockutils [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:31 np0005476733 nova_compute[192580]: 2025-10-08 15:27:31.569 2 DEBUG nova.network.neutron [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.363 2 DEBUG nova.network.neutron [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updating instance_info_cache with network_info: [{"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.391 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Releasing lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.392 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Instance network_info: |[{"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.393 2 DEBUG oslo_concurrency.lockutils [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.393 2 DEBUG nova.network.neutron [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Refreshing network info cache for port 92e6817e-732a-4e42-973e-2d26e62163f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.397 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Start _get_guest_xml network_info=[{"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.402 2 WARNING nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.408 2 DEBUG nova.virt.libvirt.host [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.409 2 DEBUG nova.virt.libvirt.host [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.412 2 DEBUG nova.virt.libvirt.host [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.413 2 DEBUG nova.virt.libvirt.host [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.414 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.414 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.415 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.415 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.415 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.416 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.416 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.416 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.417 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.417 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.417 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.418 2 DEBUG nova.virt.hardware [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.422 2 DEBUG nova.virt.libvirt.vif [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:27:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-124-44875693',display_name='tempest-broadcast-receiver-124-44875693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-124-44875693',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-p4w6uocs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:27:29Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=63718bc7-c79a-49a4-a0f2-bb47aa50f5b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.422 2 DEBUG nova.network.os_vif_util [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.423 2 DEBUG nova.network.os_vif_util [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.424 2 DEBUG nova.objects.instance [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'pci_devices' on Instance uuid 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.439 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <uuid>63718bc7-c79a-49a4-a0f2-bb47aa50f5b6</uuid>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <name>instance-00000023</name>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:name>tempest-broadcast-receiver-124-44875693</nova:name>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:27:32</nova:creationTime>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:user uuid="843ea0278e174175a6f8e21731c1383e">tempest-BroadcastTestVlanTransparency-538458942-project-member</nova:user>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:project uuid="7e1086961263487db8a3c5190fdf1b2e">tempest-BroadcastTestVlanTransparency-538458942</nova:project>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        <nova:port uuid="92e6817e-732a-4e42-973e-2d26e62163f5">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="serial">63718bc7-c79a-49a4-a0f2-bb47aa50f5b6</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="uuid">63718bc7-c79a-49a4-a0f2-bb47aa50f5b6</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.config"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:6b:88:09"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <target dev="tap92e6817e-73"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/console.log" append="off"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:27:32 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:27:32 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:27:32 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:27:32 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.440 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Preparing to wait for external event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.440 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.441 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.441 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.441 2 DEBUG nova.virt.libvirt.vif [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:27:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-broadcast-receiver-124-44875693',display_name='tempest-broadcast-receiver-124-44875693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-124-44875693',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-p4w6uocs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:27:29Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=63718bc7-c79a-49a4-a0f2-bb47aa50f5b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.442 2 DEBUG nova.network.os_vif_util [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.442 2 DEBUG nova.network.os_vif_util [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.443 2 DEBUG os_vif [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.443 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92e6817e-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap92e6817e-73, col_values=(('external_ids', {'iface-id': '92e6817e-732a-4e42-973e-2d26e62163f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:88:09', 'vm-uuid': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:32 np0005476733 NetworkManager[51699]: <info>  [1759937252.4507] manager: (tap92e6817e-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.457 2 INFO os_vif [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73')#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.538 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.538 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.539 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] No VIF found with MAC fa:16:3e:6b:88:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.539 2 INFO nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Using config drive#033[00m
Oct  8 11:27:32 np0005476733 podman[228967]: 2025-10-08 15:27:32.588894004 +0000 UTC m=+0.078945073 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 11:27:32 np0005476733 podman[228966]: 2025-10-08 15:27:32.615250289 +0000 UTC m=+0.112450155 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.769 2 DEBUG nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.770 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.770 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.771 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.771 2 DEBUG nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.772 2 WARNING nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received unexpected event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.772 2 DEBUG nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.773 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.773 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.774 2 DEBUG oslo_concurrency.lockutils [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.774 2 DEBUG nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.774 2 WARNING nova.compute.manager [req-d7990201-c505-41fd-928c-604d6fa9d338 req-d97a45bf-6c29-46e9-af6b-5e764a4f38c3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received unexpected event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.998 2 DEBUG nova.network.neutron [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updated VIF entry in instance network info cache for port 7ad20ed3-8502-40cd-84e3-773d77da33ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:27:32 np0005476733 nova_compute[192580]: 2025-10-08 15:27:32.999 2 DEBUG nova.network.neutron [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.105 2 INFO nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Creating config drive at /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.config#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.111 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpefp3gi7z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.234 2 DEBUG oslo_concurrency.lockutils [req-37ed5a05-42f0-4e34-8321-1fdefefec87e req-0e0187ed-60cb-4e48-b150-881dc95d6374 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.245 2 DEBUG oslo_concurrency.processutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpefp3gi7z" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:33 np0005476733 kernel: tap92e6817e-73: entered promiscuous mode
Oct  8 11:27:33 np0005476733 NetworkManager[51699]: <info>  [1759937253.3212] manager: (tap92e6817e-73): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:33Z|00285|binding|INFO|Claiming lport 92e6817e-732a-4e42-973e-2d26e62163f5 for this chassis.
Oct  8 11:27:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:33Z|00286|binding|INFO|92e6817e-732a-4e42-973e-2d26e62163f5: Claiming fa:16:3e:6b:88:09 10.100.0.7
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.340 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:88:09 10.100.0.7'], port_security=['fa:16:3e:6b:88:09 10.100.0.7 192.168.111.15/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=92e6817e-732a-4e42-973e-2d26e62163f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.341 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 92e6817e-732a-4e42-973e-2d26e62163f5 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a bound to our chassis#033[00m
Oct  8 11:27:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:33Z|00287|binding|INFO|Setting lport 92e6817e-732a-4e42-973e-2d26e62163f5 ovn-installed in OVS
Oct  8 11:27:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:33Z|00288|binding|INFO|Setting lport 92e6817e-732a-4e42-973e-2d26e62163f5 up in Southbound
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.352 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a77f8cd-4394-4cb0-a8a1-33872549758a#033[00m
Oct  8 11:27:33 np0005476733 systemd-udevd[229026]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:27:33 np0005476733 systemd-machined[152624]: New machine qemu-22-instance-00000023.
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.369 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[61031790-d87a-46c9-9517-c1260b0591bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 NetworkManager[51699]: <info>  [1759937253.3746] device (tap92e6817e-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:27:33 np0005476733 NetworkManager[51699]: <info>  [1759937253.3754] device (tap92e6817e-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:27:33 np0005476733 systemd[1]: Started Virtual Machine qemu-22-instance-00000023.
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.400 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c5962d22-d11a-45c9-af82-31d7c31455e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.403 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8d507085-31ce-42c6-a85d-5191c21f4a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.435 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[15b5f6f4-62a6-4406-aa35-6f1c5126beb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.456 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6103f325-763c-4c9d-8375-8e8c1b3414d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1294, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 7, 'rx_bytes': 1294, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 23320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229040, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.478 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[edde08cc-e0c0-4f82-a7de-11f403ccd211]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389931, 'tstamp': 389931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229042, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389935, 'tstamp': 389935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229042, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.480 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.484 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a77f8cd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.484 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.484 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a77f8cd-40, col_values=(('external_ids', {'iface-id': 'b563ca05-c871-4f0e-9980-177237a3f88d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:33.484 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.780 2 DEBUG nova.compute.manager [req-7dc1bd45-1de9-4751-af30-d1f5e1f9009c req-308448b2-6e55-4b6d-8896-c80009324f87 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.781 2 DEBUG oslo_concurrency.lockutils [req-7dc1bd45-1de9-4751-af30-d1f5e1f9009c req-308448b2-6e55-4b6d-8896-c80009324f87 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.781 2 DEBUG oslo_concurrency.lockutils [req-7dc1bd45-1de9-4751-af30-d1f5e1f9009c req-308448b2-6e55-4b6d-8896-c80009324f87 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.781 2 DEBUG oslo_concurrency.lockutils [req-7dc1bd45-1de9-4751-af30-d1f5e1f9009c req-308448b2-6e55-4b6d-8896-c80009324f87 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.781 2 DEBUG nova.compute.manager [req-7dc1bd45-1de9-4751-af30-d1f5e1f9009c req-308448b2-6e55-4b6d-8896-c80009324f87 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Processing event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:27:33 np0005476733 nova_compute[192580]: 2025-10-08 15:27:33.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.141 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937254.1410036, 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.141 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] VM Started (Lifecycle Event)#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.143 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.151 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.155 2 INFO nova.virt.libvirt.driver [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Instance spawned successfully.#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.155 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.214 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.219 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.223 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.224 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.224 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.224 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.224 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.225 2 DEBUG nova.virt.libvirt.driver [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.255 2 DEBUG nova.network.neutron [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updated VIF entry in instance network info cache for port 92e6817e-732a-4e42-973e-2d26e62163f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.255 2 DEBUG nova.network.neutron [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updating instance_info_cache with network_info: [{"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.267 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.267 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937254.143663, 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.267 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.320 2 DEBUG oslo_concurrency.lockutils [req-1109a55e-7507-4657-b027-d5c6e620adbd req-c7105e00-e070-4906-a3d6-ad1d450c38f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.345 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.348 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937254.1464498, 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.349 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.363 2 INFO nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Took 4.50 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.363 2 DEBUG nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.374 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.377 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.449 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.484 2 INFO nova.compute.manager [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Took 5.19 seconds to build instance.#033[00m
Oct  8 11:27:34 np0005476733 nova_compute[192580]: 2025-10-08 15:27:34.514 2 DEBUG oslo_concurrency.lockutils [None req-9f0afe5c-2f5a-4dff-a4f3-52a9d69e0400 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.320 2 INFO nova.compute.manager [None req-6900018f-3e50-4d60-8e57-0b5126a5d14e 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.327 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.886 2 DEBUG nova.compute.manager [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.887 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.888 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.888 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.889 2 DEBUG nova.compute.manager [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] No waiting events found dispatching network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.889 2 WARNING nova.compute.manager [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received unexpected event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.890 2 DEBUG nova.compute.manager [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-changed-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.890 2 DEBUG nova.compute.manager [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing instance network info cache due to event network-changed-7ad20ed3-8502-40cd-84e3-773d77da33ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.891 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.891 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:27:35 np0005476733 nova_compute[192580]: 2025-10-08 15:27:35.891 2 DEBUG nova.network.neutron [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Refreshing network info cache for port 7ad20ed3-8502-40cd-84e3-773d77da33ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.638 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.640 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.640 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.846 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.941 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:37 np0005476733 nova_compute[192580]: 2025-10-08 15:27:37.942 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.004 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.012 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.072 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.074 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.140 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.150 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.218 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.219 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.283 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.288 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.363 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.364 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.448 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.454 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.491 2 DEBUG nova.network.neutron [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updated VIF entry in instance network info cache for port 7ad20ed3-8502-40cd-84e3-773d77da33ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.492 2 DEBUG nova.network.neutron [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [{"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.517 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.518 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.574 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.582 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.637 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.638 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.663 2 DEBUG oslo_concurrency.lockutils [req-dad0d413-fce7-4125-8cd4-a46d585ae1ba req-1836bc2f-d0a2-4e7f-a6a0-7d3c56f8244d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7f1808f3-5a79-4149-84d1-7bc21eefa497" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.731 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.739 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.823 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.825 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.900 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.910 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.967 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:38 np0005476733 nova_compute[192580]: 2025-10-08 15:27:38.968 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.027 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.257 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.259 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=8244MB free_disk=110.31080627441406GB free_vcpus=0 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.259 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:39 np0005476733 nova_compute[192580]: 2025-10-08 15:27:39.259 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:39Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:16:82:23 192.168.2.168
Oct  8 11:27:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:39Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:16:82:23 192.168.2.168
Oct  8 11:27:40 np0005476733 podman[229122]: 2025-10-08 15:27:40.253115846 +0000 UTC m=+0.075440762 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.672 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 656c0a96-03f3-4a70-baac-01de2a126a91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.673 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 6efc9ea0-184c-46cc-aeb5-e2759e10e398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.673 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.673 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1cbc4434-d89a-483d-a1f2-299190262888 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.673 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.674 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e36dd986-15d5-466e-93d6-dc7b4483c8e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.674 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance cefc7b22-5a31-4d0c-bb25-462153dfc427 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.674 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7f1808f3-5a79-4149-84d1-7bc21eefa497 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.674 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 8 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.675 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=8704MB phys_disk=119GB used_disk=80GB total_vcpus=8 used_vcpus=8 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.871 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.903 2 INFO nova.compute.manager [None req-1ca1c52c-e087-4ee5-9d04-8157987ac83f 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:27:40 np0005476733 nova_compute[192580]: 2025-10-08 15:27:40.909 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:27:41 np0005476733 nova_compute[192580]: 2025-10-08 15:27:41.059 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:41 np0005476733 nova_compute[192580]: 2025-10-08 15:27:41.170 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:27:41 np0005476733 nova_compute[192580]: 2025-10-08 15:27:41.170 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:42 np0005476733 podman[229145]: 2025-10-08 15:27:42.226866203 +0000 UTC m=+0.051155763 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:27:42 np0005476733 podman[229144]: 2025-10-08 15:27:42.242943952 +0000 UTC m=+0.071613960 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.627 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.628 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.628 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.629 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.629 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.630 2 INFO nova.compute.manager [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Terminating instance#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.631 2 DEBUG nova.compute.manager [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:27:42 np0005476733 kernel: tapcae08d04-f9 (unregistering): left promiscuous mode
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.666 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.666 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.666 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.667 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.667 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.668 2 INFO nova.compute.manager [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Terminating instance#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.669 2 DEBUG nova.compute.manager [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:27:42 np0005476733 NetworkManager[51699]: <info>  [1759937262.6696] device (tapcae08d04-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 kernel: tap27016abf-08 (unregistering): left promiscuous mode
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00289|binding|INFO|Releasing lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 from this chassis (sb_readonly=0)
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00290|binding|INFO|Setting lport cae08d04-f9a8-46ee-ba57-0a0db94ae186 down in Southbound
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00291|binding|INFO|Removing iface tapcae08d04-f9 ovn-installed in OVS
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 NetworkManager[51699]: <info>  [1759937262.7020] device (tap27016abf-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00292|binding|INFO|Releasing lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 from this chassis (sb_readonly=1)
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00293|binding|INFO|Removing iface tap27016abf-08 ovn-installed in OVS
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00294|if_status|INFO|Dropped 2 log messages in last 32 seconds (most recently, 32 seconds ago) due to excessive rate
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00295|if_status|INFO|Not setting lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 down as sb is readonly
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct  8 11:27:42 np0005476733 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001e.scope: Consumed 29.265s CPU time.
Oct  8 11:27:42 np0005476733 systemd-machined[152624]: Machine qemu-21-instance-0000001e terminated.
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.757 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:82:23 192.168.2.168'], port_security=['fa:16:3e:16:82:23 192.168.2.168'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:cidrs': '192.168.2.168/24', 'neutron:device_id': 'cefc7b22-5a31-4d0c-bb25-462153dfc427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c022ba9-08a2-40a7-896d-13c1538d7064', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ea236cb-dec7-48d3-a1ef-7ce9f1bd90ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cae08d04-f9a8-46ee-ba57-0a0db94ae186) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.759 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cae08d04-f9a8-46ee-ba57-0a0db94ae186 in datapath 9c022ba9-08a2-40a7-896d-13c1538d7064 unbound from our chassis#033[00m
Oct  8 11:27:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:42Z|00296|binding|INFO|Setting lport 27016abf-08ed-40dc-8da9-bebab3e3a2a3 down in Southbound
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.761 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c022ba9-08a2-40a7-896d-13c1538d7064, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.762 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e57c82-da21-471e-a733-273a0ea9348e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.763 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064 namespace which is not needed anymore#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:42.779 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:38:dd 10.100.0.13'], port_security=['fa:16:3e:fe:38:dd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e36dd986-15d5-466e-93d6-dc7b4483c8e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f2acdb26a5a4269a4b1e407da7722c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89de78c9-f0c2-4dee-bf11-af3dd2c1fe7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebe434a7-5fd3-4a18-92a7-9bb4b2dc9121, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=27016abf-08ed-40dc-8da9-bebab3e3a2a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:42 np0005476733 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000016.scope: Deactivated successfully.
Oct  8 11:27:42 np0005476733 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000016.scope: Consumed 45.182s CPU time.
Oct  8 11:27:42 np0005476733 systemd-machined[152624]: Machine qemu-14-instance-00000016 terminated.
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [NOTICE]   (228751) : haproxy version is 2.8.14-c23fe91
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [NOTICE]   (228751) : path to executable is /usr/sbin/haproxy
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [WARNING]  (228751) : Exiting Master process...
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [WARNING]  (228751) : Exiting Master process...
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [ALERT]    (228751) : Current worker (228754) exited with code 143 (Terminated)
Oct  8 11:27:42 np0005476733 neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064[228708]: [WARNING]  (228751) : All workers exited. Exiting... (0)
Oct  8 11:27:42 np0005476733 systemd[1]: libpod-f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c.scope: Deactivated successfully.
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.916 2 INFO nova.virt.libvirt.driver [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Instance destroyed successfully.#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.917 2 DEBUG nova.objects.instance [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'resources' on Instance uuid cefc7b22-5a31-4d0c-bb25-462153dfc427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:42 np0005476733 podman[229218]: 2025-10-08 15:27:42.919828085 +0000 UTC m=+0.055353225 container died f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.933 2 DEBUG nova.virt.libvirt.vif [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:25:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-di',id=30,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:25:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-qkxis15o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:27:11Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=cefc7b22-5a31-4d0c-bb25-462153dfc427,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.934 2 DEBUG nova.network.os_vif_util [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "address": "fa:16:3e:16:82:23", "network": {"id": "9c022ba9-08a2-40a7-896d-13c1538d7064", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_disabled_enabled_dhcp4", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcae08d04-f9", "ovs_interfaceid": "cae08d04-f9a8-46ee-ba57-0a0db94ae186", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.935 2 DEBUG nova.network.os_vif_util [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.936 2 DEBUG os_vif [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.938 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcae08d04-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:27:42 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c-userdata-shm.mount: Deactivated successfully.
Oct  8 11:27:42 np0005476733 systemd[1]: var-lib-containers-storage-overlay-9db5a2fca3107ec5c1b5c64dbd2fcbe9d16d8a9b35b42ce6742c5f8833258229-merged.mount: Deactivated successfully.
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.952 2 INFO os_vif [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:82:23,bridge_name='br-int',has_traffic_filtering=True,id=cae08d04-f9a8-46ee-ba57-0a0db94ae186,network=Network(9c022ba9-08a2-40a7-896d-13c1538d7064),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapcae08d04-f9')#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.953 2 INFO nova.virt.libvirt.driver [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Deleting instance files /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427_del#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.954 2 INFO nova.virt.libvirt.driver [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Deletion of /var/lib/nova/instances/cefc7b22-5a31-4d0c-bb25-462153dfc427_del complete#033[00m
Oct  8 11:27:42 np0005476733 podman[229218]: 2025-10-08 15:27:42.965693919 +0000 UTC m=+0.101219059 container cleanup f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.967 2 INFO nova.virt.libvirt.driver [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Instance destroyed successfully.#033[00m
Oct  8 11:27:42 np0005476733 nova_compute[192580]: 2025-10-08 15:27:42.967 2 DEBUG nova.objects.instance [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lazy-loading 'resources' on Instance uuid e36dd986-15d5-466e-93d6-dc7b4483c8e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:42 np0005476733 systemd[1]: libpod-conmon-f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c.scope: Deactivated successfully.
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.015 2 DEBUG nova.virt.libvirt.vif [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:23:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',display_name='tempest-test_igmp_snooping_same_network_and_unsubscribe-1863725986',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-same-network-and-unsubscribe-1863725',id=22,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2WFeSg/DGHNB9+nWyQfurOVjPkTxdtZkW0R1GkMWJ7Z/35TtPo56N93IJ9W+ueAP01srElKtm0K/Obvpsxk9Lrs3cBEC1ElilHgpG+1/NKtqmriMYH4DXfeSh+aMoHPg==',key_name='tempest-keypair-test-469695160',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:23:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7f2acdb26a5a4269a4b1e407da7722c3',ramdisk_id='',reservation_id='r-mydo74dz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-178854047',owner_user_name='tempest-MulticastTestIPv4Common-178854047-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:23:54Z,user_data=None,user_id='f03335a379bd4afdbbd7b9cc7cae27e0',uuid=e36dd986-15d5-466e-93d6-dc7b4483c8e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.016 2 DEBUG nova.network.os_vif_util [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converting VIF {"id": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "address": "fa:16:3e:fe:38:dd", "network": {"id": "3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567", "bridge": "br-int", "label": "tempest-test-network--1621974926", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7f2acdb26a5a4269a4b1e407da7722c3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27016abf-08", "ovs_interfaceid": "27016abf-08ed-40dc-8da9-bebab3e3a2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.016 2 DEBUG nova.network.os_vif_util [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.017 2 DEBUG os_vif [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.019 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27016abf-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.024 2 INFO os_vif [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:38:dd,bridge_name='br-int',has_traffic_filtering=True,id=27016abf-08ed-40dc-8da9-bebab3e3a2a3,network=Network(3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27016abf-08')#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.024 2 INFO nova.virt.libvirt.driver [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Deleting instance files /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9_del#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.025 2 INFO nova.virt.libvirt.driver [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Deletion of /var/lib/nova/instances/e36dd986-15d5-466e-93d6-dc7b4483c8e9_del complete#033[00m
Oct  8 11:27:43 np0005476733 podman[229277]: 2025-10-08 15:27:43.037043361 +0000 UTC m=+0.051403161 container remove f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.044 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc7c42c-bb64-4453-8f82-fe381a9f060b]: (4, ('Wed Oct  8 03:27:42 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064 (f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c)\nf983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c\nWed Oct  8 03:27:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064 (f983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c)\nf983de7a5b1b48e3a3bc134620dcf4e95dc35bbbb01ba557f773fe3d9dff1a6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.047 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3d639a5d-cfe9-4d64-be1d-fece4dae5ff0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.048 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c022ba9-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 kernel: tap9c022ba9-00: left promiscuous mode
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.100 2 INFO nova.compute.manager [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.102 2 DEBUG oslo.service.loopingcall [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.102 2 DEBUG nova.compute.manager [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.102 2 DEBUG nova.network.neutron [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.115 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc9beb7-4075-44cb-acbb-a54fece320e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.141 2 INFO nova.compute.manager [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.143 2 DEBUG oslo.service.loopingcall [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.143 2 DEBUG nova.compute.manager [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.144 2 DEBUG nova.network.neutron [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[991b8640-0595-43de-91ea-9d27f3d98f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.148 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5462965c-e5f2-4fc0-8770-8b3298552911]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.168 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[090ef50d-cb75-4dc0-a30a-2193110c3e0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413435, 'reachable_time': 26157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229296, 'error': None, 'target': 'ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 systemd[1]: run-netns-ovnmeta\x2d9c022ba9\x2d08a2\x2d40a7\x2d896d\x2d13c1538d7064.mount: Deactivated successfully.
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.171 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c022ba9-08a2-40a7-896d-13c1538d7064 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.171 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5dc191de-dbe2-4447-a09b-eeb1cf735d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.175 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 27016abf-08ed-40dc-8da9-bebab3e3a2a3 in datapath 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 unbound from our chassis#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.178 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.178 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c71dec3a-85ca-49c1-8f52-3a6baece4ce7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.178 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 namespace which is not needed anymore#033[00m
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [NOTICE]   (226259) : haproxy version is 2.8.14-c23fe91
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [NOTICE]   (226259) : path to executable is /usr/sbin/haproxy
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [WARNING]  (226259) : Exiting Master process...
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [WARNING]  (226259) : Exiting Master process...
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [ALERT]    (226259) : Current worker (226266) exited with code 143 (Terminated)
Oct  8 11:27:43 np0005476733 neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567[226244]: [WARNING]  (226259) : All workers exited. Exiting... (0)
Oct  8 11:27:43 np0005476733 systemd[1]: libpod-3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736.scope: Deactivated successfully.
Oct  8 11:27:43 np0005476733 podman[229314]: 2025-10-08 15:27:43.314610057 +0000 UTC m=+0.048083444 container died 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 11:27:43 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736-userdata-shm.mount: Deactivated successfully.
Oct  8 11:27:43 np0005476733 systemd[1]: var-lib-containers-storage-overlay-927467e59113a6f78629d25bbe2e57b003ee25270ec43dd92f4c214a3581231b-merged.mount: Deactivated successfully.
Oct  8 11:27:43 np0005476733 podman[229314]: 2025-10-08 15:27:43.383055187 +0000 UTC m=+0.116528554 container cleanup 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:27:43 np0005476733 systemd[1]: libpod-conmon-3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736.scope: Deactivated successfully.
Oct  8 11:27:43 np0005476733 podman[229345]: 2025-10-08 15:27:43.480973701 +0000 UTC m=+0.071207359 container remove 3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.487 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6caab05c-e505-4a68-b32b-e4979e211b3b]: (4, ('Wed Oct  8 03:27:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 (3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736)\n3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736\nWed Oct  8 03:27:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 (3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736)\n3ccc8fc7b426526d985baf99fb75a94501775473bd46b4e803a3d2d2ba01b736\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.489 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fa24d644-7922-4e81-9c6c-9c45f1c5914c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.489 2 DEBUG nova.compute.manager [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-unplugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.490 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ec2e14e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.490 2 DEBUG oslo_concurrency.lockutils [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.491 2 DEBUG oslo_concurrency.lockutils [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.491 2 DEBUG oslo_concurrency.lockutils [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.492 2 DEBUG nova.compute.manager [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] No waiting events found dispatching network-vif-unplugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.492 2 DEBUG nova.compute.manager [req-8c3e14fa-066e-4259-b27f-6152190a3f16 req-e540aaa7-2fe8-4b0f-bc73-87c52b7a59e5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-unplugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 kernel: tap3ec2e14e-50: left promiscuous mode
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.516 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c29b95d-8256-4c60-9daa-3d8ada1cc3b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.546 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4671af-0210-46cf-9c56-b983b04b8ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.548 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ebd02d-a8d0-439b-8273-814d052bab81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.564 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a725f5cd-3a2e-437d-9d82-0c28d198e26a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393585, 'reachable_time': 28630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229360, 'error': None, 'target': 'ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.567 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ec2e14e-57e7-4e0a-bf0c-0beb3cfd5567 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.567 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a9dac4-0543-4982-aa57-9e61ea9cd170]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.702 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.703 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.703 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.704 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.705 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.707 2 INFO nova.compute.manager [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Terminating instance#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.709 2 DEBUG nova.compute.manager [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:27:43 np0005476733 kernel: tap36047ed0-01 (unregistering): left promiscuous mode
Oct  8 11:27:43 np0005476733 NetworkManager[51699]: <info>  [1759937263.7468] device (tap36047ed0-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:27:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:43Z|00297|binding|INFO|Releasing lport 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 from this chassis (sb_readonly=0)
Oct  8 11:27:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:43Z|00298|binding|INFO|Setting lport 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 down in Southbound
Oct  8 11:27:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:43Z|00299|binding|INFO|Removing iface tap36047ed0-01 ovn-installed in OVS
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.777 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:d0:1a 10.100.0.6'], port_security=['fa:16:3e:a3:d0:1a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6efc9ea0-184c-46cc-aeb5-e2759e10e398', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=36047ed0-015a-4d5e-8c0a-fc4d965a13b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.778 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 36047ed0-015a-4d5e-8c0a-fc4d965a13b7 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.781 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.795 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ef488d3b-6fbf-4c23-8e5c-3f86b3c25837]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct  8 11:27:43 np0005476733 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Consumed 48.717s CPU time.
Oct  8 11:27:43 np0005476733 systemd-machined[152624]: Machine qemu-16-instance-0000001a terminated.
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.834 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a988f2ed-8ace-45ae-9655-2c805837ceae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.838 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8530ea82-74a1-4e8e-97eb-96f229958aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.881 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b566e53f-efe7-43ec-b245-d5c91bbe6204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.905 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[feea63d4-1267-409d-acf0-e4bd15bd90c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387053, 'reachable_time': 34516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229369, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.929 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a1898452-e868-4315-9595-88d3cd3c2d7f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387066, 'tstamp': 387066}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229370, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 387069, 'tstamp': 387069}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229370, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.931 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:43 np0005476733 systemd[1]: run-netns-ovnmeta\x2d3ec2e14e\x2d57e7\x2d4e0a\x2dbf0c\x2d0beb3cfd5567.mount: Deactivated successfully.
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.942 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.942 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.943 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:43.943 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.972 2 INFO nova.virt.libvirt.driver [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Instance destroyed successfully.#033[00m
Oct  8 11:27:43 np0005476733 nova_compute[192580]: 2025-10-08 15:27:43.973 2 DEBUG nova.objects.instance [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid 6efc9ea0-184c-46cc-aeb5-e2759e10e398 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.067 2 DEBUG nova.virt.libvirt.vif [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-155366011',display_name='tempest-test_multicast_after_idle_timeout-155366011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-155366011',id=26,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:24:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-0ewp8wvp',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:24:51Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6efc9ea0-184c-46cc-aeb5-e2759e10e398,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.068 2 DEBUG nova.network.os_vif_util [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "address": "fa:16:3e:a3:d0:1a", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36047ed0-01", "ovs_interfaceid": "36047ed0-015a-4d5e-8c0a-fc4d965a13b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.068 2 DEBUG nova.network.os_vif_util [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.069 2 DEBUG os_vif [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36047ed0-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.077 2 INFO os_vif [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d0:1a,bridge_name='br-int',has_traffic_filtering=True,id=36047ed0-015a-4d5e-8c0a-fc4d965a13b7,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36047ed0-01')#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.077 2 INFO nova.virt.libvirt.driver [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Deleting instance files /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398_del#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.078 2 INFO nova.virt.libvirt.driver [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Deletion of /var/lib/nova/instances/6efc9ea0-184c-46cc-aeb5-e2759e10e398_del complete#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.171 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.171 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.172 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.274 2 INFO nova.compute.manager [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.274 2 DEBUG oslo.service.loopingcall [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.275 2 DEBUG nova.compute.manager [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.275 2 DEBUG nova.network.neutron [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.504 2 DEBUG nova.compute.manager [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.505 2 DEBUG oslo_concurrency.lockutils [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.505 2 DEBUG oslo_concurrency.lockutils [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.505 2 DEBUG oslo_concurrency.lockutils [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.506 2 DEBUG nova.compute.manager [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.506 2 DEBUG nova.compute.manager [req-b9d67870-6242-4e7e-bbdd-f6df1175e54f req-4a611a74-6bb6-4d96-87cf-e462471ccb36 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-unplugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:27:44 np0005476733 nova_compute[192580]: 2025-10-08 15:27:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.693 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.694 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.695 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.695 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.695 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] No waiting events found dispatching network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.696 2 WARNING nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received unexpected event network-vif-plugged-27016abf-08ed-40dc-8da9-bebab3e3a2a3 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.696 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-unplugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.696 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.697 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.697 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.697 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] No waiting events found dispatching network-vif-unplugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.698 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-unplugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.698 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.698 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.699 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.699 2 DEBUG oslo_concurrency.lockutils [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.699 2 DEBUG nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] No waiting events found dispatching network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.700 2 WARNING nova.compute.manager [req-aeab1a10-acf5-49b6-99b0-bd67a088ae2f req-ffed4256-d4b0-4238-b236-f0c6f28cf19d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received unexpected event network-vif-plugged-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.823 2 DEBUG nova.network.neutron [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:45 np0005476733 nova_compute[192580]: 2025-10-08 15:27:45.995 2 INFO nova.compute.manager [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Took 2.85 seconds to deallocate network for instance.#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.157 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.158 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.186 2 INFO nova.compute.manager [None req-e6bad069-eff3-410e-a817-8c38402b6963 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.354 2 DEBUG nova.compute.provider_tree [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.476 2 DEBUG nova.scheduler.client.report [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.595 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.752 2 DEBUG nova.compute.manager [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.753 2 DEBUG oslo_concurrency.lockutils [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.754 2 DEBUG oslo_concurrency.lockutils [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.754 2 DEBUG oslo_concurrency.lockutils [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.755 2 DEBUG nova.compute.manager [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] No waiting events found dispatching network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.755 2 WARNING nova.compute.manager [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Received unexpected event network-vif-plugged-cae08d04-f9a8-46ee-ba57-0a0db94ae186 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.755 2 DEBUG nova.compute.manager [req-4144fad1-3d43-47c3-af62-cdfaaddeabe8 req-e93c2f13-5afb-436a-adf4-ec3d15f38880 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Received event network-vif-deleted-27016abf-08ed-40dc-8da9-bebab3e3a2a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.866 2 INFO nova.scheduler.client.report [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Deleted allocations for instance e36dd986-15d5-466e-93d6-dc7b4483c8e9#033[00m
Oct  8 11:27:46 np0005476733 nova_compute[192580]: 2025-10-08 15:27:46.960 2 DEBUG nova.network.neutron [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.080 2 INFO nova.compute.manager [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Took 2.80 seconds to deallocate network for instance.#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.133 2 DEBUG nova.network.neutron [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.143 2 DEBUG oslo_concurrency.lockutils [None req-32f5f635-8f28-43a9-928f-779e43c43ac9 f03335a379bd4afdbbd7b9cc7cae27e0 7f2acdb26a5a4269a4b1e407da7722c3 - - default default] Lock "e36dd986-15d5-466e-93d6-dc7b4483c8e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.205 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.205 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.251 2 INFO nova.compute.manager [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Took 4.15 seconds to deallocate network for instance.#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.361 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.368 2 DEBUG nova.compute.provider_tree [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.441 2 DEBUG nova.scheduler.client.report [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.566 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.569 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.618 2 INFO nova.scheduler.client.report [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance 6efc9ea0-184c-46cc-aeb5-e2759e10e398#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.771 2 DEBUG oslo_concurrency.lockutils [None req-f5b191a0-1987-47f4-9a1c-2510105e877f c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6efc9ea0-184c-46cc-aeb5-e2759e10e398" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:47 np0005476733 nova_compute[192580]: 2025-10-08 15:27:47.855 2 DEBUG nova.compute.provider_tree [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.079 2 DEBUG nova.scheduler.client.report [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.134 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.589 2 INFO nova.scheduler.client.report [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Deleted allocations for instance cefc7b22-5a31-4d0c-bb25-462153dfc427#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.992 2 DEBUG oslo_concurrency.lockutils [None req-9d226a53-c00b-44cc-83cb-7f6c3ed655b8 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "cefc7b22-5a31-4d0c-bb25-462153dfc427" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:48 np0005476733 nova_compute[192580]: 2025-10-08 15:27:48.995 2 DEBUG nova.compute.manager [req-c5aec2f7-b75e-4eb5-aef2-9094c7f0dcfa req-85cb672d-eb43-4dc2-ac04-c683f05b67c1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Received event network-vif-deleted-36047ed0-015a-4d5e-8c0a-fc4d965a13b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:49 np0005476733 nova_compute[192580]: 2025-10-08 15:27:49.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:49 np0005476733 nova_compute[192580]: 2025-10-08 15:27:49.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:50 np0005476733 nova_compute[192580]: 2025-10-08 15:27:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:50 np0005476733 nova_compute[192580]: 2025-10-08 15:27:50.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:27:50 np0005476733 nova_compute[192580]: 2025-10-08 15:27:50.654 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:27:51 np0005476733 podman[229397]: 2025-10-08 15:27:51.241752013 +0000 UTC m=+0.067814550 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:27:51 np0005476733 podman[229396]: 2025-10-08 15:27:51.276779013 +0000 UTC m=+0.103282164 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:27:51 np0005476733 nova_compute[192580]: 2025-10-08 15:27:51.490 2 INFO nova.compute.manager [None req-621bf20a-3381-4deb-bd31-f71550b394e3 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:27:51 np0005476733 nova_compute[192580]: 2025-10-08 15:27:51.496 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.370 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.371 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.371 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.371 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.372 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.373 2 INFO nova.compute.manager [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Terminating instance#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.373 2 DEBUG nova.compute.manager [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:27:52 np0005476733 kernel: tap59f58b79-91 (unregistering): left promiscuous mode
Oct  8 11:27:52 np0005476733 NetworkManager[51699]: <info>  [1759937272.4000] device (tap59f58b79-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:27:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:52Z|00300|binding|INFO|Releasing lport 59f58b79-9163-41ba-8e03-7430e5def4ef from this chassis (sb_readonly=0)
Oct  8 11:27:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:52Z|00301|binding|INFO|Setting lport 59f58b79-9163-41ba-8e03-7430e5def4ef down in Southbound
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:52Z|00302|binding|INFO|Removing iface tap59f58b79-91 ovn-installed in OVS
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.429 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:94:83 10.100.0.3'], port_security=['fa:16:3e:5f:94:83 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '656c0a96-03f3-4a70-baac-01de2a126a91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.227'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=59f58b79-9163-41ba-8e03-7430e5def4ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.433 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 59f58b79-9163-41ba-8e03-7430e5def4ef in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.438 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.441 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3967dcfe-2931-4dce-8762-5f2b97558ad3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.445 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace which is not needed anymore#033[00m
Oct  8 11:27:52 np0005476733 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct  8 11:27:52 np0005476733 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000013.scope: Consumed 46.181s CPU time.
Oct  8 11:27:52 np0005476733 systemd-machined[152624]: Machine qemu-12-instance-00000013 terminated.
Oct  8 11:27:52 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [NOTICE]   (225138) : haproxy version is 2.8.14-c23fe91
Oct  8 11:27:52 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [NOTICE]   (225138) : path to executable is /usr/sbin/haproxy
Oct  8 11:27:52 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [WARNING]  (225138) : Exiting Master process...
Oct  8 11:27:52 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [ALERT]    (225138) : Current worker (225140) exited with code 143 (Terminated)
Oct  8 11:27:52 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[225134]: [WARNING]  (225138) : All workers exited. Exiting... (0)
Oct  8 11:27:52 np0005476733 systemd[1]: libpod-18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a.scope: Deactivated successfully.
Oct  8 11:27:52 np0005476733 podman[229462]: 2025-10-08 15:27:52.587347351 +0000 UTC m=+0.047943421 container died 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:27:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a-userdata-shm.mount: Deactivated successfully.
Oct  8 11:27:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay-3554fa24416693bb2ea6b587fc627c333ee2507e5834ba877ff5ff0eee24ba67-merged.mount: Deactivated successfully.
Oct  8 11:27:52 np0005476733 podman[229462]: 2025-10-08 15:27:52.64474827 +0000 UTC m=+0.105344340 container cleanup 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.665 2 INFO nova.virt.libvirt.driver [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Instance destroyed successfully.#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.666 2 DEBUG nova.objects.instance [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid 656c0a96-03f3-4a70-baac-01de2a126a91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:27:52 np0005476733 systemd[1]: libpod-conmon-18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a.scope: Deactivated successfully.
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.718 2 DEBUG nova.virt.libvirt.vif [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:22:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_multicast_after_idle_timeout-135618235',display_name='tempest-test_multicast_after_idle_timeout-135618235',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-after-idle-timeout-135618235',id=19,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:22:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-7ui3kw3n',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:22:48Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=656c0a96-03f3-4a70-baac-01de2a126a91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.718 2 DEBUG nova.network.os_vif_util [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "59f58b79-9163-41ba-8e03-7430e5def4ef", "address": "fa:16:3e:5f:94:83", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59f58b79-91", "ovs_interfaceid": "59f58b79-9163-41ba-8e03-7430e5def4ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.719 2 DEBUG nova.network.os_vif_util [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:27:52 np0005476733 podman[229507]: 2025-10-08 15:27:52.719667815 +0000 UTC m=+0.047808697 container remove 18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.720 2 DEBUG os_vif [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.722 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59f58b79-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.730 2 INFO os_vif [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:94:83,bridge_name='br-int',has_traffic_filtering=True,id=59f58b79-9163-41ba-8e03-7430e5def4ef,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59f58b79-91')#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.731 2 INFO nova.virt.libvirt.driver [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Deleting instance files /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91_del#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.732 2 INFO nova.virt.libvirt.driver [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Deletion of /var/lib/nova/instances/656c0a96-03f3-4a70-baac-01de2a126a91_del complete#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.731 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[52118497-acdc-4f10-8980-a697ead4db98]: (4, ('Wed Oct  8 03:27:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a)\n18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a\nWed Oct  8 03:27:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a)\n18df9ece4581dd634a654a8be88a002996a285bc52011554f76c8c3f51e1dd3a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.733 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3c2504-14fc-4e1f-be62-30b302d806b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.734 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 kernel: tap30cdfb1e-70: left promiscuous mode
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.754 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[28d22ac4-f373-4f32-936f-594713b24c97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.783 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[451d8b02-b4b0-406e-8e69-5badcf8d1794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.785 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[163b30f0-e09f-4d0b-96ea-6070957793d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.801 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1dc108-6741-49d3-9298-eec6cfcea9a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 387043, 'reachable_time': 25849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229522, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 systemd[1]: run-netns-ovnmeta\x2d30cdfb1e\x2d750a\x2d4d0e\x2d9e9c\x2d321b06b371b9.mount: Deactivated successfully.
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.808 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:27:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:52.808 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[e721c1a9-1e37-4fb7-b864-096cd7bd5840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.832 2 INFO nova.compute.manager [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.833 2 DEBUG oslo.service.loopingcall [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.833 2 DEBUG nova.compute.manager [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:27:52 np0005476733 nova_compute[192580]: 2025-10-08 15:27:52.833 2 DEBUG nova.network.neutron [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.331 2 DEBUG nova.compute.manager [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-unplugged-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.332 2 DEBUG oslo_concurrency.lockutils [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.332 2 DEBUG oslo_concurrency.lockutils [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.333 2 DEBUG oslo_concurrency.lockutils [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.333 2 DEBUG nova.compute.manager [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] No waiting events found dispatching network-vif-unplugged-59f58b79-9163-41ba-8e03-7430e5def4ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.333 2 DEBUG nova.compute.manager [req-908069f6-d26a-49f7-95e7-afb3ef795cbe req-23f893c8-833e-4a2e-812d-381a4074cf85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-unplugged-59f58b79-9163-41ba-8e03-7430e5def4ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:27:53 np0005476733 nova_compute[192580]: 2025-10-08 15:27:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:27:54 np0005476733 nova_compute[192580]: 2025-10-08 15:27:54.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:54 np0005476733 nova_compute[192580]: 2025-10-08 15:27:54.957 2 DEBUG nova.network.neutron [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.095 2 INFO nova.compute.manager [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Took 2.26 seconds to deallocate network for instance.#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.308 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.309 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.521 2 DEBUG nova.compute.provider_tree [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.621 2 DEBUG nova.scheduler.client.report [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.753 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:55 np0005476733 nova_compute[192580]: 2025-10-08 15:27:55.862 2 INFO nova.scheduler.client.report [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance 656c0a96-03f3-4a70-baac-01de2a126a91#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.046 2 DEBUG nova.compute.manager [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.046 2 DEBUG oslo_concurrency.lockutils [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.047 2 DEBUG oslo_concurrency.lockutils [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.047 2 DEBUG oslo_concurrency.lockutils [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.048 2 DEBUG nova.compute.manager [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] No waiting events found dispatching network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.049 2 WARNING nova.compute.manager [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received unexpected event network-vif-plugged-59f58b79-9163-41ba-8e03-7430e5def4ef for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.049 2 DEBUG nova.compute.manager [req-01fb7808-9261-483c-be3d-bdc1d9f78312 req-b4bb2e9e-73de-41d7-abf9-046b8c4de201 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Received event network-vif-deleted-59f58b79-9163-41ba-8e03-7430e5def4ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.059 2 DEBUG oslo_concurrency.lockutils [None req-7f69236d-d8c8-42bd-a64f-68b24ad25d9c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "656c0a96-03f3-4a70-baac-01de2a126a91" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.960 2 INFO nova.compute.manager [None req-af478ec7-0acb-42a3-95f6-fbad01412069 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:27:56 np0005476733 nova_compute[192580]: 2025-10-08 15:27:56.967 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.914 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937262.9122503, cefc7b22-5a31-4d0c-bb25-462153dfc427 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.914 2 INFO nova.compute.manager [-] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.946 2 DEBUG nova.compute.manager [None req-adc5ba1a-cdc2-40a0-ad0a-b8b698a7d345 - - - - - -] [instance: cefc7b22-5a31-4d0c-bb25-462153dfc427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.966 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937262.9654737, e36dd986-15d5-466e-93d6-dc7b4483c8e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.967 2 INFO nova.compute.manager [-] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:27:57 np0005476733 nova_compute[192580]: 2025-10-08 15:27:57.986 2 DEBUG nova.compute.manager [None req-0c89a443-2b02-4adc-b96d-651a0e44fd24 - - - - - -] [instance: e36dd986-15d5-466e-93d6-dc7b4483c8e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:58.289 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:27:58 np0005476733 nova_compute[192580]: 2025-10-08 15:27:58.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:27:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:27:58.291 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:27:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:58Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:88:09 10.100.0.7
Oct  8 11:27:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:27:58Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:88:09 10.100.0.7
Oct  8 11:27:58 np0005476733 nova_compute[192580]: 2025-10-08 15:27:58.971 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937263.9693007, 6efc9ea0-184c-46cc-aeb5-e2759e10e398 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:27:58 np0005476733 nova_compute[192580]: 2025-10-08 15:27:58.971 2 INFO nova.compute.manager [-] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:27:59 np0005476733 nova_compute[192580]: 2025-10-08 15:27:59.010 2 DEBUG nova.compute.manager [None req-54da6894-d86d-43d5-a88d-1d72b505c64b - - - - - -] [instance: 6efc9ea0-184c-46cc-aeb5-e2759e10e398] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:27:59 np0005476733 nova_compute[192580]: 2025-10-08 15:27:59.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:00 np0005476733 podman[229523]: 2025-10-08 15:28:00.24358149 +0000 UTC m=+0.064898208 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:28:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:00Z|00303|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:28:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:00Z|00304|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:28:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:00Z|00305|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:28:00 np0005476733 nova_compute[192580]: 2025-10-08 15:28:00.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:02 np0005476733 nova_compute[192580]: 2025-10-08 15:28:02.212 2 INFO nova.compute.manager [None req-d7b9eade-5ee8-4cfe-92b0-8bf24d47c33f 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:28:02 np0005476733 nova_compute[192580]: 2025-10-08 15:28:02.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:03 np0005476733 podman[229543]: 2025-10-08 15:28:03.250624346 +0000 UTC m=+0.074051998 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:28:03 np0005476733 podman[229542]: 2025-10-08 15:28:03.274541594 +0000 UTC m=+0.102883631 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:28:03 np0005476733 nova_compute[192580]: 2025-10-08 15:28:03.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:04 np0005476733 nova_compute[192580]: 2025-10-08 15:28:04.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:04 np0005476733 systemd-logind[827]: New session 37 of user zuul.
Oct  8 11:28:04 np0005476733 systemd[1]: Started Session 37 of User zuul.
Oct  8 11:28:05 np0005476733 systemd[1]: session-37.scope: Deactivated successfully.
Oct  8 11:28:05 np0005476733 systemd-logind[827]: Session 37 logged out. Waiting for processes to exit.
Oct  8 11:28:05 np0005476733 systemd-logind[827]: Removed session 37.
Oct  8 11:28:06 np0005476733 nova_compute[192580]: 2025-10-08 15:28:06.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:06.294 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.612 2 INFO nova.compute.manager [None req-b4ef1201-7512-44c8-a046-d6ada6b59cc6 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.617 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.620 2 INFO nova.virt.libvirt.driver [None req-b4ef1201-7512-44c8-a046-d6ada6b59cc6 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Truncated console log returned, 2983 bytes ignored#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.660 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937272.6592147, 656c0a96-03f3-4a70-baac-01de2a126a91 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.661 2 INFO nova.compute.manager [-] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.702 2 DEBUG nova.compute.manager [None req-9b34ca14-3a82-4971-90c5-63a858e85b88 - - - - - -] [instance: 656c0a96-03f3-4a70-baac-01de2a126a91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:28:07 np0005476733 nova_compute[192580]: 2025-10-08 15:28:07.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:09 np0005476733 nova_compute[192580]: 2025-10-08 15:28:09.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:11 np0005476733 podman[229635]: 2025-10-08 15:28:11.234033755 +0000 UTC m=+0.061127648 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Oct  8 11:28:12 np0005476733 nova_compute[192580]: 2025-10-08 15:28:12.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:13 np0005476733 nova_compute[192580]: 2025-10-08 15:28:13.070 2 INFO nova.compute.manager [None req-6050334d-6a28-4897-9678-2c6ef1d567b5 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Get console output#033[00m
Oct  8 11:28:13 np0005476733 nova_compute[192580]: 2025-10-08 15:28:13.077 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:13 np0005476733 nova_compute[192580]: 2025-10-08 15:28:13.080 2 INFO nova.virt.libvirt.driver [None req-6050334d-6a28-4897-9678-2c6ef1d567b5 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Truncated console log returned, 3185 bytes ignored#033[00m
Oct  8 11:28:13 np0005476733 podman[229657]: 2025-10-08 15:28:13.227817057 +0000 UTC m=+0.051846345 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:28:13 np0005476733 podman[229656]: 2025-10-08 15:28:13.23833613 +0000 UTC m=+0.059136976 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:28:14 np0005476733 nova_compute[192580]: 2025-10-08 15:28:14.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.702 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.703 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.779 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.868 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.869 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.878 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:28:17 np0005476733 nova_compute[192580]: 2025-10-08 15:28:17.878 2 INFO nova.compute.claims [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.153 2 DEBUG nova.compute.provider_tree [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.172 2 DEBUG nova.scheduler.client.report [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.204 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.205 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.283 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.284 2 DEBUG nova.network.neutron [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.318 2 INFO nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.349 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.493 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.494 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.495 2 INFO nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Creating image(s)#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.495 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.495 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.496 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.510 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.561 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.562 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.563 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.578 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.639 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.640 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.670 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk 10737418240" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.671 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.672 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.744 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.747 2 DEBUG nova.objects.instance [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'migration_context' on Instance uuid 020f3abc-b9cd-43d6-81f9-4464a8d20207 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.783 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.784 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Ensure instance console log exists: /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.784 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.785 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.785 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:18 np0005476733 nova_compute[192580]: 2025-10-08 15:28:18.811 2 DEBUG nova.policy [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.037 2 DEBUG nova.compute.manager [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-changed-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.037 2 DEBUG nova.compute.manager [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Refreshing instance network info cache due to event network-changed-92e6817e-732a-4e42-973e-2d26e62163f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.038 2 DEBUG oslo_concurrency.lockutils [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.039 2 DEBUG oslo_concurrency.lockutils [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.039 2 DEBUG nova.network.neutron [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Refreshing network info cache for port 92e6817e-732a-4e42-973e-2d26e62163f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:19 np0005476733 nova_compute[192580]: 2025-10-08 15:28:19.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:20Z|00306|pinctrl|WARN|Dropped 7015 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 11:28:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:20Z|00307|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.157 2 DEBUG nova.network.neutron [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Successfully updated port: 902f5462-63af-4928-a517-b67d158bf2c2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.180 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.180 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquired lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.181 2 DEBUG nova.network.neutron [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:28:22 np0005476733 podman[229715]: 2025-10-08 15:28:22.235950843 +0000 UTC m=+0.059834378 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid)
Oct  8 11:28:22 np0005476733 podman[229716]: 2025-10-08 15:28:22.236389077 +0000 UTC m=+0.054555201 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.252 2 DEBUG nova.network.neutron [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updated VIF entry in instance network info cache for port 92e6817e-732a-4e42-973e-2d26e62163f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.252 2 DEBUG nova.network.neutron [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updating instance_info_cache with network_info: [{"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.288 2 DEBUG oslo_concurrency.lockutils [req-9b3cda80-0ff3-4dff-a3c6-982bdb1f8ac8 req-80ed3e5d-6e9f-49a5-9cc7-4eabd8bb5d52 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.438 2 DEBUG nova.network.neutron [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:28:22 np0005476733 nova_compute[192580]: 2025-10-08 15:28:22.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:23 np0005476733 nova_compute[192580]: 2025-10-08 15:28:23.416 2 DEBUG nova.compute.manager [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-changed-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:28:23 np0005476733 nova_compute[192580]: 2025-10-08 15:28:23.416 2 DEBUG nova.compute.manager [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Refreshing instance network info cache due to event network-changed-902f5462-63af-4928-a517-b67d158bf2c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:28:23 np0005476733 nova_compute[192580]: 2025-10-08 15:28:23.417 2 DEBUG oslo_concurrency.lockutils [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.776 2 DEBUG nova.network.neutron [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updating instance_info_cache with network_info: [{"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.819 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Releasing lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.820 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Instance network_info: |[{"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.820 2 DEBUG oslo_concurrency.lockutils [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.820 2 DEBUG nova.network.neutron [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Refreshing network info cache for port 902f5462-63af-4928-a517-b67d158bf2c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.823 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Start _get_guest_xml network_info=[{"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 
'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.829 2 WARNING nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.836 2 DEBUG nova.virt.libvirt.host [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.837 2 DEBUG nova.virt.libvirt.host [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.841 2 DEBUG nova.virt.libvirt.host [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.841 2 DEBUG nova.virt.libvirt.host [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.842 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.842 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.843 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.843 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.843 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.844 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.844 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.844 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.845 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.845 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.845 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.846 2 DEBUG nova.virt.hardware [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.850 2 DEBUG nova.virt.libvirt.vif [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=37,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-8k2l0w6s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:28:18Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=020f3abc-b9cd-43d6-81f9-4464a8d20207,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": 
{"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.850 2 DEBUG nova.network.os_vif_util [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.851 2 DEBUG nova.network.os_vif_util [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.852 2 DEBUG nova.objects.instance [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'pci_devices' on Instance uuid 020f3abc-b9cd-43d6-81f9-4464a8d20207 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.869 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <uuid>020f3abc-b9cd-43d6-81f9-4464a8d20207</uuid>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <name>instance-00000025</name>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:name>tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful</nova:name>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:28:24</nova:creationTime>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:user uuid="048380879c82439f920961e33c8fc34c">tempest-ExtraDhcpOptionsTest-522093769-project-member</nova:user>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:project uuid="93e68db931464f0282500c84d398d8af">tempest-ExtraDhcpOptionsTest-522093769</nova:project>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        <nova:port uuid="902f5462-63af-4928-a517-b67d158bf2c2">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.3.171" ipVersion="4"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001:3::6b" ipVersion="6"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="serial">020f3abc-b9cd-43d6-81f9-4464a8d20207</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="uuid">020f3abc-b9cd-43d6-81f9-4464a8d20207</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.config"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:97:62:62"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <target dev="tap902f5462-63"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/console.log" append="off"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:28:24 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:28:24 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:28:24 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:28:24 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.871 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Preparing to wait for external event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.871 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.872 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.872 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.873 2 DEBUG nova.virt.libvirt.vif [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=37,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-8k2l0w6s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:28:18Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=020f3abc-b9cd-43d6-81f9-4464a8d20207,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], 
"gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.874 2 DEBUG nova.network.os_vif_util [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.875 2 DEBUG nova.network.os_vif_util [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.875 2 DEBUG os_vif [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.877 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.878 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.880 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap902f5462-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap902f5462-63, col_values=(('external_ids', {'iface-id': '902f5462-63af-4928-a517-b67d158bf2c2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:62:62', 'vm-uuid': '020f3abc-b9cd-43d6-81f9-4464a8d20207'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:24 np0005476733 NetworkManager[51699]: <info>  [1759937304.8843] manager: (tap902f5462-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.891 2 INFO os_vif [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63')#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.989 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.990 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.990 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No VIF found with MAC fa:16:3e:97:62:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:28:24 np0005476733 nova_compute[192580]: 2025-10-08 15:28:24.991 2 INFO nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Using config drive#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.439 2 INFO nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Creating config drive at /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.config#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.444 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ru356p1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.574 2 DEBUG oslo_concurrency.processutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ru356p1" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:25 np0005476733 kernel: tap902f5462-63: entered promiscuous mode
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.6582] manager: (tap902f5462-63): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Oct  8 11:28:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:25Z|00308|binding|INFO|Claiming lport 902f5462-63af-4928-a517-b67d158bf2c2 for this chassis.
Oct  8 11:28:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:25Z|00309|binding|INFO|902f5462-63af-4928-a517-b67d158bf2c2: Claiming fa:16:3e:97:62:62 192.168.3.171 2001:3::6b
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.673 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:62:62 192.168.3.171 2001:3::6b'], port_security=['fa:16:3e:97:62:62 192.168.3.171 2001:3::6b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'neutron:cidrs': '192.168.3.171/24 2001:3::6b/64', 'neutron:device_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c4aa983-f923-4017-9479-47738c5b827f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7326b02d-cc3d-4e8f-b8b4-4ee71b4d8a00, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=902f5462-63af-4928-a517-b67d158bf2c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.675 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 902f5462-63af-4928-a517-b67d158bf2c2 in datapath 6c4aa983-f923-4017-9479-47738c5b827f bound to our chassis#033[00m
Oct  8 11:28:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:25Z|00310|binding|INFO|Setting lport 902f5462-63af-4928-a517-b67d158bf2c2 ovn-installed in OVS
Oct  8 11:28:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:25Z|00311|binding|INFO|Setting lport 902f5462-63af-4928-a517-b67d158bf2c2 up in Southbound
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.679 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c4aa983-f923-4017-9479-47738c5b827f#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.690 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4557ecd4-7a5a-4157-a645-7a225e2368bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.691 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c4aa983-f1 in ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.693 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c4aa983-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.693 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bab9f367-ae11-42e2-bc20-3bbf8300ad11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.694 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[90a3c27b-82ed-430a-b15d-6c89a0a20aa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 systemd-udevd[229814]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.708 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[1131b2b3-d031-402d-90f1-2a5bce1f9573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 systemd-machined[152624]: New machine qemu-23-instance-00000025.
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.7155] device (tap902f5462-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.7163] device (tap902f5462-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:28:25 np0005476733 systemd[1]: Started Virtual Machine qemu-23-instance-00000025.
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.733 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[830021a9-e8d9-4962-84ea-6134d63ccf3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.764 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec05694-0239-4ebb-bdd5-cd9b3f9bc893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 systemd-udevd[229819]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.770 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[79dca5e7-999a-4a88-ab92-987464558880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.7718] manager: (tap6c4aa983-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.806 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[66ba0b0c-ca9a-42ff-9126-743a9ef79e9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.809 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc4f00a-70f7-4b66-9c62-ca02d1b5db3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.8270] device (tap6c4aa983-f0): carrier: link connected
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.831 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f91f6017-7e8a-4ee0-8f44-d9772a48603e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.846 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4fd51f-a938-4d80-8ba0-6c7c4f31d648]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c4aa983-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:45:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420949, 'reachable_time': 36348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229847, 'error': None, 'target': 'ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.863 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3fb5bd-8903-474b-b0c7-6f2d6865c8fd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:4504'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420949, 'tstamp': 420949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229848, 'error': None, 'target': 'ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.882 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1fbafb-bcee-4cd3-9079-031078e5c59e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c4aa983-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:45:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420949, 'reachable_time': 36348, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229849, 'error': None, 'target': 'ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.913 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[72c813b4-db38-43e8-bd8e-d350b7b6714f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.972 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[05e62edf-32e0-4001-a5e5-046e70dd5abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.974 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c4aa983-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.974 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.975 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c4aa983-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 NetworkManager[51699]: <info>  [1759937305.9774] manager: (tap6c4aa983-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Oct  8 11:28:25 np0005476733 kernel: tap6c4aa983-f0: entered promiscuous mode
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.981 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c4aa983-f0, col_values=(('external_ids', {'iface-id': 'c8b535f4-c584-42f9-a77f-2ed6ee5f2aaa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:25 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:25Z|00312|binding|INFO|Releasing lport c8b535f4-c584-42f9-a77f-2ed6ee5f2aaa from this chassis (sb_readonly=0)
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:25.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:25.999 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c4aa983-f923-4017-9479-47738c5b827f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c4aa983-f923-4017-9479-47738c5b827f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.001 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[924ce49d-0a04-4808-8d54-b02f9ee7c568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.002 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-6c4aa983-f923-4017-9479-47738c5b827f
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/6c4aa983-f923-4017-9479-47738c5b827f.pid.haproxy
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 6c4aa983-f923-4017-9479-47738c5b827f
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.002 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f', 'env', 'PROCESS_TAG=haproxy-6c4aa983-f923-4017-9479-47738c5b827f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c4aa983-f923-4017-9479-47738c5b827f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.310 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.311 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:26.312 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.384 2 DEBUG nova.compute.manager [req-162e2918-6f57-41b0-8f49-3e073cf6911d req-fcf8f5e8-2afc-432c-a0c7-583effebdb60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.385 2 DEBUG oslo_concurrency.lockutils [req-162e2918-6f57-41b0-8f49-3e073cf6911d req-fcf8f5e8-2afc-432c-a0c7-583effebdb60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.386 2 DEBUG oslo_concurrency.lockutils [req-162e2918-6f57-41b0-8f49-3e073cf6911d req-fcf8f5e8-2afc-432c-a0c7-583effebdb60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.386 2 DEBUG oslo_concurrency.lockutils [req-162e2918-6f57-41b0-8f49-3e073cf6911d req-fcf8f5e8-2afc-432c-a0c7-583effebdb60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.387 2 DEBUG nova.compute.manager [req-162e2918-6f57-41b0-8f49-3e073cf6911d req-fcf8f5e8-2afc-432c-a0c7-583effebdb60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Processing event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:28:26 np0005476733 podman[229890]: 2025-10-08 15:28:26.393750561 +0000 UTC m=+0.058955180 container create a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:28:26 np0005476733 systemd[1]: Started libpod-conmon-a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1.scope.
Oct  8 11:28:26 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:28:26 np0005476733 podman[229890]: 2025-10-08 15:28:26.361126376 +0000 UTC m=+0.026331035 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:28:26 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f02d9f893e26da7b5f0788e1e2f8a4f3fb154509e7fa16576e38dcb809cb94d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:28:26 np0005476733 podman[229890]: 2025-10-08 15:28:26.473585801 +0000 UTC m=+0.138790420 container init a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 11:28:26 np0005476733 podman[229890]: 2025-10-08 15:28:26.481344867 +0000 UTC m=+0.146549476 container start a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.481 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937306.4813805, 020f3abc-b9cd-43d6-81f9-4464a8d20207 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.482 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] VM Started (Lifecycle Event)#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.486 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.490 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.494 2 INFO nova.virt.libvirt.driver [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Instance spawned successfully.#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.494 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:28:26 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [NOTICE]   (229909) : New worker (229911) forked
Oct  8 11:28:26 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [NOTICE]   (229909) : Loading success.
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.509 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.511 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.527 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.527 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.528 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.528 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.529 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.529 2 DEBUG nova.virt.libvirt.driver [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.559 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.560 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937306.4823575, 020f3abc-b9cd-43d6-81f9-4464a8d20207 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.560 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.612 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.618 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937306.4924755, 020f3abc-b9cd-43d6-81f9-4464a8d20207 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.619 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.665 2 INFO nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Took 8.17 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.666 2 DEBUG nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.676 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.680 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.788 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.847 2 INFO nova.compute.manager [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Took 9.01 seconds to build instance.#033[00m
Oct  8 11:28:26 np0005476733 nova_compute[192580]: 2025-10-08 15:28:26.873 2 DEBUG oslo_concurrency.lockutils [None req-19f16208-d3db-48ab-8f5d-ab39b4f7057e 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:27 np0005476733 nova_compute[192580]: 2025-10-08 15:28:27.049 2 DEBUG nova.network.neutron [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updated VIF entry in instance network info cache for port 902f5462-63af-4928-a517-b67d158bf2c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:28:27 np0005476733 nova_compute[192580]: 2025-10-08 15:28:27.050 2 DEBUG nova.network.neutron [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updating instance_info_cache with network_info: [{"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:28:27 np0005476733 nova_compute[192580]: 2025-10-08 15:28:27.072 2 DEBUG oslo_concurrency.lockutils [req-04830d6d-ed21-4c3a-a441-0c5b2399e4c9 req-03c56003-a3a4-4ec9-aa8b-befd507e863c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.502 2 DEBUG nova.compute.manager [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.502 2 DEBUG oslo_concurrency.lockutils [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.502 2 DEBUG oslo_concurrency.lockutils [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.503 2 DEBUG oslo_concurrency.lockutils [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.503 2 DEBUG nova.compute.manager [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] No waiting events found dispatching network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.503 2 WARNING nova.compute.manager [req-6b436f5b-3ffb-44f2-ba75-f9ca03c92070 req-898e0090-baaf-4066-9f9d-094fe60944c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received unexpected event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.608 2 INFO nova.compute.manager [None req-16aad8d4-47f8-41be-bca8-4b23d1b36b8b 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:28 np0005476733 nova_compute[192580]: 2025-10-08 15:28:28.614 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:29 np0005476733 nova_compute[192580]: 2025-10-08 15:28:29.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:29 np0005476733 nova_compute[192580]: 2025-10-08 15:28:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:30 np0005476733 nova_compute[192580]: 2025-10-08 15:28:30.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:31 np0005476733 podman[229921]: 2025-10-08 15:28:31.249675806 +0000 UTC m=+0.078872451 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  8 11:28:33 np0005476733 nova_compute[192580]: 2025-10-08 15:28:33.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:28:33 np0005476733 nova_compute[192580]: 2025-10-08 15:28:33.777 2 INFO nova.compute.manager [None req-d2cece66-bc2c-4d7f-867f-b7fbcc1de5b9 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output
Oct  8 11:28:33 np0005476733 nova_compute[192580]: 2025-10-08 15:28:33.785 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:28:34 np0005476733 nova_compute[192580]: 2025-10-08 15:28:34.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:28:34 np0005476733 podman[229942]: 2025-10-08 15:28:34.251924 +0000 UTC m=+0.067956385 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 11:28:34 np0005476733 podman[229941]: 2025-10-08 15:28:34.260546374 +0000 UTC m=+0.077646133 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:28:34 np0005476733 nova_compute[192580]: 2025-10-08 15:28:34.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.006 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'name': 'tempest-broadcast-receiver-124-44875693', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000023', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.009 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '93e68db931464f0282500c84d398d8af', 'user_id': '048380879c82439f920961e33c8fc34c', 'hostId': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.012 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'name': 'tempest-test_bw_limit_east_west-350327070', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.015 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'name': 'tempest-broadcast-receiver-123-1908290520', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.018 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1cbc4434-d89a-483d-a1f2-299190262888', 'name': 'tempest-broadcast-sender-124-598361755', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e1086961263487db8a3c5190fdf1b2e', 'user_id': '843ea0278e174175a6f8e21731c1383e', 'hostId': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.022 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 / tap92e6817e-73 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.022 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.026 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 020f3abc-b9cd-43d6-81f9-4464a8d20207 / tap902f5462-63 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.026 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.031 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7f1808f3-5a79-4149-84d1-7bc21eefa497 / tap7ad20ed3-85 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.032 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes.delta volume: 47334809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.032 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.037 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes.delta volume: 14738 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.041 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.bytes.delta volume: 734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0556d82-5728-4ab6-9ed2-c9255233dc47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '74fd7292-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '7161b65828ccf389ae56f151f5278f000bda711940d570d6c005d505b5174239'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 
0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '74fe009a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '08c6648dd3163c0e68ce53488367e1fb2459d5f35629fc5f06b6bde2d4f5ef41'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 47334809, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 
'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '74fee726-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '7aa9a8aa2a7ebf2e7aaedf7c07aa5ae105d8deb9f2c4a62eeb2619774e5e1bab'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '74fefdce-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'bc04b15004c99db1871542dbb00b35c43f6d54d1ec30cb13bb44027dbeaa37fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 14738, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '74ffb250-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': '1ba5c42ac39bc3d30057f06b8a2a8bba1ed39b560ddc55eb9af8d7e04765865d'}, {'source': 'openstack', 
'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 734, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.019016', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '750062e0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '20e62c0778332dad0448e02e5422c783fac9dfa5f8c12f6f98e5d35e6d449d32'}]}, 'timestamp': '2025-10-08 15:28:36.042705', '_unique_id': 'f4fbb1e53586445c99fd5f472ba4ea63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.092 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.bytes volume: 330900992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.092 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.121 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.bytes volume: 93131776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.122 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.149 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.bytes volume: 329442816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.150 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.168 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 331110400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.169 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.191 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.bytes volume: 329394176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.192 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f0baea-af55-43ad-9d4f-630b635a4357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330900992, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75080d4c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'ad179381191a61df995cb9f35a1d9aa50660decc34debe789bcf25c9d8a8e398'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 
'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75081a62-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'f311bf67d4c4302eb506ba8e1f9c7d6476534dd172a927d0ff7f6b77e50bd73f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93131776, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '750c80f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': 'd78dca85845315db7863dd2e4d1bd8c433b70435e5708d584ea4819a78144245'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '750c8fc0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '3b48e619ac090459b99bddec81b5219e3377827a9ece9e841ea8c180ed9ba990'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329442816, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7510c6da-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '26e41a9ec8715d38dfd0811c6a8c955a5a2577cb5f13e81d75889285a6aa532a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk':
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: e': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7510d698-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '5462f0ab344cc948fa27624e9861c69735fd8eda299ab0670acb3994c22320a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331110400, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7513b354-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '781f22b38e931828d573ed30de932c61e36641d30d47e09c29b0787755f7991a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 
'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7513c48e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': 'd1bdedce30f40c0a90beb1af99ee3284124a54cf53139a0813606c679fe9949a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329394176, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 
'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75172bec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': 'b67a7b919478b6137b824ec8668d3fc8ba54a0cc4e06f2bc0159719f18891c8c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.046423', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75173e2a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': 
'005a32b0a8e8d423887e3a075bdc26126df51fd8483d45170249dea183d31637'}]}, 'timestamp': '2025-10-08 15:28:36.192320', '_unique_id': '5d0fecf8dbbf438592b2a723842de6ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.207 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.usage volume: 152764416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.208 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.220 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.220 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.235 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.usage volume: 152436736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.236 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.246 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 161153024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.246 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.257 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.usage volume: 169410560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.044 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.258 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89fd0ffc-28d1-4dee-8600-063e1f934c11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152764416, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7519a2b4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': '02a3389fb5bce4586a198351ed73a406868ed5613298f975bca734de4c5d8d0a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 
'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7519b11e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': '08a9e35c1a99906500a520bdd42817047e0aef50261f3c2b73984cb241c7bf4f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '751b972c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': '7769b9ab549fd67b493cba22b52f27ac84755019e2b0b804439087bdd6c4518c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '751ba424-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': 'd0e2b595e181602b260d655e3213b2b1b0b6abfad4fcd5b45c139cb372ffe59e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 
'counter_unit': 'B', 'counter_volume': 152436736, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '751de568-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': '165a0754a342d761651a4cc5cfd22e85d693d162b1cc8d4196c76bb4bdcb4d05'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': 
{'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state'
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: ef': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '751e0a02-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': 'd91201ae4ff9b8e43ea97d5c4a5289ea4874f1f84b7af8863054f77a2bcbbf67'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 161153024, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '751f89c2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': '10ad3e2229b7a0d3ebaa947fb3c3483c7a0f46cbbe64ef73386d8bf851c6ed4d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 
'843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '751f9566-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': 'f3ce026c599f13aae4e1656d28551ff8574cf61c3e42b1fd208c456baf09ead1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169410560, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752146d6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': 'e5fc28311a4ee31f835cbfde93629b251b353b1e4c3d2894df2d9552551eb34d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.194375', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7521527a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': '47dd618c5e70f1dc09b1c657907149322f8fa2c9531d41943a766bb0a4665f2a'}]}, 'timestamp': '2025-10-08 
15:28:36.258403', '_unique_id': 'b0c6702c149c4684b823c2c5b161875c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.259 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.260 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.260 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.261 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.261 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.261 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.261 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35b1dae1-2f81-43b4-8c2d-1803bb800559', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '7521b5f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '832158fa6ead49799206f70fa95ee95887f65b89570cf9dcbf30515c5a59cc8e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '7521be68-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '6055fe2e86d0440379a5909d5db244dab577f3d4ac31e116c4eef0d46f6085ce'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '7521c728-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'ef8901e1c79ed41ab307b0cfef4ab482e915ea784ac8bd635215aede840fa99d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '7521ceda-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '2ffd2a6a4e292b66c813f992e378c8b7ec274527b9aca20fa949936e28951acc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '7521d70e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'3a59fb01213c5644691da8316ad28034fcc2e27ee5f9a2227a3d94540d7cb126'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: roject_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.260666', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '7521deb6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': 'b81a11baab485ce8dbc2f797ec27cd4d7e0fb94755296983761388eb78b7e1d9'}]}, 'timestamp': '2025-10-08 15:28:36.261954', '_unique_id': 'a077d93b5ef845858fe8cd23e1fb7087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.263 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.263 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.latency volume: 7242175191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.263 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.latency volume: 43670875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.263 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.latency volume: 3846761237 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.264 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.latency volume: 3440619 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.264 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.latency volume: 7277702332 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.264 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.latency volume: 51361935 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.264 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 9548447160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.265 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.latency volume: 62858344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.265 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.latency volume: 9455040627 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.265 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.latency volume: 54327544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7860005-3873-4782-b366-be28956c7501', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7242175191, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75221e44-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'db4bf87c4aeb9186c8bffcffda535324e196c911fb177e36d750473703e75008'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43670875, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': 
'7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7522283a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': '32cca816041b89b8af213857c707c95b14ca05b738e9820368a0dc563e55f271'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3846761237, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7522308c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': 'ff4f6705c56008f6bd50784d1d423389ce038beb3b817f7afd6d18b187355856'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3440619, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75223c44-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '3074b3759ee9ea00417a06602ccadc675fbba757a738c254f79a1b490dbcc021'}, 
{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7277702332, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7522478e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': 'b5805ceb66059ce032eea1e8c520b22b218170b702b5a347b4a0dc16931b12d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51361935, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: ing', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75224f7c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': 'aabc19ea35fcfcf0444224147fd9b38b5553d7b1e8cca4d9cd15362d19150634'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9548447160, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752256f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '8f8f500f22074b056af8caea35d12322e98b006b2fd3b17dbfa307c34e01b265'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 
'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62858344, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75225fbc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '5b295c9a43e0605153845e3d37d3a1e8638c679b2be238bc125925bddca0b80a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9455040627, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7522673c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': '614595abb099661e03b9eb437e048190d8652fee170494cde829b2b6fe8fe7ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54327544, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.263325', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75226e80-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 
'message_signature': '80fac246a48df35a6c89e4d8ec9201e544827436dfa4a5502db730c6e92fcdd1'}]}, 'timestamp': '2025-10-08 15:28:36.265633', '_unique_id': 'ce5e3ab769b4493781f6941feff6e4ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.267 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.267 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.267 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.268 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.268 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.268 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbf9657c-7cce-4374-a488-7da8c78a3b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '7522bdfe-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '10ed87e42c89aa8a5b41e8d22e709a5b232475c807560c56605f863bba6957b1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '7522c70e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '02cead18cf3a5e81bd960945fd7896498891ed92863f9f6dfc74918c9ac173df'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '7522cf88-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '8634054060b0801e8e1e86854d6ec6ce443dffafbcd9f100017398c34c21559c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '7522d848-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'dfc3995f6639f265e2ab038f3508193e1d56baac42ddb48319cc5748ccacea70'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '7522e018-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'f067972708eb436649b890190f865d76e81ca206bfedb0c96da00745bfe378a9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: _name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.267440', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '7522e9e6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '3bdff64f057eb11f3fb1e8674168a276e91c3be1c7dce1edff50e692d6a1a4d9'}]}, 'timestamp': '2025-10-08 15:28:36.268804', '_unique_id': '446b5f7c5b9f4ca7a6fdbbfd0a7ccf44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.270 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.bytes volume: 135861760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.270 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.270 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.270 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.270 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.bytes volume: 136912896 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.271 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.271 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 145012736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.271 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.271 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.bytes volume: 153164800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.272 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbc36144-2a54-431d-b444-291304dc087a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135861760, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752325d2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'b17de4dbbccaff194fd28a5ff90d3d9a616e405ace292d267e07ce076bbcbc63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 
'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75232dca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': '8fb069928451a1b2c9169a6603362012a6a6db63d62fbc139369680d4ad4b3f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752335ea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '88d355ddf6d16fc21c744453345820967c5c52cd2f30870ff3d2cfe420c755be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75233f22-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '45437c5c01b94f842f223fa7d4e102a423e5b4b82b7c2e8a88cc08ebb41e046d'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136912896, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752348c8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '0dc3ab9a4cf81b8c0a40ffebf9a3a9c97b8fe11413269e27b045c1635737af82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephem
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '752350b6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '4fc3b32b0db6ce40af14ee245bdb36a92f9828a3ac1ca41b8b44d4c7e1169884'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 145012736, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '752358cc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': 'e39b2a66284259ad15b59a71e95b4bbc854c20a69f8f73d620c8d55bb5bcbe02'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '752362ae-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '1756df2b0564a61723c8ec74f2bcd4125130a3250c38c60eb44a24c058abfc20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 153164800, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75236c7c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': '2a42e2df895d4a849ea48f01824652758dc7c5fe161a2d8b246bf40b1b2c9797'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.270078', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '752376f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': 
'093cde2d78dd9d2e2061c9bad3f16ac0bea87afdde2d3add8b1171bab96ae711'}]}, 'timestamp': '2025-10-08 15:28:36.272432', '_unique_id': '3d290c7f795a44bcb00ec0a22b8dea9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.273 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.incoming.packets volume: 190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.274 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.262 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.274 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets volume: 14460 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.266 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.274 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.packets volume: 449 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.275 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.packets volume: 243 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.275 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.packets volume: 164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f0ef5c-fbbe-4139-875d-82e02af8cb9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 190, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '7523beb6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': 'f4edc375893decb0bc1ee4fea7bd8386f1ce6366c93c325d4d0836946f40e2f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '7523c96a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '5a8b83d088d5bd128085a511bc454271cd16e5f240e061eae836993330e29d71'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14460, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '7523d3ec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'd5db9928a52900b9ed3c8b0f42f5ec8c37218c6a7d796c8293eb5571df607ded'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 449, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '7523de46-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '829a62f16f01191144fba7c630b97bd8094cbd765484f7a7bfbfd7025e02f4e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 243, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '7523ea80-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'560fe4c0197f99c774f45dfd9b705c420a90fa44c7575a83f2d87f6cf17a0afe'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 164, 'user_id': '843ea0278e174175a6f8e21731c1383e',
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: ource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.273970', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '7523f4d0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': 'e4f2d660fe19153f5ae49d2fb3e687e0266f8c41607871156fe76cadc4a47241'}]}, 'timestamp': '2025-10-08 15:28:36.275664', '_unique_id': '28543f7c7e214d51a02936e70a1f6309'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.277 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.incoming.bytes volume: 32657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.277 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.277 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes volume: 47334919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.277 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.incoming.bytes volume: 79337 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.277 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.incoming.bytes volume: 40169 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.incoming.bytes volume: 25925 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3babface-8a55-4418-9430-90bc0556b2f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32657, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '752435f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': 'fdaafa234f89bcdb72dd0a318e998762f1cf8ec8b144c0debe4a8627f501a5c6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 110, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '75243e40-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': 'b1b53da091fb1698b1a1faf3f3f9946f8dbc2df84d307369fb419a218001e31d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 47334919, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '75244660-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '6442d6222e1e4508a74fa59ac6d0ba6d423affb781c081d726e9e02e5228272c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 79337, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '75244ea8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '36089a96344f2f99f96f42358cc212f1dddd6652ed044aacac3fc5a3a1c5f9d1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 40169, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '75245682-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'bf3036f0abe57e8a721979385fbd6ddd8ad5a0fa8ab74621f1bbf5cc7d0d5a5b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25925, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: -1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.277055', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '75245f92-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '52cf151ae57cc1341fa459ace06fc2c7a9e1fc29da8f2b30c9676d2e259ccee9'}]}, 'timestamp': '2025-10-08 15:28:36.278360', '_unique_id': '47350c28f0b44f32bd5d5b6740e3749b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.279 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.269 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.273 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.310 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/cpu volume: 42380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.276 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.278 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.332 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/cpu volume: 9450000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.345 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/cpu volume: 45100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.360 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/cpu volume: 46220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.373 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/cpu volume: 45280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3bb27a7-2750-46c6-b81d-bfb2b552810d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42380000000, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'timestamp': '2025-10-08T15:28:36.279686', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '7529667c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.033820288, 'message_signature': 'ee3e9326788dd13ec3ad6a6fbe97d1212543bde21d777fba10d109c5136a142c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9450000000, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 
'020f3abc-b9cd-43d6-81f9-4464a8d20207', 'timestamp': '2025-10-08T15:28:36.279686', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '752caee0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.055375532, 'message_signature': 'dfcbc3f6701d28772e90cfcf310ade9f42cf0897b18b555721774c7a4e3371e6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45100000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'timestamp': '2025-10-08T15:28:36.279686', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '752eb0a0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.068514858, 'message_signature': '81248a7e0850899449f008e531428bbf983ea8e50af0f7d9d8984472f2a5aebe'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46220000000, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:28:36.279686', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '7531058a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.083641198, 'message_signature': '6d9a0e512884bd5e3ff9f56470b7e345db1972e215152682a5c1fae1043a560f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45280000000, 'user_id': 
'843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'timestamp': '2025-10-08T15:28:36.279686', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '7533013c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.096853786, 'message_signature': '6a7435e07d2dd9a85574c9f924c5f460f41b81233d038654b23b2ad7ea25da71'}]}, 'timestamp': '2025-10-08 15:28:36.374277', '_unique_id': 'b008e676579143e0b8abe6e3c8b22263'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.376 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.376 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.outgoing.bytes volume: 45931 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.376 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.376 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes volume: 96532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.377 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes volume: 124833 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.377 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes volume: 58141 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.377 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.bytes volume: 39681 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce56d4a3-43a0-42af-b08b-3538b2663096', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 45931, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '75335a06-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': 'b58f3d906c35677fff36a581870ccd27b2a2be5605f29c5509dd346edbcb4ead'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '75336550-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '366073666b5d47c8c84e0ee0c11d464f59e57c9c70a3c5e62ffa7a918d2223d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 96532, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '75336d84-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '729df0c737e82b2e4b47590e39e7705073e9dc1c1f3d9a08e0cad37f23c43ff5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 124833, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '753375fe-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '1e44f106ecf26505fba42a64f064f6beab216b22a5494b01b9a247e386c9cb97'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 58141, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '75337dce-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'29fcb04c0822ec27533d0141098d0fe43a0422062c6680fc2c659b9ee246d6e0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 39681, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id':
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: c4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.376289', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '75338562-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '810dd139b032741179bdb1dd5a804c1ccc830ef39b0af8292b6ebbee73b4cfcc'}]}, 'timestamp': '2025-10-08 15:28:36.377631', '_unique_id': 'd4741ac4988341519c7e8ac8d1347dda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.378 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.latency volume: 10641334284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.379 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.379 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.latency volume: 12345381 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.379 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.379 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.latency volume: 8670973168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.379 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.380 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 12424899987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.380 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.380 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.latency volume: 18602335385 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.380 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '038be8d3-e76e-4070-9be6-75e1c48b113f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10641334284, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7533bc6c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'b0bb20797f94784c8f79fa9aa89165219d68c5f51cc21bc105239feebe529a4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': 
'7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7533c4d2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': '984edebb2a16c13f332719ff44387d5ec5bfdadac312d21460e88fee1141f4a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12345381, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7533cc5c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '388316bb25645aff59e56d6408c55cfb261338cc92aa9ca559effb22b597d992'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7533d3be-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': 'a0613fbae0548693177c3a91a694f6eef3a02593250afbaa3c2f9250de2bfe3b'}, 
{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8670973168, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7533dc2e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': 'ebc4c47a76660a7d2a4b5d5b7e025fe7600052549f9a2d38f64e0ed574fa380f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram':
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: te': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7533e3a4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '2369d72e57fae43ca880c7c5ad7d1f9d6ea92391b7dedbfa9a3d69233f8b3861'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12424899987, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7533ec28-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': 'dfd7153645b460571dae777d68006245941132dbf0c32d54f57632c079d4c197'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 
'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7533f394-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '9a6943a0fff1d7c6f74a6a12acaef05d4d41d92f9ff1514cb30decabe85a9989'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18602335385, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7533fb82-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': 'beb354aa31f4b91b9e2032b82e9fdf109caedf1c41dfcdefd626322fc41057d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.378822', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '753403c0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 
'message_signature': 'b9c9b80422f91d7517c1a47ecc6939a7f786fd497f6e8de42459ff60020912b6'}]}, 'timestamp': '2025-10-08 15:28:36.380860', '_unique_id': '02fcec50695b4841a03cf8aa989d395a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.382 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4383ef39-2569-4701-92dd-48f33b63672e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '75343e6c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '15b51150cd1e352a37252846bd53ac52bc8a95afaf2445e5d13e84464e719686'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '75344696-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': 'd2762e1bdb87f68be3b93ee7308d6eceabb7dd18352c94a649d29106c5b913e3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '75344eca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '0d6eb0374cdb2c42d5304942481fdb6a5f8cd50cebb4f35fbdc64ecbd4807d8f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '75345690-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'a8f1e007e337260cd037a8f1ac9364efae0251b34a76d90cbdcd31b03e049b93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '75345f32-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'165cf3254273ed2f5fde36ebfa8892528ad8a2c5494940920e26ceeab57937d2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: roject_name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.382148', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '75346770-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': 'e87392fdd4e1cbfde741bfcded6ddabe86214a4b573225aaba992f54c07b01ea'}]}, 'timestamp': '2025-10-08 15:28:36.383418', '_unique_id': '30c08b4d93b544ebbdfc8f0bae0798cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.384 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.384 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.385 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.385 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.385 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.385 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f29db007-752a-4482-8f06-fea9e47ce92c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '7534a0aa-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '44aa298722d34c2913793dd34771075d080a20a77d82e06eb788d05fb65c6dfe'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '7534a910-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '701e4dd089c69dab68d2e88cf2f734432db770a98c59abf4ec62a7362c309cc5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '7534b20c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'b8829e10dc65b464afedb1b831ed78ae7b003ed27bae98979c1e79f9b497665f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '7534b9b4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '38bf1cf3e83fac5d341feca6547a0d9877b08702b4101b2ff23a227d593fd6f1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '7534c15c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'6025524ce5123a87df29483542781969660913ebe8382c91937d9b5f79b6c1a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '843ea0278e174175
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: _name': None, 'resource_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.384660', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '7534c90e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '596d3ed6c144b90a85223aad461515e3685d239a424a3a3d7c0ef8af13f156e1'}]}, 'timestamp': '2025-10-08 15:28:36.385917', '_unique_id': '4108dd6f24a0483ba79ddb4a4e4e7d17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.requests volume: 718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.387 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.388 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.388 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.388 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.requests volume: 831 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.388 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.388 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 813 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.389 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.389 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.requests volume: 777 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.389 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e60f5ee2-547c-486b-8cef-184fc2853f00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 718, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75350d7e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'd232532065201a8c74489e240a7ed54f2f4c7c870a37fa3664042f4569bcf046'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': 
'7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75351bc0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': 'e6a95373e0cf1d1f18717b50c66682c3439da570b551e63d2b76cbc49ca4331c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75352462-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '2a9bdc47a6e31c00d1ca0d5720b129ad2cebdfd3fab2ce8f4e7b8c784b6b0b95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75352bec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': 'cbb1bc7a386cb3ee80a09f9691ea9bc5eb3e5726b321262ef37555377808be54'}, 
{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 831, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7535342a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '99aa97e562df3d5b30de7010213ada560ff59e230c8ffa645b386b892bb00dde'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcp
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: ng', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75353c68-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '7bc6db9199d4f052db721bdbcfeb119815d7ec66157248c7ce19087b3d4370b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 813, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '753543ca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '56dbcd65f4c7be57c7d0c2d268ba0774cb8de111a5baa308f156b63ff5456ca5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 
'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75355036-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': 'a6bd3fdfd6694dff82b6b56ad10c4e39be2cb20f003bc911859019796539561a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 777, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75355838-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': '8200b823529b006c90689260bb3506085f1ba5dcc216e607c4cdc4ad884f031d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.387455', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75355f72-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 
'message_signature': 'c52eabdda56fe707dbdf251cd803bed0708b8bd80359bcc1d6d69df41c8f113a'}]}, 'timestamp': '2025-10-08 15:28:36.389760', '_unique_id': '1216bf985a604d9391449c015547ea7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.390 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.391 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.requests volume: 11695 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.391 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.391 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.requests volume: 5688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.391 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.391 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.requests volume: 11679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.392 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.392 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 11711 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.392 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.392 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.requests volume: 11679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.392 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8465ebf5-4fe9-4042-9d16-da29032c6976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11695, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75359866-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': '6b7c24dce90647584d1bee320588dbb767571234a7cb422427f6f97444bbf1ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': 
'7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75359ffa-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.769452119, 'message_signature': '90a02faf69fcfd067930d88d0c86214ff1658ab67215d0156bf503fa2cd06e6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5688, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7535a73e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': '0a9e0ffeffde6a92bee9d27ec19802630caf49480e0ecd29b354bdc162279a02'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7535ae78-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.816128358, 'message_signature': 'ee23fe22553ad10bf78b02cac79f59e5a0d0e55b10efb48c12f0df37c68fb465'}, 
{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11679, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7535b5ee-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '9ff1637857921baa6c8808d7da5400f57c65d2e51188636e490226cbe82d75ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': 
'6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest',
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7535bf94-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.845321744, 'message_signature': '7c46ddd499d1eeb37260f89256af1d2bbb886ebe2d110568d2ba3f19ce2451f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11711, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7535c7b4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': 'b3e60c97450fb1c072d761e492daf362c90a4e7974b1b5113a0702254ed38013'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7535d010-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.873361293, 'message_signature': '401f7709bc5109d2554da1cd50e0eda69c4e1c38d8dc746edd6b80939e2bb55e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11679, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 
'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7535d790-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': '23aceab907529c4d497eff42bb6c8cc73d2140b8e2a2cc123aa09be0acb2e802'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.390994', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 
'7535df4c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.892591992, 'message_signature': '2b7ed77efc647949e52dcbbf8104bb758195d928462ae3454e4d942e27c5175a'}]}, 'timestamp': '2025-10-08 15:28:36.393036', '_unique_id': '4b583f94f9f749f29bbce2d19e949231'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.394 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.394 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.outgoing.packets volume: 235 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.394 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.395 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets volume: 1164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.395 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.packets volume: 435 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.395 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.packets volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.395 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.packets volume: 223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18c881f2-25a6-4207-b238-844d204b8b8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 235, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '75362650-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '037e6fb219e4e3dc0b22512e1cd729a7d902900946f7413df47616b6c3bc68f4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '75362eca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '78fb1c39704c08776afe5d08b694203cc683dd441bd63b95fe577c28363843b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1164, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '75363762-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'fe0f6fb4d6db7fc75bf8ccfe0f0554ae28d43e86e922d8be4fe15108c752223a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 435, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '75363f32-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': '5b153dc123508d9c8817b7ce1e6defc4dce781c3d46f42844579a87e322122ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 312, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '75364702-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': 
'b8d4116506304442e8cd1eb04bfb0485cb763650029cb32cad0ef6d3729bcbc9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 223, 'user_id': '843ea0278e174175a6f8e21731c1383e', 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: urce_id': 'instance-0000001d-1cbc4434-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.394631', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '75364f2c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '151ae29f4d730e13b5854bce6783180600e18e304bf6480369b21e02bf26487f'}]}, 'timestamp': '2025-10-08 15:28:36.395915', '_unique_id': '0f70b798e3ee4180873769ac22f66f95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.397 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.398 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes.delta volume: 96532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.398 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.398 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/network.outgoing.bytes.delta volume: 18766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.398 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/network.outgoing.bytes.delta volume: 1812 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '430fbfb8-d62a-4c1b-bb7e-ffea67c2eb44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000023-63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-tap92e6817e-73', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'tap92e6817e-73', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6b:88:09', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap92e6817e-73'}, 'message_id': '753695a4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.742040441, 'message_signature': '3a3e4157718de3e0b6a9b2326206031ea73ccaa0c80485490ffafbb879fc97af'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 
0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-00000025-020f3abc-b9cd-43d6-81f9-4464a8d20207-tap902f5462-63', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'tap902f5462-63', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:97:62:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap902f5462-63'}, 'message_id': '75369fc2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.746400989, 'message_signature': '8c54ca853595eab009387b7a3fd876f27e5614399526f95ce6fe0a2d929af6f5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 96532, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tapb5af459f-56', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tapb5af459f-56', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 
'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4e:90:51', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb5af459f-56'}, 'message_id': '7536ab7a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'f4b717a1b9b00d8995ae7d598835131821c8c0f7f4f261a5465e3ac5c9428d93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000021-7f1808f3-5a79-4149-84d1-7bc21eefa497-tap7ad20ed3-85', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'tap7ad20ed3-85', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:62:ec:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad20ed3-85'}, 'message_id': '7536b5de-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.750113617, 'message_signature': 'bd8b9ecbfb98830216a56983a36320d856cc028a7af7819749a32dae8005ef98'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 18766, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': 'instance-00000014-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-tap0bb60f77-cd', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'tap0bb60f77-cd', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:a6:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0bb60f77-cd'}, 'message_id': '7536be3a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.756487788, 'message_signature': '222a41ece4d3a3072ac057c423545f761f5677e042ce3de79a545a9ca287d50e'}, {'source': 'openstack', 
'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1812, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 4-d89a-483d-a1f2-299190262888-tap020c7187-87', 'timestamp': '2025-10-08T15:28:36.397478', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'tap020c7187-87', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8b:42:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap020c7187-87'}, 'message_id': '7536c628-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.761202618, 'message_signature': '289c1477ab6dd408126a180a4b4cda513087aa747c35a920557b0a50b228b6dc'}]}, 'timestamp': '2025-10-08 15:28:36.398951', '_unique_id': 'c0e5abf42acb40adb5f90d8c7b6d96f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.400 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.400 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/memory.usage volume: 256.109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.400 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.400 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 020f3abc-b9cd-43d6-81f9-4464a8d20207: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.400 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/memory.usage volume: 241.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/memory.usage volume: 248.44921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/memory.usage volume: 244.07421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c79583ce-9e4d-498b-86f8-6fb47248be5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 256.109375, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'timestamp': '2025-10-08T15:28:36.400369', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '7537070a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.033820288, 'message_signature': 'b64cfdd4da334a2eb7a5bfbb64b07326219dbe4b01bb3eed2474983a4d836c7e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 241.38671875, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 
'7f1808f3-5a79-4149-84d1-7bc21eefa497', 'timestamp': '2025-10-08T15:28:36.400369', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '7537168c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.068514858, 'message_signature': 'fbd433643be8d77d5ca90d33dbfc46c60b4598571e3eab491eedc7b9bfa265ca'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 248.44921875, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'timestamp': '2025-10-08T15:28:36.400369', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '75371f6a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.083641198, 'message_signature': '8227595b3f1ffeeb995d434d39ab1bd1f41678cbf99addeebe4167f3716677d3'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 244.07421875, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'timestamp': '2025-10-08T15:28:36.400369', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '753726f4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4220.096853786, 'message_signature': '22ae1abc77c5f71681345f05f962609e853655994d8b54a35adfc34256107d00'}]}, 'timestamp': '2025-10-08 15:28:36.401422', '_unique_id': 'ba0fe9fc9ca6486aa63c63c981f36e2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.401 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.402 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.402 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.403 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.403 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.403 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.403 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.403 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.404 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.404 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.404 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3077290a-2222-4d0c-be29-61d74d3609d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75375ee4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': '34c75bf4808983aec79081dd0ab8dcc804e19e4aa5084ac0574bae510cf52a87'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 
'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '753766b4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': 'c2ff453956f96329fb3e346f4809c21c305463953d5d60f040c164c10bf48bec'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75376ee8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': 'ee1d1dc1a22c9353dd43cf58009b89a300ad4e3a9b0c207c3448ad84e302d149'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7537765e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': '7ec6a8a4f9f857c616eb7b5872191a40a247b1c0346dee9cbce0aa439ce03119'}, {'source': 'openstack', 'counter_name': 
'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75377dac-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': '17f2220f19e4822bfed1b18b063040374caa764a63885b9e13e290efd0cdde45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 11-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75378658-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': '9e38d3cf80db67e7cbb525b47b0af33b76eae9a597e8b611cc12a47b74343779'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75378dec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': '86e404c9e939c95f0725e6adca2d4f895bcd3c58b5a65e699fa149a149da12eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 
'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '753795c6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': '31c69454414d012931382733da1a2f2d12020301928fd65a0167e04580597321'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 
'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75379d28-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': 'fd5d33b39de654e0ed2c6c9eae0df437fb92f96b54b9f7b3eac9e07b6eee752e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.402639', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7537a502-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': 'c7df6cffd34941030e5c8165f7514224624adc9ca863006595946f6a63a78bcf'}]}, 'timestamp': 
'2025-10-08 15:28:36.404662', '_unique_id': 'b4524cf499f54f3f80d9ea832f9214ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>]
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.406 12 DEBUG ceilometer.compute.pollsters [-] 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.407 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.407 12 DEBUG ceilometer.compute.pollsters [-] 020f3abc-b9cd-43d6-81f9-4464a8d20207/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.407 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.allocation volume: 153100288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.407 12 DEBUG ceilometer.compute.pollsters [-] 7f1808f3-5a79-4149-84d1-7bc21eefa497/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.408 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 161484800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.408 12 DEBUG ceilometer.compute.pollsters [-] 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.408 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.allocation volume: 169873408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.408 12 DEBUG ceilometer.compute.pollsters [-] 1cbc4434-d89a-483d-a1f2-299190262888/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b07abe01-9a53-4d94-ab72-bb803038d02e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-vda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '7537f76e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': 'cd3b869cac922861d7a75b6a0681957fc777b6bf98e74b0c43f500c8f41e23da'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 
'project_name': None, 'resource_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-sda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-124-44875693', 'name': 'instance-00000023', 'instance_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75380006-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.917357127, 'message_signature': '45484e046e4bb7fb55dc5ad263376d7a2a032dbf0c228e49562316f8d20ee17f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-vda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75380b64-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': '5ef4b252c4e1da99f889e2f9ee2d3ca0a29b48b8925dbd0bee84b29cdf2749ab'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207-sda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'name': 'instance-00000025', 'instance_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75381564-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.931369981, 'message_signature': '216b53fa421cf6264e23a9431ca059f91e701237321a3f81ced485cdc8a9d96b'}, {'source': 'openstack', 'counter_name': 
'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153100288, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-vda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75381eba-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': '59f5643939b2578be042914945728745d4d8698986e2ba955dbfa1f0a8caa396'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497-sda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_east_west-350327070', 'name': 'instance-00000021', 'instance_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 1-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75382810-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.944156146, 'message_signature': 'bb26cab1e99a0fce466ef2eb47062c39ffda6cdd65c9c4cb578d1e1ea7f1a4c3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 161484800, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-vda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75383206-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': 'b0b0ab529edfc6b8388c85b4123c473bee9d192843896d41ed96f77f5c7a251a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 
'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-sda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-receiver-123-1908290520', 'name': 'instance-00000014', 'instance_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '75383c9c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.959825613, 'message_signature': 'af4955c1831e129c85bcc93982f23d326afdd70c87e3ee8f705ada9fa5026b41'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169873408, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-vda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 
'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '75384c96-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': 'c700432f82c8b3bd32bafcc6147d4424cbb5c15a4ecbefb2415886b77d6a075d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '843ea0278e174175a6f8e21731c1383e', 'user_name': None, 'project_id': '7e1086961263487db8a3c5190fdf1b2e', 'project_name': None, 'resource_id': '1cbc4434-d89a-483d-a1f2-299190262888-sda', 'timestamp': '2025-10-08T15:28:36.406533', 'resource_metadata': {'display_name': 'tempest-broadcast-sender-124-598361755', 'name': 'instance-0000001d', 'instance_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'instance_type': 'custom_neutron_guest', 'host': 'a0842dd7275788009a6adc21785485cd3bb1f9b75433f2d188e654fb', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '7538545c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4219.969969974, 'message_signature': 'e41374d3b3bc618b6664492770f05b09513b1334ef14745567bcb386c8eeacea'}]}, 'timestamp': 
'2025-10-08 15:28:36.409161', '_unique_id': '29569bf1382a4fcaa1a9cf3310f837a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.410 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:28:36.410 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-broadcast-receiver-124-44875693>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful>]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.378 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.381 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.383 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.386 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.390 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.393 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.396 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.399 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:36Z|00313|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:28:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:36Z|00314|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:28:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:36Z|00315|binding|INFO|Releasing lport c8b535f4-c584-42f9-a77f-2ed6ee5f2aaa from this chassis (sb_readonly=0)
Oct  8 11:28:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:36Z|00316|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.405 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:28:36.409 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:28:36 np0005476733 nova_compute[192580]: 2025-10-08 15:28:36.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:37.213 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:28:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:37.213 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:28:37 np0005476733 nova_compute[192580]: 2025-10-08 15:28:37.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:38 np0005476733 nova_compute[192580]: 2025-10-08 15:28:38.992 2 INFO nova.compute.manager [None req-bc69962d-b692-4504-b33b-a7fafdf389ee 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:38 np0005476733 nova_compute[192580]: 2025-10-08 15:28:38.998 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.733 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.733 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.734 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.734 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.847 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.921 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.922 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.983 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:39 np0005476733 nova_compute[192580]: 2025-10-08 15:28:39.989 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.040 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.043 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.125 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.131 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.189 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.190 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.251 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.258 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.316 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.317 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.392 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.397 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.454 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.455 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.508 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.703 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.704 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=10101MB free_disk=110.74222183227539GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.705 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.705 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.811 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.812 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 1cbc4434-d89a-483d-a1f2-299190262888 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.812 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7f1808f3-5a79-4149-84d1-7bc21eefa497 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.812 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.813 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 020f3abc-b9cd-43d6-81f9-4464a8d20207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.813 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.813 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=5632MB phys_disk=119GB used_disk=50GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.931 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.964 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.988 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:28:40 np0005476733 nova_compute[192580]: 2025-10-08 15:28:40.989 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:41 np0005476733 nova_compute[192580]: 2025-10-08 15:28:41.989 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:42 np0005476733 podman[230022]: 2025-10-08 15:28:42.228557444 +0000 UTC m=+0.054801748 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct  8 11:28:43 np0005476733 nova_compute[192580]: 2025-10-08 15:28:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:44 np0005476733 nova_compute[192580]: 2025-10-08 15:28:44.139 2 INFO nova.compute.manager [None req-22e2e6d4-bd5f-4a4d-97b0-c6bba60da673 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:44 np0005476733 nova_compute[192580]: 2025-10-08 15:28:44.144 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:44 np0005476733 nova_compute[192580]: 2025-10-08 15:28:44.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:44 np0005476733 podman[230053]: 2025-10-08 15:28:44.240407458 +0000 UTC m=+0.062042107 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd)
Oct  8 11:28:44 np0005476733 podman[230054]: 2025-10-08 15:28:44.244813827 +0000 UTC m=+0.060629482 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:28:44 np0005476733 nova_compute[192580]: 2025-10-08 15:28:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:44 np0005476733 nova_compute[192580]: 2025-10-08 15:28:44.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:45.215 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:45 np0005476733 nova_compute[192580]: 2025-10-08 15:28:45.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:45Z|00317|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:28:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:45Z|00318|binding|INFO|Releasing lport b563ca05-c871-4f0e-9980-177237a3f88d from this chassis (sb_readonly=0)
Oct  8 11:28:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:45Z|00319|binding|INFO|Releasing lport c8b535f4-c584-42f9-a77f-2ed6ee5f2aaa from this chassis (sb_readonly=0)
Oct  8 11:28:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:45Z|00320|binding|INFO|Releasing lport 9e6f9f1a-9b45-47d5-b171-40ef2fcda78c from this chassis (sb_readonly=0)
Oct  8 11:28:46 np0005476733 nova_compute[192580]: 2025-10-08 15:28:46.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:47 np0005476733 nova_compute[192580]: 2025-10-08 15:28:47.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:47 np0005476733 nova_compute[192580]: 2025-10-08 15:28:47.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:28:48 np0005476733 nova_compute[192580]: 2025-10-08 15:28:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:49 np0005476733 nova_compute[192580]: 2025-10-08 15:28:49.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:49 np0005476733 nova_compute[192580]: 2025-10-08 15:28:49.366 2 INFO nova.compute.manager [None req-eb0326e0-398a-4910-8519-2a69f75cbf86 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:49 np0005476733 nova_compute[192580]: 2025-10-08 15:28:49.378 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:49 np0005476733 nova_compute[192580]: 2025-10-08 15:28:49.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:51Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:62:62 192.168.3.171
Oct  8 11:28:51 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:51Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:62:62 192.168.3.171
Oct  8 11:28:52 np0005476733 nova_compute[192580]: 2025-10-08 15:28:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:52 np0005476733 nova_compute[192580]: 2025-10-08 15:28:52.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:28:52 np0005476733 nova_compute[192580]: 2025-10-08 15:28:52.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:28:53 np0005476733 nova_compute[192580]: 2025-10-08 15:28:53.138 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:28:53 np0005476733 nova_compute[192580]: 2025-10-08 15:28:53.139 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:28:53 np0005476733 nova_compute[192580]: 2025-10-08 15:28:53.139 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:28:53 np0005476733 nova_compute[192580]: 2025-10-08 15:28:53.139 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:28:53 np0005476733 podman[230100]: 2025-10-08 15:28:53.221991133 +0000 UTC m=+0.054375214 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 11:28:53 np0005476733 podman[230101]: 2025-10-08 15:28:53.222286753 +0000 UTC m=+0.053940361 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.673 2 INFO nova.compute.manager [None req-165aebf5-9455-44b1-93d8-001047629d34 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.678 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.833 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [{"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.848 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.848 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:28:54 np0005476733 nova_compute[192580]: 2025-10-08 15:28:54.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:55 np0005476733 nova_compute[192580]: 2025-10-08 15:28:55.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.677 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.678 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.678 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.679 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.679 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.681 2 INFO nova.compute.manager [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Terminating instance#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.683 2 DEBUG nova.compute.manager [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:28:58 np0005476733 kernel: tap92e6817e-73 (unregistering): left promiscuous mode
Oct  8 11:28:58 np0005476733 NetworkManager[51699]: <info>  [1759937338.7094] device (tap92e6817e-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:28:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:58Z|00321|binding|INFO|Releasing lport 92e6817e-732a-4e42-973e-2d26e62163f5 from this chassis (sb_readonly=0)
Oct  8 11:28:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:58Z|00322|binding|INFO|Setting lport 92e6817e-732a-4e42-973e-2d26e62163f5 down in Southbound
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:28:58Z|00323|binding|INFO|Removing iface tap92e6817e-73 ovn-installed in OVS
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.801 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:88:09 10.100.0.7'], port_security=['fa:16:3e:6b:88:09 10.100.0.7 192.168.111.15/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '63718bc7-c79a-49a4-a0f2-bb47aa50f5b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=92e6817e-732a-4e42-973e-2d26e62163f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.804 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 92e6817e-732a-4e42-973e-2d26e62163f5 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a unbound from our chassis#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.807 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a77f8cd-4394-4cb0-a8a1-33872549758a#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.827 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32272ea3-b413-4d57-8073-06e8bcbe5df2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000023.scope: Deactivated successfully.
Oct  8 11:28:58 np0005476733 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000023.scope: Consumed 46.982s CPU time.
Oct  8 11:28:58 np0005476733 systemd-machined[152624]: Machine qemu-22-instance-00000023 terminated.
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.861 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1224855c-295b-4983-aa60-98c0fcec85f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.864 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[026250a8-7089-4783-9bf8-bbbf863b1705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.894 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[00c6cf7b-edd3-496e-ba8d-e78c6adf9029]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.914 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32bf5a06-bdf5-42b8-902d-184c34f440db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 9, 'rx_bytes': 1718, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 9, 'rx_bytes': 1718, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 23320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230155, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.933 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49a5f54f-8d83-4f3c-9d56-5ddcf8e27a4f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389931, 'tstamp': 389931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230162, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389935, 'tstamp': 389935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230162, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.935 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.944 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a77f8cd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.944 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.945 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a77f8cd-40, col_values=(('external_ids', {'iface-id': 'b563ca05-c871-4f0e-9980-177237a3f88d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:28:58.945 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.947 2 INFO nova.virt.libvirt.driver [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Instance destroyed successfully.#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.947 2 DEBUG nova.objects.instance [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'resources' on Instance uuid 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.964 2 DEBUG nova.virt.libvirt.vif [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:27:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-broadcast-receiver-124-44875693',display_name='tempest-broadcast-receiver-124-44875693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-124-44875693',id=35,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:27:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-p4w6uocs',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:27:34Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=63718bc7-c79a-49a4-a0f2-bb47aa50f5b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": 
"tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.965 2 DEBUG nova.network.os_vif_util [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "92e6817e-732a-4e42-973e-2d26e62163f5", "address": "fa:16:3e:6b:88:09", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e6817e-73", "ovs_interfaceid": "92e6817e-732a-4e42-973e-2d26e62163f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.966 2 DEBUG nova.network.os_vif_util [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.966 2 DEBUG os_vif [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92e6817e-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.974 2 INFO os_vif [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:88:09,bridge_name='br-int',has_traffic_filtering=True,id=92e6817e-732a-4e42-973e-2d26e62163f5,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap92e6817e-73')#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.975 2 INFO nova.virt.libvirt.driver [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Deleting instance files /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6_del#033[00m
Oct  8 11:28:58 np0005476733 nova_compute[192580]: 2025-10-08 15:28:58.976 2 INFO nova.virt.libvirt.driver [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Deletion of /var/lib/nova/instances/63718bc7-c79a-49a4-a0f2-bb47aa50f5b6_del complete#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.049 2 INFO nova.compute.manager [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.049 2 DEBUG oslo.service.loopingcall [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.050 2 DEBUG nova.compute.manager [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.050 2 DEBUG nova.network.neutron [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.195 2 DEBUG nova.compute.manager [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-vif-unplugged-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.196 2 DEBUG oslo_concurrency.lockutils [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.196 2 DEBUG oslo_concurrency.lockutils [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.196 2 DEBUG oslo_concurrency.lockutils [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.197 2 DEBUG nova.compute.manager [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] No waiting events found dispatching network-vif-unplugged-92e6817e-732a-4e42-973e-2d26e62163f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.197 2 DEBUG nova.compute.manager [req-92d3c060-d326-4614-a0ea-d7b9b95daf55 req-efdeb8e9-fbd5-4eeb-9445-e1c7187d9174 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-vif-unplugged-92e6817e-732a-4e42-973e-2d26e62163f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.883 2 INFO nova.compute.manager [None req-adfd4fe5-8bfd-4850-a950-d02c6d9f5232 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.889 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:28:59 np0005476733 nova_compute[192580]: 2025-10-08 15:28:59.893 2 INFO nova.virt.libvirt.driver [None req-adfd4fe5-8bfd-4850-a950-d02c6d9f5232 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Truncated console log returned, 3681 bytes ignored#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.558 2 DEBUG nova.network.neutron [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.583 2 INFO nova.compute.manager [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Took 1.53 seconds to deallocate network for instance.#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.629 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.630 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.800 2 DEBUG nova.compute.provider_tree [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.820 2 DEBUG nova.scheduler.client.report [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.848 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.886 2 INFO nova.scheduler.client.report [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Deleted allocations for instance 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6#033[00m
Oct  8 11:29:00 np0005476733 nova_compute[192580]: 2025-10-08 15:29:00.965 2 DEBUG oslo_concurrency.lockutils [None req-00bc7583-981d-4f0e-af94-2aed5fa3eb9c 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.347 2 DEBUG nova.compute.manager [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.347 2 DEBUG oslo_concurrency.lockutils [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.348 2 DEBUG oslo_concurrency.lockutils [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.348 2 DEBUG oslo_concurrency.lockutils [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "63718bc7-c79a-49a4-a0f2-bb47aa50f5b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.348 2 DEBUG nova.compute.manager [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] No waiting events found dispatching network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:01 np0005476733 nova_compute[192580]: 2025-10-08 15:29:01.348 2 WARNING nova.compute.manager [req-80f352a7-de3f-41c6-bf73-889cd9a2f401 req-8b003864-e2ff-42bf-b4d5-7de521d00393 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Received unexpected event network-vif-plugged-92e6817e-732a-4e42-973e-2d26e62163f5 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:29:02 np0005476733 podman[230173]: 2025-10-08 15:29:02.260229024 +0000 UTC m=+0.085554212 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:29:02 np0005476733 nova_compute[192580]: 2025-10-08 15:29:02.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:03 np0005476733 nova_compute[192580]: 2025-10-08 15:29:03.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:04 np0005476733 nova_compute[192580]: 2025-10-08 15:29:04.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:05 np0005476733 nova_compute[192580]: 2025-10-08 15:29:05.092 2 INFO nova.compute.manager [None req-9a933a72-d40c-4838-ae6f-630e4b7bc037 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Get console output#033[00m
Oct  8 11:29:05 np0005476733 nova_compute[192580]: 2025-10-08 15:29:05.098 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:05 np0005476733 nova_compute[192580]: 2025-10-08 15:29:05.102 2 INFO nova.virt.libvirt.driver [None req-9a933a72-d40c-4838-ae6f-630e4b7bc037 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Truncated console log returned, 3907 bytes ignored#033[00m
Oct  8 11:29:05 np0005476733 podman[230199]: 2025-10-08 15:29:05.233466918 +0000 UTC m=+0.057555954 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 11:29:05 np0005476733 podman[230198]: 2025-10-08 15:29:05.286936413 +0000 UTC m=+0.112335421 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.396 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.397 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.397 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.397 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.398 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.399 2 INFO nova.compute.manager [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Terminating instance#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.400 2 DEBUG nova.compute.manager [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:29:06 np0005476733 kernel: tap020c7187-87 (unregistering): left promiscuous mode
Oct  8 11:29:06 np0005476733 NetworkManager[51699]: <info>  [1759937346.4609] device (tap020c7187-87): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:29:06 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:06Z|00324|binding|INFO|Releasing lport 020c7187-878e-4336-a49d-ac40eb956ef6 from this chassis (sb_readonly=0)
Oct  8 11:29:06 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:06Z|00325|binding|INFO|Setting lport 020c7187-878e-4336-a49d-ac40eb956ef6 down in Southbound
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:06Z|00326|binding|INFO|Removing iface tap020c7187-87 ovn-installed in OVS
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.509 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:42:9f 10.100.0.9'], port_security=['fa:16:3e:8b:42:9f 10.100.0.9 192.168.111.13/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1cbc4434-d89a-483d-a1f2-299190262888', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=020c7187-878e-4336-a49d-ac40eb956ef6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.511 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 020c7187-878e-4336-a49d-ac40eb956ef6 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a unbound from our chassis#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.514 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a77f8cd-4394-4cb0-a8a1-33872549758a#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.529 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[36705309-be2b-454c-9509-8e6ac0adefdd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.555 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c5318406-e885-4bc6-98b3-b7ff2142a411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.559 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0c22b5f4-43a0-427a-a7d6-e12a8f8a97a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct  8 11:29:06 np0005476733 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001d.scope: Consumed 49.935s CPU time.
Oct  8 11:29:06 np0005476733 systemd-machined[152624]: Machine qemu-17-instance-0000001d terminated.
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.587 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[39e369df-7192-4009-b9f7-7c566842c409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.604 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebe6f14-2497-43e3-9c79-57e8dc1b8ad1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a77f8cd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:53:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 1718, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 1718, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389913, 'reachable_time': 23320, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230254, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.626 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc86ba3-7888-4cf1-8869-abf036d3f2db]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389931, 'tstamp': 389931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230256, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7a77f8cd-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389935, 'tstamp': 389935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230256, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.628 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.635 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a77f8cd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.635 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.635 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a77f8cd-40, col_values=(('external_ids', {'iface-id': 'b563ca05-c871-4f0e-9980-177237a3f88d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:06.636 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.671 2 INFO nova.virt.libvirt.driver [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Instance destroyed successfully.#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.672 2 DEBUG nova.objects.instance [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'resources' on Instance uuid 1cbc4434-d89a-483d-a1f2-299190262888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.691 2 DEBUG nova.virt.libvirt.vif [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:25:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-broadcast-sender-124-598361755',display_name='tempest-broadcast-sender-124-598361755',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-sender-124-598361755',id=29,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:25:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-yda0bge1',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:25:16Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1cbc4434-d89a-483d-a1f2-299190262888,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": 
"tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.692 2 DEBUG nova.network.os_vif_util [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "020c7187-878e-4336-a49d-ac40eb956ef6", "address": "fa:16:3e:8b:42:9f", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap020c7187-87", "ovs_interfaceid": "020c7187-878e-4336-a49d-ac40eb956ef6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.693 2 DEBUG nova.network.os_vif_util [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.693 2 DEBUG os_vif [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap020c7187-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.701 2 INFO os_vif [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:42:9f,bridge_name='br-int',has_traffic_filtering=True,id=020c7187-878e-4336-a49d-ac40eb956ef6,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap020c7187-87')#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.702 2 INFO nova.virt.libvirt.driver [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Deleting instance files /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888_del#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.703 2 INFO nova.virt.libvirt.driver [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Deletion of /var/lib/nova/instances/1cbc4434-d89a-483d-a1f2-299190262888_del complete#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.753 2 INFO nova.compute.manager [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.754 2 DEBUG oslo.service.loopingcall [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.754 2 DEBUG nova.compute.manager [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:29:06 np0005476733 nova_compute[192580]: 2025-10-08 15:29:06.755 2 DEBUG nova.network.neutron [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.846 2 DEBUG nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-vif-unplugged-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.848 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.849 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.850 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.851 2 DEBUG nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] No waiting events found dispatching network-vif-unplugged-020c7187-878e-4336-a49d-ac40eb956ef6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.851 2 DEBUG nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-vif-unplugged-020c7187-878e-4336-a49d-ac40eb956ef6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.852 2 DEBUG nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.853 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1cbc4434-d89a-483d-a1f2-299190262888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.853 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.854 2 DEBUG oslo_concurrency.lockutils [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.855 2 DEBUG nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] No waiting events found dispatching network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:07 np0005476733 nova_compute[192580]: 2025-10-08 15:29:07.855 2 WARNING nova.compute.manager [req-83b0b584-fe6e-4589-96ee-8f0e0e6ffa2f req-d56484c7-bd7c-4ca5-b6a8-a979e9f8e946 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Received unexpected event network-vif-plugged-020c7187-878e-4336-a49d-ac40eb956ef6 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.120 2 DEBUG nova.network.neutron [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.154 2 INFO nova.compute.manager [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Took 1.40 seconds to deallocate network for instance.#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.194 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.194 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.333 2 DEBUG nova.compute.provider_tree [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.368 2 DEBUG nova.scheduler.client.report [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.416 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.476 2 INFO nova.scheduler.client.report [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Deleted allocations for instance 1cbc4434-d89a-483d-a1f2-299190262888#033[00m
Oct  8 11:29:08 np0005476733 nova_compute[192580]: 2025-10-08 15:29:08.586 2 DEBUG oslo_concurrency.lockutils [None req-0545c7df-eaa3-4c42-b5f6-26ff9d7f71e0 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1cbc4434-d89a-483d-a1f2-299190262888" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:09 np0005476733 nova_compute[192580]: 2025-10-08 15:29:09.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:09 np0005476733 nova_compute[192580]: 2025-10-08 15:29:09.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.013 2 DEBUG nova.compute.manager [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-changed-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.014 2 DEBUG nova.compute.manager [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Refreshing instance network info cache due to event network-changed-902f5462-63af-4928-a517-b67d158bf2c2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.014 2 DEBUG oslo_concurrency.lockutils [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.014 2 DEBUG oslo_concurrency.lockutils [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.015 2 DEBUG nova.network.neutron [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Refreshing network info cache for port 902f5462-63af-4928-a517-b67d158bf2c2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.845 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.846 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.847 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.847 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.847 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.848 2 INFO nova.compute.manager [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Terminating instance#033[00m
Oct  8 11:29:10 np0005476733 nova_compute[192580]: 2025-10-08 15:29:10.849 2 DEBUG nova.compute.manager [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:29:11 np0005476733 nova_compute[192580]: 2025-10-08 15:29:11.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:11 np0005476733 kernel: tap902f5462-63 (unregistering): left promiscuous mode
Oct  8 11:29:11 np0005476733 NetworkManager[51699]: <info>  [1759937351.9786] device (tap902f5462-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:29:11 np0005476733 nova_compute[192580]: 2025-10-08 15:29:11.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:11Z|00327|binding|INFO|Releasing lport 902f5462-63af-4928-a517-b67d158bf2c2 from this chassis (sb_readonly=0)
Oct  8 11:29:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:11Z|00328|binding|INFO|Setting lport 902f5462-63af-4928-a517-b67d158bf2c2 down in Southbound
Oct  8 11:29:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:11Z|00329|binding|INFO|Removing iface tap902f5462-63 ovn-installed in OVS
Oct  8 11:29:11 np0005476733 nova_compute[192580]: 2025-10-08 15:29:11.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:11.996 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:62:62 192.168.3.171 2001:3::6b'], port_security=['fa:16:3e:97:62:62 192.168.3.171 2001:3::6b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'neutron:cidrs': '192.168.3.171/24 2001:3::6b/64', 'neutron:device_id': '020f3abc-b9cd-43d6-81f9-4464a8d20207', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c4aa983-f923-4017-9479-47738c5b827f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7326b02d-cc3d-4e8f-b8b4-4ee71b4d8a00, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=902f5462-63af-4928-a517-b67d158bf2c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:11.999 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 902f5462-63af-4928-a517-b67d158bf2c2 in datapath 6c4aa983-f923-4017-9479-47738c5b827f unbound from our chassis#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.006 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c4aa983-f923-4017-9479-47738c5b827f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.007 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b2e241-7ac5-4cc3-a2f8-4ea0ff6fd151]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.009 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f namespace which is not needed anymore#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct  8 11:29:12 np0005476733 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000025.scope: Consumed 45.617s CPU time.
Oct  8 11:29:12 np0005476733 systemd-machined[152624]: Machine qemu-23-instance-00000025 terminated.
Oct  8 11:29:12 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [NOTICE]   (229909) : haproxy version is 2.8.14-c23fe91
Oct  8 11:29:12 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [NOTICE]   (229909) : path to executable is /usr/sbin/haproxy
Oct  8 11:29:12 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [ALERT]    (229909) : Current worker (229911) exited with code 143 (Terminated)
Oct  8 11:29:12 np0005476733 neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f[229905]: [WARNING]  (229909) : All workers exited. Exiting... (0)
Oct  8 11:29:12 np0005476733 systemd[1]: libpod-a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1.scope: Deactivated successfully.
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.337 2 INFO nova.virt.libvirt.driver [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Instance destroyed successfully.#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.338 2 DEBUG nova.objects.instance [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'resources' on Instance uuid 020f3abc-b9cd-43d6-81f9-4464a8d20207 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:12 np0005476733 podman[230311]: 2025-10-08 15:29:12.34682851 +0000 UTC m=+0.208133887 container died a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.357 2 DEBUG nova.virt.libvirt.vif [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:28:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=37,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:28:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-8k2l0w6s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:28:26Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=020f3abc-b9cd-43d6-81f9-4464a8d20207,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": 
"tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.358 2 DEBUG nova.network.os_vif_util [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.358 2 DEBUG nova.network.os_vif_util [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.359 2 DEBUG os_vif [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.360 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap902f5462-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.366 2 INFO os_vif [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:62,bridge_name='br-int',has_traffic_filtering=True,id=902f5462-63af-4928-a517-b67d158bf2c2,network=Network(6c4aa983-f923-4017-9479-47738c5b827f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap902f5462-63')#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.367 2 INFO nova.virt.libvirt.driver [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Deleting instance files /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207_del#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.367 2 INFO nova.virt.libvirt.driver [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Deletion of /var/lib/nova/instances/020f3abc-b9cd-43d6-81f9-4464a8d20207_del complete#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.393 2 DEBUG nova.compute.manager [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-vif-unplugged-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.393 2 DEBUG oslo_concurrency.lockutils [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.394 2 DEBUG oslo_concurrency.lockutils [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.394 2 DEBUG oslo_concurrency.lockutils [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.394 2 DEBUG nova.compute.manager [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] No waiting events found dispatching network-vif-unplugged-902f5462-63af-4928-a517-b67d158bf2c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.395 2 DEBUG nova.compute.manager [req-51a210ae-c187-463a-87cc-1f7297c028a6 req-4f2ad7f2-d510-4ee5-8d44-d73f6d654f80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-vif-unplugged-902f5462-63af-4928-a517-b67d158bf2c2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.418 2 INFO nova.compute.manager [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Took 1.57 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.419 2 DEBUG oslo.service.loopingcall [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.419 2 DEBUG nova.compute.manager [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.420 2 DEBUG nova.network.neutron [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:29:12 np0005476733 systemd[1]: var-lib-containers-storage-overlay-8f02d9f893e26da7b5f0788e1e2f8a4f3fb154509e7fa16576e38dcb809cb94d-merged.mount: Deactivated successfully.
Oct  8 11:29:12 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1-userdata-shm.mount: Deactivated successfully.
Oct  8 11:29:12 np0005476733 podman[230311]: 2025-10-08 15:29:12.564370465 +0000 UTC m=+0.425675842 container cleanup a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:29:12 np0005476733 systemd[1]: libpod-conmon-a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1.scope: Deactivated successfully.
Oct  8 11:29:12 np0005476733 podman[230338]: 2025-10-08 15:29:12.588211811 +0000 UTC m=+0.239703568 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Oct  8 11:29:12 np0005476733 podman[230376]: 2025-10-08 15:29:12.706887692 +0000 UTC m=+0.117310239 container remove a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.713 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73b90ada-756b-4ce9-99f6-19ecc12513a1]: (4, ('Wed Oct  8 03:29:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f (a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1)\na320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1\nWed Oct  8 03:29:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f (a320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1)\na320f314a428392da87df7d4f7b6f362bb9a2f6a48136aaa4675bccac0cfa1a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.715 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[177d5a6d-7379-43fe-aa9e-d1275d210d71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.716 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c4aa983-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.717 2 DEBUG nova.network.neutron [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updated VIF entry in instance network info cache for port 902f5462-63af-4928-a517-b67d158bf2c2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.717 2 DEBUG nova.network.neutron [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updating instance_info_cache with network_info: [{"id": "902f5462-63af-4928-a517-b67d158bf2c2", "address": "fa:16:3e:97:62:62", "network": {"id": "6c4aa983-f923-4017-9479-47738c5b827f", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateful", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.171", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:3::/64", "dns": [], "gateway": {"address": "2001:3::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:3::6b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateful"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap902f5462-63", "ovs_interfaceid": "902f5462-63af-4928-a517-b67d158bf2c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:12 np0005476733 kernel: tap6c4aa983-f0: left promiscuous mode
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.724 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa47746-2671-4540-ad1a-7ba6eb583cf9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.765 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e0a7bca8-7baa-4e5d-a17c-7154693551aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.766 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[47d5d926-bbca-4a4a-86c1-6db812e0f3d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.783 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0559f0d9-191f-428a-8448-77489e29ac15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420942, 'reachable_time': 26904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230391, 'error': None, 'target': 'ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.786 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c4aa983-f923-4017-9479-47738c5b827f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:29:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:12.787 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[cc56f480-25a1-4a59-b705-7185905fd298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:12 np0005476733 systemd[1]: run-netns-ovnmeta\x2d6c4aa983\x2df923\x2d4017\x2d9479\x2d47738c5b827f.mount: Deactivated successfully.
Oct  8 11:29:12 np0005476733 nova_compute[192580]: 2025-10-08 15:29:12.799 2 DEBUG oslo_concurrency.lockutils [req-32e13911-e9c8-44c6-8632-49812efc8618 req-8ec84cbb-65d0-49e9-8ea2-f47182141c0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-020f3abc-b9cd-43d6-81f9-4464a8d20207" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:29:13 np0005476733 nova_compute[192580]: 2025-10-08 15:29:13.944 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937338.9431295, 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:13 np0005476733 nova_compute[192580]: 2025-10-08 15:29:13.945 2 INFO nova.compute.manager [-] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.122 2 DEBUG nova.compute.manager [None req-7a96ea3e-7ca4-4257-96d8-9de944091ca7 - - - - - -] [instance: 63718bc7-c79a-49a4-a0f2-bb47aa50f5b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.584 2 DEBUG nova.compute.manager [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.585 2 DEBUG oslo_concurrency.lockutils [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.586 2 DEBUG oslo_concurrency.lockutils [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.586 2 DEBUG oslo_concurrency.lockutils [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.587 2 DEBUG nova.compute.manager [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] No waiting events found dispatching network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:14 np0005476733 nova_compute[192580]: 2025-10-08 15:29:14.587 2 WARNING nova.compute.manager [req-216b4c4f-15b2-473c-a0cb-74517e03e3cb req-e3c76687-9b3e-4c4b-9421-edb91a4158c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Received unexpected event network-vif-plugged-902f5462-63af-4928-a517-b67d158bf2c2 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:29:15 np0005476733 podman[230393]: 2025-10-08 15:29:15.267516799 +0000 UTC m=+0.076787464 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:29:15 np0005476733 podman[230392]: 2025-10-08 15:29:15.289175711 +0000 UTC m=+0.100629256 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd)
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.556 2 DEBUG nova.network.neutron [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.579 2 INFO nova.compute.manager [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Took 3.16 seconds to deallocate network for instance.#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.631 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.631 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.759 2 DEBUG nova.compute.provider_tree [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.780 2 DEBUG nova.scheduler.client.report [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.818 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.888 2 INFO nova.scheduler.client.report [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Deleted allocations for instance 020f3abc-b9cd-43d6-81f9-4464a8d20207#033[00m
Oct  8 11:29:15 np0005476733 nova_compute[192580]: 2025-10-08 15:29:15.959 2 DEBUG oslo_concurrency.lockutils [None req-b9b67a45-eb29-4ba9-97d5-03b5de919019 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "020f3abc-b9cd-43d6-81f9-4464a8d20207" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.384 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.385 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.386 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.386 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.386 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.388 2 INFO nova.compute.manager [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Terminating instance#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.389 2 DEBUG nova.compute.manager [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:29:17 np0005476733 kernel: tap0bb60f77-cd (unregistering): left promiscuous mode
Oct  8 11:29:17 np0005476733 NetworkManager[51699]: <info>  [1759937357.4109] device (tap0bb60f77-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:17Z|00330|binding|INFO|Releasing lport 0bb60f77-cd96-4dfd-9810-5583ec966cb2 from this chassis (sb_readonly=0)
Oct  8 11:29:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:17Z|00331|binding|INFO|Setting lport 0bb60f77-cd96-4dfd-9810-5583ec966cb2 down in Southbound
Oct  8 11:29:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:17Z|00332|binding|INFO|Removing iface tap0bb60f77-cd ovn-installed in OVS
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:17.441 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:aa:3b 10.100.0.8'], port_security=['fa:16:3e:a6:aa:3b 10.100.0.8 192.168.111.11/24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1a3ae685-bd3d-4f36-ad77-9f5b6b95677f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1086961263487db8a3c5190fdf1b2e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78a6a465-5b3b-43e0-8a00-63e5875c77b4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=538f4b4e-d2f6-4df4-8e2a-7fc02c73fc5a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=0bb60f77-cd96-4dfd-9810-5583ec966cb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:17.443 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 0bb60f77-cd96-4dfd-9810-5583ec966cb2 in datapath 7a77f8cd-4394-4cb0-a8a1-33872549758a unbound from our chassis#033[00m
Oct  8 11:29:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:17.447 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a77f8cd-4394-4cb0-a8a1-33872549758a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:29:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:17.449 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[88d85dec-48bc-4dce-9619-617bdd6bb337]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:17.449 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a namespace which is not needed anymore#033[00m
Oct  8 11:29:17 np0005476733 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct  8 11:29:17 np0005476733 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Consumed 51.568s CPU time.
Oct  8 11:29:17 np0005476733 systemd-machined[152624]: Machine qemu-13-instance-00000014 terminated.
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.662 2 INFO nova.virt.libvirt.driver [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Instance destroyed successfully.#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.663 2 DEBUG nova.objects.instance [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lazy-loading 'resources' on Instance uuid 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.685 2 DEBUG nova.virt.libvirt.vif [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-broadcast-receiver-123-1908290520',display_name='tempest-broadcast-receiver-123-1908290520',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-broadcast-receiver-123-1908290520',id=20,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGBqxlO9VuM0Qq/DWr14YnhGxOxwcqegm/N2XcRSLA8NJfb1K0EfLGDHkMQul32EUhmJshL5J7ZH56Voxwq765dL8/B4SFbezZWy3ydp4mAt0951qcEHggiOu5J3JaZbOg==',key_name='tempest-keypair-test-1882494757',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:23:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e1086961263487db8a3c5190fdf1b2e',ramdisk_id='',reservation_id='r-52vk5sn9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-BroadcastTestVlanTransparency-538458942',owner_user_name='tempest-BroadcastTestVlanTransparency-538458942-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:23:16Z,user_data=None,user_id='843ea0278e174175a6f8e21731c1383e',uuid=1a3ae685-bd3d-4f36-ad77-9f5b6b95677f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": 
"tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.686 2 DEBUG nova.network.os_vif_util [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converting VIF {"id": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "address": "fa:16:3e:a6:aa:3b", "network": {"id": "7a77f8cd-4394-4cb0-a8a1-33872549758a", "bridge": "br-int", "label": "tempest-test-network--933303718", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e1086961263487db8a3c5190fdf1b2e", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0bb60f77-cd", "ovs_interfaceid": "0bb60f77-cd96-4dfd-9810-5583ec966cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.686 2 DEBUG nova.network.os_vif_util [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.687 2 DEBUG os_vif [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.689 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0bb60f77-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.697 2 INFO os_vif [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=0bb60f77-cd96-4dfd-9810-5583ec966cb2,network=Network(7a77f8cd-4394-4cb0-a8a1-33872549758a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0bb60f77-cd')#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.698 2 INFO nova.virt.libvirt.driver [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Deleting instance files /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f_del#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.698 2 INFO nova.virt.libvirt.driver [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Deletion of /var/lib/nova/instances/1a3ae685-bd3d-4f36-ad77-9f5b6b95677f_del complete#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.765 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.766 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:17 np0005476733 nova_compute[192580]: 2025-10-08 15:29:17.796 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [NOTICE]   (225501) : haproxy version is 2.8.14-c23fe91
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [NOTICE]   (225501) : path to executable is /usr/sbin/haproxy
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [WARNING]  (225501) : Exiting Master process...
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [WARNING]  (225501) : Exiting Master process...
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [ALERT]    (225501) : Current worker (225503) exited with code 143 (Terminated)
Oct  8 11:29:17 np0005476733 neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a[225497]: [WARNING]  (225501) : All workers exited. Exiting... (0)
Oct  8 11:29:17 np0005476733 systemd[1]: libpod-9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366.scope: Deactivated successfully.
Oct  8 11:29:17 np0005476733 podman[230459]: 2025-10-08 15:29:17.852963622 +0000 UTC m=+0.283223760 container died 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.005 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.006 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.010 2 INFO nova.compute.manager [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Took 0.62 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.011 2 DEBUG oslo.service.loopingcall [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.011 2 DEBUG nova.compute.manager [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.011 2 DEBUG nova.network.neutron [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.035 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:29:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366-userdata-shm.mount: Deactivated successfully.
Oct  8 11:29:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-548964f02231e33d3158a6429259dcf0919199dd66efcea55fac07f198cd4b58-merged.mount: Deactivated successfully.
Oct  8 11:29:18 np0005476733 podman[230459]: 2025-10-08 15:29:18.106040517 +0000 UTC m=+0.536300655 container cleanup 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:29:18 np0005476733 systemd[1]: libpod-conmon-9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366.scope: Deactivated successfully.
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.296 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.297 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.306 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.307 2 INFO nova.compute.claims [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:29:18 np0005476733 podman[230502]: 2025-10-08 15:29:18.335787147 +0000 UTC m=+0.192431608 container remove 9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.344 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e707d9cd-34de-4561-b041-e1cac41f6c9c]: (4, ('Wed Oct  8 03:29:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a (9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366)\n9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366\nWed Oct  8 03:29:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a (9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366)\n9cc9ad7d6f3f073545d8485061fa8b036ac5fc2618d84b4a5cbed171d9617366\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.346 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[82729e1a-6a08-4e50-b2b9-78080f63b1ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.348 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a77f8cd-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:18 np0005476733 kernel: tap7a77f8cd-40: left promiscuous mode
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.354 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.357 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[67bcf936-4344-4540-ae0b-f114a970713c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.388 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b2082580-be37-4fac-ae85-c77cb4b0607d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.390 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fd43440b-6f9d-43f9-b0f8-98be6868dc90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.411 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1da32c32-7099-4756-9936-1920f3d11e4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389904, 'reachable_time': 20134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230515, 'error': None, 'target': 'ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.417 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a77f8cd-4394-4cb0-a8a1-33872549758a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:18.417 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c80d234b-9035-4eca-9de0-62f8f12fe706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:18 np0005476733 systemd[1]: run-netns-ovnmeta\x2d7a77f8cd\x2d4394\x2d4cb0\x2da8a1\x2d33872549758a.mount: Deactivated successfully.
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.527 2 DEBUG nova.compute.provider_tree [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.555 2 DEBUG nova.scheduler.client.report [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.592 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.593 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.597 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.608 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.608 2 INFO nova.compute.claims [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.704 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.704 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.741 2 INFO nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.794 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.904 2 DEBUG nova.compute.provider_tree [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.938 2 DEBUG nova.scheduler.client.report [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.949 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.951 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.952 2 INFO nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Creating image(s)#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.953 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.953 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.955 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.982 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.983 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:29:18 np0005476733 nova_compute[192580]: 2025-10-08 15:29:18.987 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.043 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.044 2 DEBUG nova.network.neutron [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.056 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.057 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.058 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.077 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.101 2 INFO nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.133 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.144 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.145 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.266 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.267 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.268 2 INFO nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Creating image(s)#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.268 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.269 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.269 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.282 2 DEBUG nova.policy [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.285 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.343 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.344 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.358 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk 10737418240" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.358 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.359 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.377 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.393 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.431 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.432 2 DEBUG nova.objects.instance [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid 27fa9a5a-04a0-4d80-b75d-564df1c974e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.449 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.450 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Ensure instance console log exists: /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.450 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.451 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.451 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.472 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.473 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.510 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk 10737418240" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.511 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.512 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.588 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.589 2 DEBUG nova.objects.instance [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lazy-loading 'migration_context' on Instance uuid a7cf9795-ac6e-4d38-8500-755c39931e14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.628 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.629 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Ensure instance console log exists: /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.630 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.630 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.631 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:19 np0005476733 nova_compute[192580]: 2025-10-08 15:29:19.945 2 DEBUG nova.policy [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c852472017334735b37425ffa8591384', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.451 2 DEBUG nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-vif-unplugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.451 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.451 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] No waiting events found dispatching network-vif-unplugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-vif-unplugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.452 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.453 2 DEBUG oslo_concurrency.lockutils [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.453 2 DEBUG nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] No waiting events found dispatching network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.453 2 WARNING nova.compute.manager [req-67d8dd8f-9bc6-44bb-9f20-3ffe1c20163f req-a2a077fe-4c58-40c4-81e8-a96d312e93f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Received unexpected event network-vif-plugged-0bb60f77-cd96-4dfd-9810-5583ec966cb2 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:29:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:20Z|00333|pinctrl|WARN|Dropped 7181 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 11:29:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:20Z|00334|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.952 2 DEBUG nova.network.neutron [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:20 np0005476733 nova_compute[192580]: 2025-10-08 15:29:20.975 2 INFO nova.compute.manager [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Took 2.96 seconds to deallocate network for instance.#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.044 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.045 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:21.075 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:21.076 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.224 2 DEBUG nova.compute.provider_tree [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.239 2 DEBUG nova.scheduler.client.report [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.274 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.299 2 INFO nova.scheduler.client.report [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Deleted allocations for instance 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.368 2 DEBUG oslo_concurrency.lockutils [None req-d0e87eaf-54b1-4dc7-89d5-140b4b456a90 843ea0278e174175a6f8e21731c1383e 7e1086961263487db8a3c5190fdf1b2e - - default default] Lock "1a3ae685-bd3d-4f36-ad77-9f5b6b95677f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.375 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Successfully created port: 23f6a943-ce2f-4958-a0c6-73f789517892 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.669 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937346.6683388, 1cbc4434-d89a-483d-a1f2-299190262888 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.669 2 INFO nova.compute.manager [-] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.703 2 DEBUG nova.compute.manager [None req-2598f450-2e36-430e-98ec-331d8eafa6b7 - - - - - -] [instance: 1cbc4434-d89a-483d-a1f2-299190262888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.747 2 DEBUG nova.network.neutron [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Successfully updated port: 66f32729-1d2a-44d6-b604-29e4c751f95c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.766 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.766 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquired lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:29:21 np0005476733 nova_compute[192580]: 2025-10-08 15:29:21.766 2 DEBUG nova.network.neutron [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:29:22 np0005476733 nova_compute[192580]: 2025-10-08 15:29:22.031 2 DEBUG nova.network.neutron [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:29:22 np0005476733 nova_compute[192580]: 2025-10-08 15:29:22.583 2 DEBUG nova.compute.manager [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-changed-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:22 np0005476733 nova_compute[192580]: 2025-10-08 15:29:22.584 2 DEBUG nova.compute.manager [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Refreshing instance network info cache due to event network-changed-66f32729-1d2a-44d6-b604-29e4c751f95c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:29:22 np0005476733 nova_compute[192580]: 2025-10-08 15:29:22.584 2 DEBUG oslo_concurrency.lockutils [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:29:22 np0005476733 nova_compute[192580]: 2025-10-08 15:29:22.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:23.080 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.595 2 DEBUG nova.network.neutron [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updating instance_info_cache with network_info: [{"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.623 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Releasing lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.624 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Instance network_info: |[{"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.624 2 DEBUG oslo_concurrency.lockutils [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.625 2 DEBUG nova.network.neutron [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Refreshing network info cache for port 66f32729-1d2a-44d6-b604-29e4c751f95c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.628 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Start _get_guest_xml network_info=[{"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.632 2 WARNING nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.638 2 DEBUG nova.virt.libvirt.host [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.639 2 DEBUG nova.virt.libvirt.host [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.645 2 DEBUG nova.virt.libvirt.host [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.645 2 DEBUG nova.virt.libvirt.host [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.646 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.646 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.647 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.647 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.647 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.648 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.648 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.648 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.648 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.649 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.649 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.649 2 DEBUG nova.virt.hardware [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.652 2 DEBUG nova.virt.libvirt.vif [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiVlanTransparencyTest-1225694051-0',display_name='server-tempest-MultiVlanTransparencyTest-1225694051-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multivlantransparencytest-1225694051-0',id=40,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvujo5b87uOVbg3RpxHXnqbrVt2ySGYzFVUknwtVv2YR6AYy5RSYfPq4hh/P68Iq/ARCEc1PMbDU99yQi39bUYIlrvmMeOEw4FT/HN0a6mEQB3qyjgogOJ/vPLLZ3a+kQ==',key_name='tempest-MultiVlanTransparencyTest-1225694051',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4f21e712eb24213a38bc89e2b2f44b3',ramdisk_id='',reservation_id='r-3f98e63s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiVlanTransparencyTest-48410347',owner_user_name='tempest-MultiVlanTransparencyTest-48410347-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:19Z,user_data=None,user_id='c852472017334735b37425ffa8591384',uuid=a7cf9795-ac6e-4d38-8500-755c39931e14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.653 2 DEBUG nova.network.os_vif_util [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converting VIF {"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.653 2 DEBUG nova.network.os_vif_util [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.654 2 DEBUG nova.objects.instance [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7cf9795-ac6e-4d38-8500-755c39931e14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.666 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <uuid>a7cf9795-ac6e-4d38-8500-755c39931e14</uuid>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <name>instance-00000028</name>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:name>server-tempest-MultiVlanTransparencyTest-1225694051-0</nova:name>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:29:23</nova:creationTime>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:user uuid="c852472017334735b37425ffa8591384">tempest-MultiVlanTransparencyTest-48410347-project-member</nova:user>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:project uuid="f4f21e712eb24213a38bc89e2b2f44b3">tempest-MultiVlanTransparencyTest-48410347</nova:project>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        <nova:port uuid="66f32729-1d2a-44d6-b604-29e4c751f95c">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="serial">a7cf9795-ac6e-4d38-8500-755c39931e14</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="uuid">a7cf9795-ac6e-4d38-8500-755c39931e14</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.config"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:22:ff:69"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <target dev="tap66f32729-1d"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/console.log" append="off"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:29:23 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:29:23 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:29:23 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:29:23 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.668 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Preparing to wait for external event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.668 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.668 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.668 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.669 2 DEBUG nova.virt.libvirt.vif [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiVlanTransparencyTest-1225694051-0',display_name='server-tempest-MultiVlanTransparencyTest-1225694051-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multivlantransparencytest-1225694051-0',id=40,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvujo5b87uOVbg3RpxHXnqbrVt2ySGYzFVUknwtVv2YR6AYy5RSYfPq4hh/P68Iq/ARCEc1PMbDU99yQi39bUYIlrvmMeOEw4FT/HN0a6mEQB3qyjgogOJ/vPLLZ3a+kQ==',key_name='tempest-MultiVlanTransparencyTest-1225694051',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4f21e712eb24213a38bc89e2b2f44b3',ramdisk_id='',reservation_id='r-3f98e63s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiVlanTransparencyTest-48410347',owner_user_name='tempest-MultiVlanTransparencyTest-48410347-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:19Z,user_data=None,user_id='c852472017334735b37425ffa8591384',uuid=a7cf9795-ac6e-4d38-8500-755c39931e14,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.669 2 DEBUG nova.network.os_vif_util [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converting VIF {"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.670 2 DEBUG nova.network.os_vif_util [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.670 2 DEBUG os_vif [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.671 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.674 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66f32729-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.675 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66f32729-1d, col_values=(('external_ids', {'iface-id': '66f32729-1d2a-44d6-b604-29e4c751f95c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:ff:69', 'vm-uuid': 'a7cf9795-ac6e-4d38-8500-755c39931e14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:23 np0005476733 NetworkManager[51699]: <info>  [1759937363.6789] manager: (tap66f32729-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.685 2 INFO os_vif [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d')#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.762 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.762 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.762 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] No VIF found with MAC fa:16:3e:22:ff:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.763 2 INFO nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Using config drive#033[00m
Oct  8 11:29:23 np0005476733 podman[230544]: 2025-10-08 15:29:23.811405369 +0000 UTC m=+0.080361169 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:29:23 np0005476733 podman[230543]: 2025-10-08 15:29:23.811631556 +0000 UTC m=+0.077830988 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 11:29:23 np0005476733 nova_compute[192580]: 2025-10-08 15:29:23.988 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Successfully updated port: 23f6a943-ce2f-4958-a0c6-73f789517892 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.013 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.013 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.014 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.039 2 INFO nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Creating config drive at /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.config#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.048 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpte6v80td execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.145 2 DEBUG nova.compute.manager [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-changed-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.146 2 DEBUG nova.compute.manager [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Refreshing instance network info cache due to event network-changed-23f6a943-ce2f-4958-a0c6-73f789517892. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.146 2 DEBUG oslo_concurrency.lockutils [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.184 2 DEBUG oslo_concurrency.processutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpte6v80td" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:24 np0005476733 kernel: tap66f32729-1d: entered promiscuous mode
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.2455] manager: (tap66f32729-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00335|binding|INFO|Claiming lport 66f32729-1d2a-44d6-b604-29e4c751f95c for this chassis.
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00336|binding|INFO|66f32729-1d2a-44d6-b604-29e4c751f95c: Claiming fa:16:3e:22:ff:69 10.100.0.13
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00337|binding|INFO|66f32729-1d2a-44d6-b604-29e4c751f95c: Claiming unknown
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00338|binding|INFO|Setting lport 66f32729-1d2a-44d6-b604-29e4c751f95c ovn-installed in OVS
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00339|binding|INFO|Setting lport 66f32729-1d2a-44d6-b604-29e4c751f95c up in Southbound
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.272 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ff:69 10.100.0.13', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'port-tempest-MultiVlanTransparencyTest-1225694051-0', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-858b993e-0613-4d63-983c-94fe95ccca9d', 'neutron:port_capabilities': '', 'neutron:port_name': 'port-tempest-MultiVlanTransparencyTest-1225694051-0', 'neutron:project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a08c4d-6969-4ca1-af4c-9d4107c6eb62, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=66f32729-1d2a-44d6-b604-29e4c751f95c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.273 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 66f32729-1d2a-44d6-b604-29e4c751f95c in datapath 858b993e-0613-4d63-983c-94fe95ccca9d bound to our chassis#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.275 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 858b993e-0613-4d63-983c-94fe95ccca9d#033[00m
Oct  8 11:29:24 np0005476733 systemd-udevd[230602]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:29:24 np0005476733 systemd-machined[152624]: New machine qemu-24-instance-00000028.
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.287 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[afd00f8f-511a-478d-8fa4-82c3a86a85da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.288 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap858b993e-01 in ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.290 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap858b993e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.290 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[45343a4e-99dd-4bf9-8ed8-23197a16c27f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.291 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e73c05d9-cccd-4afd-94f4-e00cd3f5dae1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.2942] device (tap66f32729-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.2957] device (tap66f32729-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:29:24 np0005476733 systemd[1]: Started Virtual Machine qemu-24-instance-00000028.
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.304 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3f21313b-a8fa-4aae-b376-a3adc28bc4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.320 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd4436d-c261-4cc1-b110-44d0720c0e69]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.345 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0e900c30-e06c-4cce-af47-c1ea9c2cafbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 systemd-udevd[230605]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.3540] manager: (tap858b993e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.353 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[22b7fc7e-7b04-4fd5-b76e-0a1bc062ea56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.397 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2eb0eb-3b27-432c-9f34-b551cda0dbd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.398 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.401 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[41e89f9a-c13b-460f-89cc-9e8451e12ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.4246] device (tap858b993e-00): carrier: link connected
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.430 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a1467962-3fbe-4016-9d3c-f45d2b5f187a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.452 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b91fc8bf-a76f-4c9d-bb50-b7bf796bc2dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap858b993e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426808, 'reachable_time': 33175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230635, 'error': None, 'target': 'ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.470 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2b1c880a-142b-4667-b700-0cb9739a5abc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:3b31'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426808, 'tstamp': 426808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230636, 'error': None, 'target': 'ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.498 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[12d2050e-e5e5-4a38-bf5a-1be35e4f78b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap858b993e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3b:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426808, 'reachable_time': 33175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230637, 'error': None, 'target': 'ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.545 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5bca8983-9dbe-4956-9b6d-57717e0b665c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.620 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9e3e55-b396-4142-95fd-15ce1bf2ba42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.622 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap858b993e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.623 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.623 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap858b993e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:24 np0005476733 NetworkManager[51699]: <info>  [1759937364.6300] manager: (tap858b993e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Oct  8 11:29:24 np0005476733 kernel: tap858b993e-00: entered promiscuous mode
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.634 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap858b993e-00, col_values=(('external_ids', {'iface-id': 'bbd16b0e-af1f-427d-8500-724401e2ed53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:24Z|00340|binding|INFO|Releasing lport bbd16b0e-af1f-427d-8500-724401e2ed53 from this chassis (sb_readonly=0)
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.651 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/858b993e-0613-4d63-983c-94fe95ccca9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/858b993e-0613-4d63-983c-94fe95ccca9d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.652 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5acfa045-d708-4424-a3a2-ab5665323f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.654 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-858b993e-0613-4d63-983c-94fe95ccca9d
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/858b993e-0613-4d63-983c-94fe95ccca9d.pid.haproxy
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 858b993e-0613-4d63-983c-94fe95ccca9d
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:29:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:24.655 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d', 'env', 'PROCESS_TAG=haproxy-858b993e-0613-4d63-983c-94fe95ccca9d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/858b993e-0613-4d63-983c-94fe95ccca9d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:29:24 np0005476733 nova_compute[192580]: 2025-10-08 15:29:24.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:25 np0005476733 podman[230669]: 2025-10-08 15:29:24.993281259 +0000 UTC m=+0.031425856 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:29:25 np0005476733 podman[230669]: 2025-10-08 15:29:25.174406075 +0000 UTC m=+0.212550572 container create 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:29:25 np0005476733 systemd[1]: Started libpod-conmon-8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61.scope.
Oct  8 11:29:25 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:29:25 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd17042d803c6aabb4070bfc0b7789bb09aea40c02adf4bc92f076acc0e395fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.549 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937365.5490444, a7cf9795-ac6e-4d38-8500-755c39931e14 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.551 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] VM Started (Lifecycle Event)#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.594 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.599 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937365.5491867, a7cf9795-ac6e-4d38-8500-755c39931e14 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.599 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.621 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.625 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:29:25 np0005476733 podman[230669]: 2025-10-08 15:29:25.637299235 +0000 UTC m=+0.675443752 container init 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 11:29:25 np0005476733 podman[230669]: 2025-10-08 15:29:25.644279858 +0000 UTC m=+0.682424365 container start 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.647 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:29:25 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [NOTICE]   (230695) : New worker (230697) forked
Oct  8 11:29:25 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [NOTICE]   (230695) : Loading success.
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.925 2 DEBUG nova.network.neutron [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updated VIF entry in instance network info cache for port 66f32729-1d2a-44d6-b604-29e4c751f95c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.926 2 DEBUG nova.network.neutron [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updating instance_info_cache with network_info: [{"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:25 np0005476733 nova_compute[192580]: 2025-10-08 15:29:25.953 2 DEBUG oslo_concurrency.lockutils [req-c74184f7-915a-42f2-b10d-148322c30212 req-38b5f5cb-da01-4952-8873-42fe11d0be7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.296 2 DEBUG nova.compute.manager [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.296 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.297 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.297 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.297 2 DEBUG nova.compute.manager [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Processing event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.298 2 DEBUG nova.compute.manager [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.298 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.298 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.299 2 DEBUG oslo_concurrency.lockutils [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.299 2 DEBUG nova.compute.manager [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] No waiting events found dispatching network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.299 2 WARNING nova.compute.manager [req-f7691328-a5b7-42ac-b8f3-cce7b45a5674 req-612a17da-8e39-4a19-80c6-22e2d30dc940 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received unexpected event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.300 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.305 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937366.3043354, a7cf9795-ac6e-4d38-8500-755c39931e14 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.305 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.307 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.310 2 INFO nova.virt.libvirt.driver [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Instance spawned successfully.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.311 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:26.312 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:26.313 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:26.315 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.345 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.351 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.355 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.356 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.356 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.357 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.357 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.358 2 DEBUG nova.virt.libvirt.driver [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.409 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.450 2 INFO nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Took 7.18 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.450 2 DEBUG nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.521 2 INFO nova.compute.manager [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Took 8.24 seconds to build instance.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.566 2 DEBUG oslo_concurrency.lockutils [None req-4e258ae1-087c-4b7b-ab26-00767fda0b5c c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.623 2 DEBUG nova.network.neutron [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updating instance_info_cache with network_info: [{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.654 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.654 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Instance network_info: |[{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.654 2 DEBUG oslo_concurrency.lockutils [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.655 2 DEBUG nova.network.neutron [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Refreshing network info cache for port 23f6a943-ce2f-4958-a0c6-73f789517892 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.658 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Start _get_guest_xml network_info=[{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.663 2 WARNING nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.667 2 DEBUG nova.virt.libvirt.host [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.667 2 DEBUG nova.virt.libvirt.host [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.671 2 DEBUG nova.virt.libvirt.host [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.671 2 DEBUG nova.virt.libvirt.host [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.672 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.672 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.672 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.673 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.673 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.673 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.673 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.674 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.674 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.674 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.675 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.675 2 DEBUG nova.virt.hardware [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.679 2 DEBUG nova.virt.libvirt.vif [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_east_west-1699367735',display_name='tempest-test_multicast_east_west-1699367735',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-east-west-1699367735',id=39,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-l53p72t9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:18Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=27fa9a5a-04a0-4d80-b75d-564df1c974e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.679 2 DEBUG nova.network.os_vif_util [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.680 2 DEBUG nova.network.os_vif_util [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.681 2 DEBUG nova.objects.instance [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid 27fa9a5a-04a0-4d80-b75d-564df1c974e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.694 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <uuid>27fa9a5a-04a0-4d80-b75d-564df1c974e8</uuid>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <name>instance-00000027</name>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_multicast_east_west-1699367735</nova:name>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:29:26</nova:creationTime>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        <nova:port uuid="23f6a943-ce2f-4958-a0c6-73f789517892">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="serial">27fa9a5a-04a0-4d80-b75d-564df1c974e8</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="uuid">27fa9a5a-04a0-4d80-b75d-564df1c974e8</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.config"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:38:f6:e1"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <target dev="tap23f6a943-ce"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/console.log" append="off"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:29:26 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:29:26 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:29:26 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:29:26 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.695 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Preparing to wait for external event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.695 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.696 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.696 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.696 2 DEBUG nova.virt.libvirt.vif [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_east_west-1699367735',display_name='tempest-test_multicast_east_west-1699367735',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-east-west-1699367735',id=39,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-l53p72t9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:18Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=27fa9a5a-04a0-4d80-b75d-564df1c974e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.697 2 DEBUG nova.network.os_vif_util [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.697 2 DEBUG nova.network.os_vif_util [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.698 2 DEBUG os_vif [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.699 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.699 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.702 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23f6a943-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23f6a943-ce, col_values=(('external_ids', {'iface-id': '23f6a943-ce2f-4958-a0c6-73f789517892', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:f6:e1', 'vm-uuid': '27fa9a5a-04a0-4d80-b75d-564df1c974e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:26 np0005476733 NetworkManager[51699]: <info>  [1759937366.7061] manager: (tap23f6a943-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.716 2 INFO os_vif [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce')#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.838 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.839 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.840 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:38:f6:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:29:26 np0005476733 nova_compute[192580]: 2025-10-08 15:29:26.841 2 INFO nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Using config drive#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.331 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937352.3305788, 020f3abc-b9cd-43d6-81f9-4464a8d20207 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.331 2 INFO nova.compute.manager [-] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.351 2 DEBUG nova.compute.manager [None req-98145971-1514-4ae3-a786-f230cad81f7b - - - - - -] [instance: 020f3abc-b9cd-43d6-81f9-4464a8d20207] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.499 2 INFO nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Creating config drive at /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.config#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.508 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmku3jvql execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.655 2 DEBUG oslo_concurrency.processutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmku3jvql" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:27 np0005476733 kernel: tap23f6a943-ce: entered promiscuous mode
Oct  8 11:29:27 np0005476733 NetworkManager[51699]: <info>  [1759937367.7380] manager: (tap23f6a943-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Oct  8 11:29:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:27Z|00341|binding|INFO|Claiming lport 23f6a943-ce2f-4958-a0c6-73f789517892 for this chassis.
Oct  8 11:29:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:27Z|00342|binding|INFO|23f6a943-ce2f-4958-a0c6-73f789517892: Claiming fa:16:3e:38:f6:e1 10.100.0.13
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.755 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:f6:e1 10.100.0.13'], port_security=['fa:16:3e:38:f6:e1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=23f6a943-ce2f-4958-a0c6-73f789517892) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.756 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 23f6a943-ce2f-4958-a0c6-73f789517892 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.758 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.770 2 INFO nova.compute.manager [None req-48cfcee5-89b0-42a3-97d6-c6e72c881b90 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:27Z|00343|binding|INFO|Setting lport 23f6a943-ce2f-4958-a0c6-73f789517892 ovn-installed in OVS
Oct  8 11:29:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:27Z|00344|binding|INFO|Setting lport 23f6a943-ce2f-4958-a0c6-73f789517892 up in Southbound
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.778 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[71e94bb1-25e3-46b3-9e98-44f58cd44511]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.779 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap30cdfb1e-71 in ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.787 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap30cdfb1e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.787 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee24428-d19e-41be-9ae6-ac393825a949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.789 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a6021f58-30e9-44dc-ad4a-cd19f16d8f95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 nova_compute[192580]: 2025-10-08 15:29:27.794 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:27 np0005476733 systemd-udevd[230728]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.810 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3395907f-023a-4fb8-8da7-2b82e6d5959d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 NetworkManager[51699]: <info>  [1759937367.8319] device (tap23f6a943-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:29:27 np0005476733 systemd-machined[152624]: New machine qemu-25-instance-00000027.
Oct  8 11:29:27 np0005476733 NetworkManager[51699]: <info>  [1759937367.8363] device (tap23f6a943-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.842 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d6965b91-531a-4818-a8e7-4e5521fbde7c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 systemd[1]: Started Virtual Machine qemu-25-instance-00000027.
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.873 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c42ca02b-f41c-478e-8a34-4744f299c3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 NetworkManager[51699]: <info>  [1759937367.8842] manager: (tap30cdfb1e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.885 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[353db351-0845-4684-9022-f3bf09e6a5d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.923 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b2fd1626-9f7b-4a2d-993a-e9f3db950cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.929 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[200c7b03-3caa-498a-aae5-6d7e903628e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 NetworkManager[51699]: <info>  [1759937367.9536] device (tap30cdfb1e-70): carrier: link connected
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.959 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5afb3018-3291-40fb-85c0-4aebc7a1aa81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:27.979 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7a703c-7add-4ec4-8ece-0edd5c9ddf4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427161, 'reachable_time': 16966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230760, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.002 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfb2067-e266-4cbf-a899-c535b93bdd63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:3ea4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 427161, 'tstamp': 427161}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230761, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.022 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf544bdc-989c-4b42-ad71-c579641daf6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427161, 'reachable_time': 16966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230762, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.052 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4031f1-79f2-4221-92bd-c06a83ffd67c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.122 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[01ed8021-d713-4cdc-afc6-511da937c397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.123 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.124 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.124 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:28 np0005476733 kernel: tap30cdfb1e-70: entered promiscuous mode
Oct  8 11:29:28 np0005476733 NetworkManager[51699]: <info>  [1759937368.1272] manager: (tap30cdfb1e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.129 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:28Z|00345|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.149 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.150 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d9321972-23de-4cb2-aea5-9296c2512ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.151 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:29:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:28.153 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'env', 'PROCESS_TAG=haproxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.348 2 DEBUG nova.network.neutron [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updated VIF entry in instance network info cache for port 23f6a943-ce2f-4958-a0c6-73f789517892. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.349 2 DEBUG nova.network.neutron [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updating instance_info_cache with network_info: [{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.418 2 DEBUG oslo_concurrency.lockutils [req-ca1a885f-8fa0-434d-9e70-8163c6d956d7 req-3b70f10a-d16b-4840-9abb-e45f231e6295 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:29:28 np0005476733 podman[230798]: 2025-10-08 15:29:28.575308022 +0000 UTC m=+0.099867043 container create 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:29:28 np0005476733 podman[230798]: 2025-10-08 15:29:28.499454118 +0000 UTC m=+0.024013149 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:29:28 np0005476733 systemd[1]: Started libpod-conmon-13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803.scope.
Oct  8 11:29:28 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:29:28 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8472d2d34892162c03e9b5e37746390044b6e365ede262b47cf39b7b04ac757/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:29:28 np0005476733 podman[230798]: 2025-10-08 15:29:28.684981505 +0000 UTC m=+0.209540526 container init 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:29:28 np0005476733 podman[230798]: 2025-10-08 15:29:28.691654238 +0000 UTC m=+0.216213239 container start 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:29:28 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [NOTICE]   (230816) : New worker (230818) forked
Oct  8 11:29:28 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [NOTICE]   (230816) : Loading success.
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.901 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937368.9006007, 27fa9a5a-04a0-4d80-b75d-564df1c974e8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.901 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] VM Started (Lifecycle Event)#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.929 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.932 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937368.900823, 27fa9a5a-04a0-4d80-b75d-564df1c974e8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.932 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.951 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.955 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:29:28 np0005476733 nova_compute[192580]: 2025-10-08 15:29:28.995 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.017 2 DEBUG nova.compute.manager [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.017 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.018 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.018 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.019 2 DEBUG nova.compute.manager [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Processing event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.020 2 DEBUG nova.compute.manager [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.020 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.020 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.021 2 DEBUG oslo_concurrency.lockutils [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.021 2 DEBUG nova.compute.manager [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] No waiting events found dispatching network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.022 2 WARNING nova.compute.manager [req-c5606ae8-db86-4bad-8db1-c615b33a54dc req-3442e41e-130c-4be0-95a0-9fc4446159e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received unexpected event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.023 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.029 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937369.0286765, 27fa9a5a-04a0-4d80-b75d-564df1c974e8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.029 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.032 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.036 2 INFO nova.virt.libvirt.driver [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Instance spawned successfully.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.037 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.056 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.065 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.070 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.071 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.071 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.072 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.073 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.074 2 DEBUG nova.virt.libvirt.driver [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.088 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.150 2 INFO nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Took 10.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.151 2 DEBUG nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.222 2 INFO nova.compute.manager [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Took 10.96 seconds to build instance.#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.243 2 DEBUG oslo_concurrency.lockutils [None req-61b5c5b5-61af-400c-b1d7-8c98dd31cbe7 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:29 np0005476733 nova_compute[192580]: 2025-10-08 15:29:29.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:31 np0005476733 nova_compute[192580]: 2025-10-08 15:29:31.470 2 INFO nova.compute.manager [None req-450d595a-98ee-4c10-aae0-5b9b16c31f48 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:29:31 np0005476733 nova_compute[192580]: 2025-10-08 15:29:31.475 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:31 np0005476733 nova_compute[192580]: 2025-10-08 15:29:31.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:32 np0005476733 nova_compute[192580]: 2025-10-08 15:29:32.661 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937357.6602826, 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:29:32 np0005476733 nova_compute[192580]: 2025-10-08 15:29:32.662 2 INFO nova.compute.manager [-] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:29:32 np0005476733 nova_compute[192580]: 2025-10-08 15:29:32.693 2 DEBUG nova.compute.manager [None req-df9a7a55-b5e9-478e-b43f-d306a9425a71 - - - - - -] [instance: 1a3ae685-bd3d-4f36-ad77-9f5b6b95677f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:29:32 np0005476733 nova_compute[192580]: 2025-10-08 15:29:32.978 2 INFO nova.compute.manager [None req-5a467280-57bc-4ff0-af71-d0364b93c477 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:32 np0005476733 nova_compute[192580]: 2025-10-08 15:29:32.982 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:33 np0005476733 podman[230827]: 2025-10-08 15:29:33.266990437 +0000 UTC m=+0.088441058 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:29:34 np0005476733 nova_compute[192580]: 2025-10-08 15:29:34.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:36 np0005476733 podman[230847]: 2025-10-08 15:29:36.245131666 +0000 UTC m=+0.058471669 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:29:36 np0005476733 podman[230846]: 2025-10-08 15:29:36.275834416 +0000 UTC m=+0.101458072 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:29:36 np0005476733 nova_compute[192580]: 2025-10-08 15:29:36.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:36 np0005476733 nova_compute[192580]: 2025-10-08 15:29:36.634 2 INFO nova.compute.manager [None req-99c21be9-1349-4fc0-80e0-0eaafcd1414a c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:29:36 np0005476733 nova_compute[192580]: 2025-10-08 15:29:36.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:38 np0005476733 nova_compute[192580]: 2025-10-08 15:29:38.231 2 INFO nova.compute.manager [None req-388ae8c1-d36f-4c27-971a-e6ad27143341 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:38 np0005476733 nova_compute[192580]: 2025-10-08 15:29:38.235 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:39 np0005476733 nova_compute[192580]: 2025-10-08 15:29:39.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:39.390 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:d2:77 192.168.5.2 2001:5::f816:3eff:fe3e:d277'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.5.2/24 2001:5::f816:3eff:fe3e:d277/64', 'neutron:device_id': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5057352-eab1-4ec2-8137-06eaee60ec6e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1e3f1387-ae31-4ad9-bc6d-8f39af22638d) old=Port_Binding(mac=['fa:16:3e:3e:d2:77 192.168.5.2'], external_ids={'neutron:cidrs': '192.168.5.2/24', 'neutron:device_id': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:39.392 103739 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1e3f1387-ae31-4ad9-bc6d-8f39af22638d in datapath f872d065-dcdd-4abe-966e-984ec8347cf7 updated#033[00m
Oct  8 11:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:39.396 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f872d065-dcdd-4abe-966e-984ec8347cf7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:39.398 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb9f06f-1a45-46da-b63c-0a8886e0bf18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.653 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.653 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.654 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.654 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.808 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.875 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.875 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.898 2 INFO nova.compute.manager [None req-6e26e76a-a932-458d-88e2-97ee60b3a602 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.903 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.963 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:41 np0005476733 nova_compute[192580]: 2025-10-08 15:29:41.968 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.024 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.025 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.107 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.115 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.169 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.170 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.222 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.406 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.407 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12159MB free_disk=111.19312286376953GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.407 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.408 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.757 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7f1808f3-5a79-4149-84d1-7bc21eefa497 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.757 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 27fa9a5a-04a0-4d80-b75d-564df1c974e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.757 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a7cf9795-ac6e-4d38-8500-755c39931e14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.758 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:29:42 np0005476733 nova_compute[192580]: 2025-10-08 15:29:42.758 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=3584MB phys_disk=119GB used_disk=30GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.107 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.138 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.167 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.168 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:43 np0005476733 podman[230918]: 2025-10-08 15:29:43.233294182 +0000 UTC m=+0.059370358 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350)
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.450 2 INFO nova.compute.manager [None req-c5b99754-ff65-413e-b963-10f3cffc5c9b c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:43 np0005476733 nova_compute[192580]: 2025-10-08 15:29:43.456 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:44 np0005476733 nova_compute[192580]: 2025-10-08 15:29:44.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:46 np0005476733 nova_compute[192580]: 2025-10-08 15:29:46.166 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:46 np0005476733 podman[230949]: 2025-10-08 15:29:46.247540294 +0000 UTC m=+0.065587056 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:29:46 np0005476733 podman[230948]: 2025-10-08 15:29:46.256461699 +0000 UTC m=+0.073573782 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 11:29:46 np0005476733 nova_compute[192580]: 2025-10-08 15:29:46.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:46 np0005476733 nova_compute[192580]: 2025-10-08 15:29:46.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:47 np0005476733 nova_compute[192580]: 2025-10-08 15:29:47.067 2 INFO nova.compute.manager [None req-f9abb987-897e-423b-a805-03044e88f5b2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:29:47 np0005476733 nova_compute[192580]: 2025-10-08 15:29:47.072 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:47 np0005476733 nova_compute[192580]: 2025-10-08 15:29:47.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:48 np0005476733 nova_compute[192580]: 2025-10-08 15:29:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:48 np0005476733 nova_compute[192580]: 2025-10-08 15:29:48.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:29:48 np0005476733 nova_compute[192580]: 2025-10-08 15:29:48.698 2 INFO nova.compute.manager [None req-e916cda2-c261-412e-8e39-6fba00261b85 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:48 np0005476733 nova_compute[192580]: 2025-10-08 15:29:48.702 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:49 np0005476733 nova_compute[192580]: 2025-10-08 15:29:49.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:50 np0005476733 nova_compute[192580]: 2025-10-08 15:29:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:50Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:ff:69 10.100.0.13
Oct  8 11:29:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:50Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:ff:69 10.100.0.13
Oct  8 11:29:51 np0005476733 nova_compute[192580]: 2025-10-08 15:29:51.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.282 2 INFO nova.compute.manager [None req-a3a5c11d-4346-4618-9672-8ebdacee1ca3 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.286 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.433 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.434 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.434 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.434 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.435 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.436 2 INFO nova.compute.manager [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Terminating instance#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.437 2 DEBUG nova.compute.manager [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:29:52 np0005476733 kernel: tapb5af459f-56 (unregistering): left promiscuous mode
Oct  8 11:29:52 np0005476733 NetworkManager[51699]: <info>  [1759937392.4707] device (tapb5af459f-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00346|binding|INFO|Releasing lport b5af459f-569f-4ca4-86fe-d2d018227a96 from this chassis (sb_readonly=0)
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00347|binding|INFO|Setting lport b5af459f-569f-4ca4-86fe-d2d018227a96 down in Southbound
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00348|binding|INFO|Removing iface tapb5af459f-56 ovn-installed in OVS
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.502 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:90:51 192.168.2.175'], port_security=['fa:16:3e:4e:90:51 192.168.2.175'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.2.175/24', 'neutron:device_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e26c823e-4eb7-44c0-a2be-0739b8d56851, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b5af459f-569f-4ca4-86fe-d2d018227a96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.505 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b5af459f-569f-4ca4-86fe-d2d018227a96 in datapath f7929135-b0f8-4022-8ac4-4734ecb47f0b unbound from our chassis#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.509 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7929135-b0f8-4022-8ac4-4734ecb47f0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.512 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9a747962-eaca-4bb1-8ef3-61999cdedd5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.513 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b namespace which is not needed anymore#033[00m
Oct  8 11:29:52 np0005476733 kernel: tap7ad20ed3-85 (unregistering): left promiscuous mode
Oct  8 11:29:52 np0005476733 NetworkManager[51699]: <info>  [1759937392.5217] device (tap7ad20ed3-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00349|binding|INFO|Releasing lport 7ad20ed3-8502-40cd-84e3-773d77da33ae from this chassis (sb_readonly=0)
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00350|binding|INFO|Setting lport 7ad20ed3-8502-40cd-84e3-773d77da33ae down in Southbound
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00351|binding|INFO|Removing iface tap7ad20ed3-85 ovn-installed in OVS
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.549 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:ec:86 10.100.0.3'], port_security=['fa:16:3e:62:ec:86 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7f1808f3-5a79-4149-84d1-7bc21eefa497', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=7ad20ed3-8502-40cd-84e3-773d77da33ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:52 np0005476733 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  8 11:29:52 np0005476733 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000021.scope: Consumed 57.787s CPU time.
Oct  8 11:29:52 np0005476733 systemd-machined[152624]: Machine qemu-20-instance-00000021 terminated.
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.614 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.615 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:29:52 np0005476733 NetworkManager[51699]: <info>  [1759937392.6784] manager: (tap7ad20ed3-85): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Oct  8 11:29:52 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [NOTICE]   (228010) : haproxy version is 2.8.14-c23fe91
Oct  8 11:29:52 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [NOTICE]   (228010) : path to executable is /usr/sbin/haproxy
Oct  8 11:29:52 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [WARNING]  (228010) : Exiting Master process...
Oct  8 11:29:52 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [ALERT]    (228010) : Current worker (228016) exited with code 143 (Terminated)
Oct  8 11:29:52 np0005476733 neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b[228000]: [WARNING]  (228010) : All workers exited. Exiting... (0)
Oct  8 11:29:52 np0005476733 systemd[1]: libpod-c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438.scope: Deactivated successfully.
Oct  8 11:29:52 np0005476733 podman[231015]: 2025-10-08 15:29:52.693994672 +0000 UTC m=+0.065016557 container died c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.696 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907#033[00m
Oct  8 11:29:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438-userdata-shm.mount: Deactivated successfully.
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.734 2 INFO nova.virt.libvirt.driver [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Instance destroyed successfully.#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.735 2 DEBUG nova.objects.instance [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid 7f1808f3-5a79-4149-84d1-7bc21eefa497 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.737 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:29:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay-eadc0cd2daef3b5cfbb38b77508b7e01431db41f485f4a67a1c025088b7f64ef-merged.mount: Deactivated successfully.
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.752 2 DEBUG nova.virt.libvirt.vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:26:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:26:18Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.752 2 DEBUG nova.network.os_vif_util [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "b5af459f-569f-4ca4-86fe-d2d018227a96", "address": "fa:16:3e:4e:90:51", "network": {"id": "f7929135-b0f8-4022-8ac4-4734ecb47f0b", "bridge": "br-int", "label": "tempest-test-network--322813205", "subnets": [{"cidr": "192.168.2.0/24", "dns": [], "gateway": {"address": "192.168.2.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.2.175", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5af459f-56", "ovs_interfaceid": "b5af459f-569f-4ca4-86fe-d2d018227a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.753 2 DEBUG nova.network.os_vif_util [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.753 2 DEBUG os_vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5af459f-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 podman[231015]: 2025-10-08 15:29:52.759099042 +0000 UTC m=+0.130120927 container cleanup c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.768 2 INFO os_vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:90:51,bridge_name='br-int',has_traffic_filtering=True,id=b5af459f-569f-4ca4-86fe-d2d018227a96,network=Network(f7929135-b0f8-4022-8ac4-4734ecb47f0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5af459f-56')#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.769 2 DEBUG nova.virt.libvirt.vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:26:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_east_west-350327070',display_name='tempest-test_bw_limit_east_west-350327070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-east-west-350327070',id=33,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:26:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-j87mn69z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:26:18Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=7f1808f3-5a79-4149-84d1-7bc21eefa497,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.770 2 DEBUG nova.network.os_vif_util [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "address": "fa:16:3e:62:ec:86", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad20ed3-85", "ovs_interfaceid": "7ad20ed3-8502-40cd-84e3-773d77da33ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:29:52 np0005476733 systemd[1]: libpod-conmon-c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438.scope: Deactivated successfully.
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.771 2 DEBUG nova.network.os_vif_util [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.772 2 DEBUG os_vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.774 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ad20ed3-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.785 2 INFO os_vif [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:ec:86,bridge_name='br-int',has_traffic_filtering=True,id=7ad20ed3-8502-40cd-84e3-773d77da33ae,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap7ad20ed3-85')#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.791 2 INFO nova.virt.libvirt.driver [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Deleting instance files /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497_del#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.792 2 INFO nova.virt.libvirt.driver [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Deletion of /var/lib/nova/instances/7f1808f3-5a79-4149-84d1-7bc21eefa497_del complete#033[00m
Oct  8 11:29:52 np0005476733 podman[231073]: 2025-10-08 15:29:52.841471135 +0000 UTC m=+0.052903521 container remove c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.852 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8f43ed-9a2b-4a94-8f6e-74757fb6cda3]: (4, ('Wed Oct  8 03:29:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b (c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438)\nc8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438\nWed Oct  8 03:29:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b (c8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438)\nc8cd2035bc72214dc9ecd897698723de627c0bdb29f7b7088b9cc47dfabfc438\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.853 2 INFO nova.compute.manager [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.854 2 DEBUG oslo.service.loopingcall [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.854 2 DEBUG nova.compute.manager [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.855 2 DEBUG nova.network.neutron [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.862 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cecc22f7-9a19-432f-9872-ab10dea5fd1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.863 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7929135-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:52 np0005476733 kernel: tapf7929135-b0: left promiscuous mode
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.875 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac7530b-6504-4992-8e00-55197c84c0f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:f6:e1 10.100.0.13
Oct  8 11:29:52 np0005476733 nova_compute[192580]: 2025-10-08 15:29:52.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:52Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:f6:e1 10.100.0.13
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.897 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[08d131df-5ef0-4240-8518-41d1bcdd6922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.898 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fb04280b-11a7-4dcb-945b-ff666a72a288]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.915 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bf03ed-7a69-4357-a9bf-dd4c835e32cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408126, 'reachable_time': 17924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231089, 'error': None, 'target': 'ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 systemd[1]: run-netns-ovnmeta\x2df7929135\x2db0f8\x2d4022\x2d8ac4\x2d4734ecb47f0b.mount: Deactivated successfully.
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.923 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7929135-b0f8-4022-8ac4-4734ecb47f0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.924 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f489578b-d91f-487c-8d80-8ab2236d27cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.926 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad20ed3-8502-40cd-84e3-773d77da33ae in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.930 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.931 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5673b948-f6ff-4a16-b899-79e7c6733fe3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:52.933 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [NOTICE]   (228949) : haproxy version is 2.8.14-c23fe91
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [NOTICE]   (228949) : path to executable is /usr/sbin/haproxy
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [WARNING]  (228949) : Exiting Master process...
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [WARNING]  (228949) : Exiting Master process...
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [ALERT]    (228949) : Current worker (228951) exited with code 143 (Terminated)
Oct  8 11:29:53 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[228945]: [WARNING]  (228949) : All workers exited. Exiting... (0)
Oct  8 11:29:53 np0005476733 systemd[1]: libpod-888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6.scope: Deactivated successfully.
Oct  8 11:29:53 np0005476733 podman[231108]: 2025-10-08 15:29:53.071248086 +0000 UTC m=+0.050125662 container died 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:29:53 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6-userdata-shm.mount: Deactivated successfully.
Oct  8 11:29:53 np0005476733 systemd[1]: var-lib-containers-storage-overlay-9cd9da7bb5746506a062b8c51415ce13f87e4ac881c7d8dbc0e78502a9256580-merged.mount: Deactivated successfully.
Oct  8 11:29:53 np0005476733 podman[231108]: 2025-10-08 15:29:53.118426583 +0000 UTC m=+0.097304149 container cleanup 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:29:53 np0005476733 systemd[1]: libpod-conmon-888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6.scope: Deactivated successfully.
Oct  8 11:29:53 np0005476733 podman[231138]: 2025-10-08 15:29:53.186632373 +0000 UTC m=+0.045776394 container remove 888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.193 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6064947c-a733-4e1b-8dfe-f2821aa77f1a]: (4, ('Wed Oct  8 03:29:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6)\n888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6\nWed Oct  8 03:29:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6)\n888c79ef1380fa876c972cb71bfcdb0dd462eef9c2ce38f07e2b18b630efdec6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.195 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ab354a68-2c16-4d90-8495-5a503a7d848f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.196 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:29:53 np0005476733 nova_compute[192580]: 2025-10-08 15:29:53.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:53 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:29:53 np0005476733 nova_compute[192580]: 2025-10-08 15:29:53.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.203 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c54d29-9bae-4c67-b6d8-eb471df75db9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 nova_compute[192580]: 2025-10-08 15:29:53.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.225 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d26780-cb83-4344-9527-831ee9fa25f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.227 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a237cd8a-7ac4-4111-9a2a-942146a0f92a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.244 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2e0c3c-478d-47c7-ba25-bba4c5e50b91]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415394, 'reachable_time': 31990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231153, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.246 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:29:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:29:53.246 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f519bda0-dcd6-4dbb-bde4-52da3c4083d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:29:53 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:29:54 np0005476733 podman[231154]: 2025-10-08 15:29:54.241423552 +0000 UTC m=+0.067839969 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:29:54 np0005476733 podman[231155]: 2025-10-08 15:29:54.241500154 +0000 UTC m=+0.066439284 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:29:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:54Z|00352|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:29:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:29:54Z|00353|binding|INFO|Releasing lport bbd16b0e-af1f-427d-8500-724401e2ed53 from this chassis (sb_readonly=0)
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.877 2 DEBUG nova.compute.manager [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-unplugged-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.880 2 DEBUG oslo_concurrency.lockutils [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.881 2 DEBUG oslo_concurrency.lockutils [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.881 2 DEBUG oslo_concurrency.lockutils [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.882 2 DEBUG nova.compute.manager [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-unplugged-b5af459f-569f-4ca4-86fe-d2d018227a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.882 2 DEBUG nova.compute.manager [req-48511b38-226e-42d0-af72-7656f3a80fc6 req-dc5960b4-a397-402b-8d02-eef8d50fd3ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-unplugged-b5af459f-569f-4ca4-86fe-d2d018227a96 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.964 2 INFO nova.compute.manager [None req-c8bbd3e1-df4d-4ba8-ae5a-00358f393de6 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:29:54 np0005476733 nova_compute[192580]: 2025-10-08 15:29:54.970 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:29:55 np0005476733 nova_compute[192580]: 2025-10-08 15:29:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:55 np0005476733 nova_compute[192580]: 2025-10-08 15:29:55.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:55 np0005476733 nova_compute[192580]: 2025-10-08 15:29:55.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.463 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.464 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.493 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.621 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.622 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.631 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.631 2 INFO nova.compute.claims [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Claim successful on node compute-1.ctlplane.example.com
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.844 2 DEBUG nova.compute.provider_tree [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.860 2 DEBUG nova.scheduler.client.report [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.887 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.888 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.951 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.951 2 DEBUG nova.network.neutron [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.967 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.968 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.968 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.969 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.969 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.969 2 WARNING nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received unexpected event network-vif-plugged-b5af459f-569f-4ca4-86fe-d2d018227a96 for instance with vm_state active and task_state deleting.
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.969 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-unplugged-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.970 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.970 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.970 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.971 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-unplugged-7ad20ed3-8502-40cd-84e3-773d77da33ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.971 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-unplugged-7ad20ed3-8502-40cd-84e3-773d77da33ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.971 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.971 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.972 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.972 2 DEBUG oslo_concurrency.lockutils [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.972 2 DEBUG nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] No waiting events found dispatching network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.972 2 WARNING nova.compute.manager [req-23b4fb00-c8f5-475d-9009-10797a33b268 req-f997a37c-0532-4202-bd0f-90415c6b0519 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received unexpected event network-vif-plugged-7ad20ed3-8502-40cd-84e3-773d77da33ae for instance with vm_state active and task_state deleting.
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.974 2 INFO nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  8 11:29:56 np0005476733 nova_compute[192580]: 2025-10-08 15:29:56.994 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.092 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.095 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.096 2 INFO nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Creating image(s)
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.097 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.098 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.100 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.123 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.215 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.216 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.217 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.230 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.273 2 DEBUG nova.network.neutron [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.291 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.292 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.315 2 INFO nova.compute.manager [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Took 4.46 seconds to deallocate network for instance.
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.359 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.360 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.361 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk 10737418240" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.362 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.362 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.392 2 DEBUG nova.compute.manager [req-3944dc13-8b95-4451-b841-45739f947afb req-5370e4cf-6944-4eb6-8187-c27707b077db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Received event network-vif-deleted-b5af459f-569f-4ca4-86fe-d2d018227a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.425 2 INFO nova.compute.manager [None req-716a03d1-4184-4e7c-82aa-7ac723507412 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.433 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.452 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.453 2 DEBUG nova.objects.instance [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'migration_context' on Instance uuid 90f7bb14-f463-4f98-92fc-22c2a06a12cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.480 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.480 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Ensure instance console log exists: /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.481 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.481 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.481 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.575 2 DEBUG nova.compute.provider_tree [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.601 2 DEBUG nova.scheduler.client.report [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.629 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.636 2 DEBUG nova.policy [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.663 2 INFO nova.scheduler.client.report [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance 7f1808f3-5a79-4149-84d1-7bc21eefa497
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.746 2 DEBUG oslo_concurrency.lockutils [None req-4d0aab33-4c09-4253-a99a-c634ba5936b6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "7f1808f3-5a79-4149-84d1-7bc21eefa497" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:29:57 np0005476733 nova_compute[192580]: 2025-10-08 15:29:57.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:29:58 np0005476733 nova_compute[192580]: 2025-10-08 15:29:58.982 2 DEBUG nova.network.neutron [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Successfully updated port: 3ca6fe41-629a-4c92-9418-834869a48822 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  8 11:29:58 np0005476733 nova_compute[192580]: 2025-10-08 15:29:58.997 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:29:58 np0005476733 nova_compute[192580]: 2025-10-08 15:29:58.998 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquired lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:29:58 np0005476733 nova_compute[192580]: 2025-10-08 15:29:58.998 2 DEBUG nova.network.neutron [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.137 2 DEBUG nova.compute.manager [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-changed-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.137 2 DEBUG nova.compute.manager [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Refreshing instance network info cache due to event network-changed-3ca6fe41-629a-4c92-9418-834869a48822. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.138 2 DEBUG oslo_concurrency.lockutils [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.256 2 DEBUG nova.network.neutron [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.609 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:29:59 np0005476733 nova_compute[192580]: 2025-10-08 15:29:59.636 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:30:00 np0005476733 nova_compute[192580]: 2025-10-08 15:30:00.219 2 INFO nova.compute.manager [None req-58953a1c-faf3-4175-a794-746c91ad22c5 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:30:00 np0005476733 nova_compute[192580]: 2025-10-08 15:30:00.228 2 INFO nova.virt.libvirt.driver [None req-58953a1c-faf3-4175-a794-746c91ad22c5 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Truncated console log returned, 3192 bytes ignored#033[00m
Oct  8 11:30:00 np0005476733 nova_compute[192580]: 2025-10-08 15:30:00.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.100 2 DEBUG nova.network.neutron [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updating instance_info_cache with network_info: [{"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.172 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Releasing lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.172 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Instance network_info: |[{"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.173 2 DEBUG oslo_concurrency.lockutils [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.174 2 DEBUG nova.network.neutron [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Refreshing network info cache for port 3ca6fe41-629a-4c92-9418-834869a48822 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.179 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Start _get_guest_xml network_info=[{"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 
'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.185 2 WARNING nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.191 2 DEBUG nova.virt.libvirt.host [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.192 2 DEBUG nova.virt.libvirt.host [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.196 2 DEBUG nova.virt.libvirt.host [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.197 2 DEBUG nova.virt.libvirt.host [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.198 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.198 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.198 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.199 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.199 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.199 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.199 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.199 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.200 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.200 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.200 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.200 2 DEBUG nova.virt.hardware [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.204 2 DEBUG nova.virt.libvirt.vif [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=42,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-tdjre26z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:57Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=90f7bb14-f463-4f98-92fc-22c2a06a12cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": 
{"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.204 2 DEBUG nova.network.os_vif_util [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.205 2 DEBUG nova.network.os_vif_util [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.206 2 DEBUG nova.objects.instance [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'pci_devices' on Instance uuid 90f7bb14-f463-4f98-92fc-22c2a06a12cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.226 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <uuid>90f7bb14-f463-4f98-92fc-22c2a06a12cd</uuid>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <name>instance-0000002a</name>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:name>tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless</nova:name>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:30:02</nova:creationTime>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:user uuid="048380879c82439f920961e33c8fc34c">tempest-ExtraDhcpOptionsTest-522093769-project-member</nova:user>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:project uuid="93e68db931464f0282500c84d398d8af">tempest-ExtraDhcpOptionsTest-522093769</nova:project>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        <nova:port uuid="3ca6fe41-629a-4c92-9418-834869a48822">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.5.60" ipVersion="4"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001:5::f816:3eff:fe03:33a9" ipVersion="6"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="serial">90f7bb14-f463-4f98-92fc-22c2a06a12cd</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="uuid">90f7bb14-f463-4f98-92fc-22c2a06a12cd</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.config"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:03:33:a9"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <target dev="tap3ca6fe41-62"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/console.log" append="off"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:30:02 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:30:02 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:30:02 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:30:02 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.228 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Preparing to wait for external event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.228 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.228 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.228 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.229 2 DEBUG nova.virt.libvirt.vif [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:29:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=42,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-tdjre26z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:29:57Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=90f7bb14-f463-4f98-92fc-22c2a06a12cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], 
"gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.229 2 DEBUG nova.network.os_vif_util [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.231 2 DEBUG nova.network.os_vif_util [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.231 2 DEBUG os_vif [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ca6fe41-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ca6fe41-62, col_values=(('external_ids', {'iface-id': '3ca6fe41-629a-4c92-9418-834869a48822', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:33:a9', 'vm-uuid': '90f7bb14-f463-4f98-92fc-22c2a06a12cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:02 np0005476733 NetworkManager[51699]: <info>  [1759937402.2394] manager: (tap3ca6fe41-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.246 2 INFO os_vif [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62')#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.312 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.313 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.313 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] No VIF found with MAC fa:16:3e:03:33:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:30:02 np0005476733 nova_compute[192580]: 2025-10-08 15:30:02.313 2 INFO nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Using config drive#033[00m
Oct  8 11:30:03 np0005476733 nova_compute[192580]: 2025-10-08 15:30:03.087 2 INFO nova.compute.manager [None req-0a1c59ed-1965-489d-afd7-1e29b5165a6b c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Get console output#033[00m
Oct  8 11:30:03 np0005476733 nova_compute[192580]: 2025-10-08 15:30:03.093 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:03 np0005476733 nova_compute[192580]: 2025-10-08 15:30:03.097 2 INFO nova.virt.libvirt.driver [None req-0a1c59ed-1965-489d-afd7-1e29b5165a6b c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Truncated console log returned, 3165 bytes ignored#033[00m
Oct  8 11:30:03 np0005476733 nova_compute[192580]: 2025-10-08 15:30:03.953 2 INFO nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Creating config drive at /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.config#033[00m
Oct  8 11:30:03 np0005476733 nova_compute[192580]: 2025-10-08 15:30:03.963 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ep80f23 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.094 2 DEBUG oslo_concurrency.processutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ep80f23" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:04 np0005476733 kernel: tap3ca6fe41-62: entered promiscuous mode
Oct  8 11:30:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:04Z|00354|binding|INFO|Claiming lport 3ca6fe41-629a-4c92-9418-834869a48822 for this chassis.
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:04Z|00355|binding|INFO|3ca6fe41-629a-4c92-9418-834869a48822: Claiming fa:16:3e:03:33:a9 192.168.5.60 2001:5::f816:3eff:fe03:33a9
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.206 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:33:a9 192.168.5.60 2001:5::f816:3eff:fe03:33a9'], port_security=['fa:16:3e:03:33:a9 192.168.5.60 2001:5::f816:3eff:fe03:33a9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:cidrs': '192.168.5.60/24 2001:5::f816:3eff:fe03:33a9/64', 'neutron:device_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5057352-eab1-4ec2-8137-06eaee60ec6e, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3ca6fe41-629a-4c92-9418-834869a48822) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.207 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3ca6fe41-629a-4c92-9418-834869a48822 in datapath f872d065-dcdd-4abe-966e-984ec8347cf7 bound to our chassis#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.209 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f872d065-dcdd-4abe-966e-984ec8347cf7#033[00m
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.2124] manager: (tap3ca6fe41-62): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:04Z|00356|binding|INFO|Setting lport 3ca6fe41-629a-4c92-9418-834869a48822 ovn-installed in OVS
Oct  8 11:30:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:04Z|00357|binding|INFO|Setting lport 3ca6fe41-629a-4c92-9418-834869a48822 up in Southbound
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.222 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[adcda2d0-6270-44b6-bc94-78fd3776ba10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.223 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf872d065-d1 in ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.225 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf872d065-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.226 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[436d08cd-2e35-4873-9cd2-2b9f73536a3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 systemd-udevd[231275]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.229 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[527796ad-bc49-4a49-a0b0-26af327d7755]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.2438] device (tap3ca6fe41-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.2448] device (tap3ca6fe41-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.245 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[0f84283b-a951-4b99-b046-ea223c73845d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 systemd-machined[152624]: New machine qemu-26-instance-0000002a.
Oct  8 11:30:04 np0005476733 systemd[1]: Started Virtual Machine qemu-26-instance-0000002a.
Oct  8 11:30:04 np0005476733 podman[231255]: 2025-10-08 15:30:04.264494571 +0000 UTC m=+0.088678005 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.270 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9e78e3-8aa7-4e12-99b7-379a341d7be5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.305 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2908dc-a5a3-466c-8c4d-6eb191b54267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.3120] manager: (tapf872d065-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.313 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[38091230-d0d3-4e91-ab0c-9c8a0ac32d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.351 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[97288afd-eeaa-4fbe-9b95-3bb4dad7d4e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.354 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf61c96-a618-45f0-a965-dffbe86faf4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.3797] device (tapf872d065-d0): carrier: link connected
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.388 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[33f4242e-bd89-444a-a6a7-e1827568115d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.410 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[54bc1bcb-01b7-4cfd-8c4e-af8098d7a9fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf872d065-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:d2:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430804, 'reachable_time': 24344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231316, 'error': None, 'target': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.430 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4c23bf83-de18-410e-8df3-461d737b97b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:d277'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430804, 'tstamp': 430804}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231317, 'error': None, 'target': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.447 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b06413f0-9b07-4602-8c9e-be6dc0549d5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf872d065-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:d2:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430804, 'reachable_time': 24344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231318, 'error': None, 'target': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.479 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3daf95-6625-4a2e-a601-4a18c98c7055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.540 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a099c925-20ed-44a0-995c-4d30b25a9f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.543 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf872d065-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.543 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.544 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf872d065-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:04 np0005476733 NetworkManager[51699]: <info>  [1759937404.5468] manager: (tapf872d065-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 kernel: tapf872d065-d0: entered promiscuous mode
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.552 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf872d065-d0, col_values=(('external_ids', {'iface-id': '1e3f1387-ae31-4ad9-bc6d-8f39af22638d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:04Z|00358|binding|INFO|Releasing lport 1e3f1387-ae31-4ad9-bc6d-8f39af22638d from this chassis (sb_readonly=0)
Oct  8 11:30:04 np0005476733 nova_compute[192580]: 2025-10-08 15:30:04.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.580 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f872d065-dcdd-4abe-966e-984ec8347cf7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f872d065-dcdd-4abe-966e-984ec8347cf7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.581 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a70a289e-0165-44da-a007-dd095b4e1343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.582 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f872d065-dcdd-4abe-966e-984ec8347cf7
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f872d065-dcdd-4abe-966e-984ec8347cf7.pid.haproxy
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f872d065-dcdd-4abe-966e-984ec8347cf7
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:30:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:04.583 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'env', 'PROCESS_TAG=haproxy-f872d065-dcdd-4abe-966e-984ec8347cf7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f872d065-dcdd-4abe-966e-984ec8347cf7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:30:05 np0005476733 podman[231354]: 2025-10-08 15:30:05.018111167 +0000 UTC m=+0.063458658 container create 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 11:30:05 np0005476733 systemd[1]: Started libpod-conmon-614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75.scope.
Oct  8 11:30:05 np0005476733 podman[231354]: 2025-10-08 15:30:04.981992333 +0000 UTC m=+0.027339904 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:30:05 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:30:05 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67894a9509764f65413bd23644ff07169a62b2f20093a30464796983d3188766/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:30:05 np0005476733 podman[231354]: 2025-10-08 15:30:05.099440286 +0000 UTC m=+0.144787797 container init 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:30:05 np0005476733 podman[231354]: 2025-10-08 15:30:05.104174158 +0000 UTC m=+0.149521639 container start 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:30:05 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [NOTICE]   (231373) : New worker (231375) forked
Oct  8 11:30:05 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [NOTICE]   (231373) : Loading success.
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.349 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937405.3494043, 90f7bb14-f463-4f98-92fc-22c2a06a12cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.351 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] VM Started (Lifecycle Event)#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.386 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.390 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937405.3494937, 90f7bb14-f463-4f98-92fc-22c2a06a12cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.390 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.412 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.415 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.438 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.698 2 INFO nova.compute.manager [None req-639065d7-669b-4fac-88a1-48a31708726e c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Get console output#033[00m
Oct  8 11:30:05 np0005476733 nova_compute[192580]: 2025-10-08 15:30:05.710 2 INFO nova.virt.libvirt.driver [None req-639065d7-669b-4fac-88a1-48a31708726e c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Truncated console log returned, 3326 bytes ignored#033[00m
Oct  8 11:30:06 np0005476733 nova_compute[192580]: 2025-10-08 15:30:06.177 2 DEBUG nova.network.neutron [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updated VIF entry in instance network info cache for port 3ca6fe41-629a-4c92-9418-834869a48822. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:06 np0005476733 nova_compute[192580]: 2025-10-08 15:30:06.178 2 DEBUG nova.network.neutron [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updating instance_info_cache with network_info: [{"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:06 np0005476733 nova_compute[192580]: 2025-10-08 15:30:06.200 2 DEBUG oslo_concurrency.lockutils [req-e2acbfbc-ce64-47da-884d-95d2533b7a70 req-374a7782-b36b-4559-97e6-9e233b0a52c0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:07 np0005476733 podman[231385]: 2025-10-08 15:30:07.259427856 +0000 UTC m=+0.078491709 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, tcib_managed=true)
Oct  8 11:30:07 np0005476733 podman[231384]: 2025-10-08 15:30:07.279137646 +0000 UTC m=+0.102945871 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.433 2 DEBUG nova.compute.manager [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.433 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.434 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.434 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.434 2 DEBUG nova.compute.manager [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Processing event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.435 2 DEBUG nova.compute.manager [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.435 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.435 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.435 2 DEBUG oslo_concurrency.lockutils [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.436 2 DEBUG nova.compute.manager [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] No waiting events found dispatching network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.436 2 WARNING nova.compute.manager [req-5e70dd44-63c5-40db-b96d-f9cfc1cd2204 req-da935cc6-a55f-484b-ba49-39b28dc97fe5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received unexpected event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.437 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.441 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937407.4412417, 90f7bb14-f463-4f98-92fc-22c2a06a12cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.441 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.443 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.447 2 INFO nova.virt.libvirt.driver [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Instance spawned successfully.#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.447 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.492 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.492 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.493 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.494 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.494 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.495 2 DEBUG nova.virt.libvirt.driver [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.498 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.503 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.539 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.591 2 INFO nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Took 10.50 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.592 2 DEBUG nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.681 2 INFO nova.compute.manager [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Took 11.10 seconds to build instance.#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.729 2 DEBUG oslo_concurrency.lockutils [None req-b72301ad-a146-4d8f-ad28-a6e64d72a7ff 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.733 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937392.7320073, 7f1808f3-5a79-4149-84d1-7bc21eefa497 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.733 2 INFO nova.compute.manager [-] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:30:07 np0005476733 nova_compute[192580]: 2025-10-08 15:30:07.759 2 DEBUG nova.compute.manager [None req-65ac0473-8b1e-4523-9130-fe8ee88f0af5 - - - - - -] [instance: 7f1808f3-5a79-4149-84d1-7bc21eefa497] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:09 np0005476733 nova_compute[192580]: 2025-10-08 15:30:09.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:10 np0005476733 nova_compute[192580]: 2025-10-08 15:30:10.056 2 INFO nova.compute.manager [None req-bf231289-13d5-49fe-9eef-d434164d8ecc 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:10 np0005476733 nova_compute[192580]: 2025-10-08 15:30:10.060 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:11 np0005476733 nova_compute[192580]: 2025-10-08 15:30:11.762 2 DEBUG nova.compute.manager [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-changed-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:11 np0005476733 nova_compute[192580]: 2025-10-08 15:30:11.764 2 DEBUG nova.compute.manager [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Refreshing instance network info cache due to event network-changed-23f6a943-ce2f-4958-a0c6-73f789517892. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:30:11 np0005476733 nova_compute[192580]: 2025-10-08 15:30:11.765 2 DEBUG oslo_concurrency.lockutils [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:11 np0005476733 nova_compute[192580]: 2025-10-08 15:30:11.765 2 DEBUG oslo_concurrency.lockutils [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:11 np0005476733 nova_compute[192580]: 2025-10-08 15:30:11.766 2 DEBUG nova.network.neutron [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Refreshing network info cache for port 23f6a943-ce2f-4958-a0c6-73f789517892 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.313 2 DEBUG nova.compute.manager [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-changed-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.314 2 DEBUG nova.compute.manager [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Refreshing instance network info cache due to event network-changed-66f32729-1d2a-44d6-b604-29e4c751f95c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.315 2 DEBUG oslo_concurrency.lockutils [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.316 2 DEBUG oslo_concurrency.lockutils [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:12 np0005476733 nova_compute[192580]: 2025-10-08 15:30:12.316 2 DEBUG nova.network.neutron [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Refreshing network info cache for port 66f32729-1d2a-44d6-b604-29e4c751f95c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:14 np0005476733 podman[231428]: 2025-10-08 15:30:14.244364089 +0000 UTC m=+0.071556318 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public)
Oct  8 11:30:14 np0005476733 nova_compute[192580]: 2025-10-08 15:30:14.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:14 np0005476733 nova_compute[192580]: 2025-10-08 15:30:14.814 2 DEBUG nova.network.neutron [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updated VIF entry in instance network info cache for port 66f32729-1d2a-44d6-b604-29e4c751f95c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:14 np0005476733 nova_compute[192580]: 2025-10-08 15:30:14.815 2 DEBUG nova.network.neutron [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updating instance_info_cache with network_info: [{"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:14 np0005476733 nova_compute[192580]: 2025-10-08 15:30:14.896 2 DEBUG oslo_concurrency.lockutils [req-8c3420d3-cdef-4905-ab1e-d2cc921adc6a req-4236b8e9-6048-44fd-90d5-adf27197e2de 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-a7cf9795-ac6e-4d38-8500-755c39931e14" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:15 np0005476733 nova_compute[192580]: 2025-10-08 15:30:15.664 2 DEBUG nova.network.neutron [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updated VIF entry in instance network info cache for port 23f6a943-ce2f-4958-a0c6-73f789517892. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:15 np0005476733 nova_compute[192580]: 2025-10-08 15:30:15.665 2 DEBUG nova.network.neutron [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updating instance_info_cache with network_info: [{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:15 np0005476733 nova_compute[192580]: 2025-10-08 15:30:15.701 2 INFO nova.compute.manager [None req-2c4b7584-7591-4a67-9ef3-766175617a1d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:15 np0005476733 nova_compute[192580]: 2025-10-08 15:30:15.742 2 DEBUG oslo_concurrency.lockutils [req-a2e1796c-e553-4cf2-a8a1-19cf4edbdc5b req-c72b8187-e354-4de6-b0a3-79b7b7919950 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:17 np0005476733 nova_compute[192580]: 2025-10-08 15:30:17.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:17 np0005476733 podman[231453]: 2025-10-08 15:30:17.243499308 +0000 UTC m=+0.063621044 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:30:17 np0005476733 podman[231452]: 2025-10-08 15:30:17.246853535 +0000 UTC m=+0.069418249 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:30:17 np0005476733 nova_compute[192580]: 2025-10-08 15:30:17.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.224 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.263 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 27fa9a5a-04a0-4d80-b75d-564df1c974e8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.264 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid a7cf9795-ac6e-4d38-8500-755c39931e14 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.264 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 90f7bb14-f463-4f98-92fc-22c2a06a12cd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.265 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.265 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.266 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.266 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.267 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.268 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.320 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.323 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.323 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:19 np0005476733 nova_compute[192580]: 2025-10-08 15:30:19.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:20Z|00359|pinctrl|WARN|Dropped 8731 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 11:30:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:20Z|00360|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.128 2 INFO nova.compute.manager [None req-f89fb0c7-5d4b-4b60-8661-487622f10d17 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:21.165 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:30:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:21.170 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.367 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.368 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.418 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.535 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.535 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.541 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.542 2 INFO nova.compute.claims [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.682 2 DEBUG nova.scheduler.client.report [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.729 2 DEBUG nova.scheduler.client.report [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.730 2 DEBUG nova.compute.provider_tree [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.765 2 DEBUG nova.scheduler.client.report [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.811 2 DEBUG nova.scheduler.client.report [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.953 2 DEBUG nova.compute.provider_tree [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:30:21 np0005476733 nova_compute[192580]: 2025-10-08 15:30:21.992 2 DEBUG nova.scheduler.client.report [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.034 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.035 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.115 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.116 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.160 2 INFO nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.209 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.380 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.382 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.382 2 INFO nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Creating image(s)#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.383 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.384 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.384 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.403 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.440 2 DEBUG nova.policy [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.482 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.484 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.485 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.500 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.578 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.580 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.613 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk 10737418240" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.615 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.615 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.670 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.671 2 DEBUG nova.objects.instance [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.686 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.687 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Ensure instance console log exists: /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.688 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.688 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:22 np0005476733 nova_compute[192580]: 2025-10-08 15:30:22.688 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:24 np0005476733 nova_compute[192580]: 2025-10-08 15:30:24.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:25 np0005476733 podman[231550]: 2025-10-08 15:30:25.234839734 +0000 UTC m=+0.055456662 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid)
Oct  8 11:30:25 np0005476733 podman[231551]: 2025-10-08 15:30:25.234843535 +0000 UTC m=+0.052914352 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:30:25 np0005476733 nova_compute[192580]: 2025-10-08 15:30:25.357 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Successfully created port: 8f7d5998-037f-4a70-98a0-8482a8043a7e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:30:26 np0005476733 nova_compute[192580]: 2025-10-08 15:30:26.284 2 INFO nova.compute.manager [None req-e6a4968f-10b8-48eb-8628-02677ba4075f 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:26 np0005476733 nova_compute[192580]: 2025-10-08 15:30:26.292 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:26.312 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:26.313 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:26.314 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:26 np0005476733 nova_compute[192580]: 2025-10-08 15:30:26.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:27 np0005476733 nova_compute[192580]: 2025-10-08 15:30:27.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:28 np0005476733 nova_compute[192580]: 2025-10-08 15:30:28.531 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Successfully updated port: 8f7d5998-037f-4a70-98a0-8482a8043a7e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:30:28 np0005476733 nova_compute[192580]: 2025-10-08 15:30:28.623 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:28 np0005476733 nova_compute[192580]: 2025-10-08 15:30:28.624 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:28 np0005476733 nova_compute[192580]: 2025-10-08 15:30:28.625 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:30:28 np0005476733 nova_compute[192580]: 2025-10-08 15:30:28.861 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.754 2 DEBUG nova.network.neutron [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.832 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.833 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Instance network_info: |[{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.838 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Start _get_guest_xml network_info=[{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.846 2 WARNING nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.851 2 DEBUG nova.virt.libvirt.host [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.853 2 DEBUG nova.virt.libvirt.host [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.857 2 DEBUG nova.virt.libvirt.host [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.858 2 DEBUG nova.virt.libvirt.host [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.859 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.860 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.861 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.861 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.862 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.862 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.863 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.863 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.864 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.865 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.865 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.866 2 DEBUG nova.virt.hardware [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.874 2 DEBUG nova.virt.libvirt.vif [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:30:22Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.875 2 DEBUG nova.network.os_vif_util [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.877 2 DEBUG nova.network.os_vif_util [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.879 2 DEBUG nova.objects.instance [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.912 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <uuid>066ef28b-88ac-4f5c-acae-3458c3e19762</uuid>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <name>instance-0000002d</name>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_bw_limit_tenant_network-1685300098</nova:name>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:30:29</nova:creationTime>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        <nova:port uuid="8f7d5998-037f-4a70-98a0-8482a8043a7e">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.3.176" ipVersion="4"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="serial">066ef28b-88ac-4f5c-acae-3458c3e19762</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="uuid">066ef28b-88ac-4f5c-acae-3458c3e19762</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.config"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:85:7d:15"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <target dev="tap8f7d5998-03"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/console.log" append="off"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:30:29 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:30:29 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:30:29 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:30:29 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.923 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Preparing to wait for external event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.924 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.924 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.925 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.926 2 DEBUG nova.virt.libvirt.vif [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:30:22Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.926 2 DEBUG nova.network.os_vif_util [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.929 2 DEBUG nova.network.os_vif_util [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.929 2 DEBUG os_vif [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.932 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.932 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.937 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f7d5998-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.938 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f7d5998-03, col_values=(('external_ids', {'iface-id': '8f7d5998-037f-4a70-98a0-8482a8043a7e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:7d:15', 'vm-uuid': '066ef28b-88ac-4f5c-acae-3458c3e19762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 NetworkManager[51699]: <info>  [1759937429.9421] manager: (tap8f7d5998-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:29 np0005476733 nova_compute[192580]: 2025-10-08 15:30:29.951 2 INFO os_vif [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03')#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.032 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.033 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.034 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:85:7d:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.035 2 INFO nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Using config drive#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.240 2 DEBUG nova.compute.manager [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.241 2 DEBUG nova.compute.manager [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.242 2 DEBUG oslo_concurrency.lockutils [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.243 2 DEBUG oslo_concurrency.lockutils [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.243 2 DEBUG nova.network.neutron [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.554 2 INFO nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Creating config drive at /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.config#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.564 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2g81q204 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.696 2 DEBUG oslo_concurrency.processutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2g81q204" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:30 np0005476733 NetworkManager[51699]: <info>  [1759937430.7855] manager: (tap8f7d5998-03): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Oct  8 11:30:30 np0005476733 kernel: tap8f7d5998-03: entered promiscuous mode
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:30Z|00361|binding|INFO|Claiming lport 8f7d5998-037f-4a70-98a0-8482a8043a7e for this chassis.
Oct  8 11:30:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:30Z|00362|binding|INFO|8f7d5998-037f-4a70-98a0-8482a8043a7e: Claiming fa:16:3e:85:7d:15 192.168.3.176
Oct  8 11:30:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:30Z|00363|binding|INFO|Setting lport 8f7d5998-037f-4a70-98a0-8482a8043a7e ovn-installed in OVS
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:30 np0005476733 systemd-udevd[231609]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:30:30 np0005476733 nova_compute[192580]: 2025-10-08 15:30:30.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:30 np0005476733 NetworkManager[51699]: <info>  [1759937430.8358] device (tap8f7d5998-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:30:30 np0005476733 NetworkManager[51699]: <info>  [1759937430.8370] device (tap8f7d5998-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:30:30 np0005476733 systemd-machined[152624]: New machine qemu-27-instance-0000002d.
Oct  8 11:30:30 np0005476733 systemd[1]: Started Virtual Machine qemu-27-instance-0000002d.
Oct  8 11:30:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:30Z|00364|binding|INFO|Setting lport 8f7d5998-037f-4a70-98a0-8482a8043a7e up in Southbound
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.881 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:7d:15 192.168.3.176'], port_security=['fa:16:3e:85:7d:15 192.168.3.176'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.3.176/24', 'neutron:device_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b01ceb45-280a-4b94-9dbb-432344b9bd77, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8f7d5998-037f-4a70-98a0-8482a8043a7e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.883 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8f7d5998-037f-4a70-98a0-8482a8043a7e in datapath f81b33e3-d2f7-4437-b8c9-c9a54931fb61 bound to our chassis#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.889 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f81b33e3-d2f7-4437-b8c9-c9a54931fb61#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.909 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ea32fdd7-a8b9-4e13-b81e-9a2ed761f53f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.910 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf81b33e3-d1 in ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.919 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf81b33e3-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.919 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df510d60-f4d9-4446-a80a-80f2e6b0c457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.920 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[498ad9a4-426c-44db-90f4-fc72e3b4a294]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.937 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8d06e5-def9-4abd-a14f-07c66a222b86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:30.968 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[11d77cdc-0f7b-42fa-a1ea-cb0e67001372]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.011 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[af4881e7-a654-4d15-8d31-1741c8281e2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 NetworkManager[51699]: <info>  [1759937431.0191] manager: (tapf81b33e3-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.018 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ece6a744-41b5-4271-8a1b-fda03b482217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 systemd-udevd[231614]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.076 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[89af1e98-5e44-4ec0-a991-279a2ff00ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.079 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfe688b-e959-4d70-a098-d6518994cd5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 NetworkManager[51699]: <info>  [1759937431.1032] device (tapf81b33e3-d0): carrier: link connected
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.113 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f8fdb601-270d-463a-8dea-746dd63114a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.141 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5057effd-8ab9-4413-b885-3ed455097dd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf81b33e3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:2b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433476, 'reachable_time': 30889, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231645, 'error': None, 'target': 'ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.159 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a125f998-a88d-475f-b940-d3a3b79017ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:2b3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 433476, 'tstamp': 433476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231646, 'error': None, 'target': 'ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.174 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.177 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3281a640-4f0b-408f-829d-eaebb6f6120f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf81b33e3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:2b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433476, 'reachable_time': 30889, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231647, 'error': None, 'target': 'ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.211 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5159b7e6-e6f0-42aa-bd0e-189c6619605d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.274 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3fde856f-8071-498d-90dc-0d9bcbb9a565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.276 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf81b33e3-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.276 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.277 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf81b33e3-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:31 np0005476733 NetworkManager[51699]: <info>  [1759937431.2800] manager: (tapf81b33e3-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Oct  8 11:30:31 np0005476733 nova_compute[192580]: 2025-10-08 15:30:31.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:31 np0005476733 kernel: tapf81b33e3-d0: entered promiscuous mode
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.284 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf81b33e3-d0, col_values=(('external_ids', {'iface-id': 'f67773e8-4408-425a-8438-2209ddc36987'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:31Z|00365|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:30:31 np0005476733 nova_compute[192580]: 2025-10-08 15:30:31.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:31 np0005476733 nova_compute[192580]: 2025-10-08 15:30:31.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.306 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f81b33e3-d2f7-4437-b8c9-c9a54931fb61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f81b33e3-d2f7-4437-b8c9-c9a54931fb61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.307 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d5869cef-f3db-45d8-8209-b6cad3c7a5f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.308 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f81b33e3-d2f7-4437-b8c9-c9a54931fb61
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f81b33e3-d2f7-4437-b8c9-c9a54931fb61.pid.haproxy
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f81b33e3-d2f7-4437-b8c9-c9a54931fb61
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:30:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:31.309 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'env', 'PROCESS_TAG=haproxy-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f81b33e3-d2f7-4437-b8c9-c9a54931fb61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:30:31 np0005476733 nova_compute[192580]: 2025-10-08 15:30:31.467 2 INFO nova.compute.manager [None req-084fd321-cf79-46b0-a988-014e82db10fb 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:31 np0005476733 nova_compute[192580]: 2025-10-08 15:30:31.478 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:31 np0005476733 podman[231679]: 2025-10-08 15:30:31.645094766 +0000 UTC m=+0.023289444 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:30:31 np0005476733 podman[231679]: 2025-10-08 15:30:31.766128603 +0000 UTC m=+0.144323261 container create 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:30:31 np0005476733 systemd[1]: Started libpod-conmon-80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22.scope.
Oct  8 11:30:31 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:30:31 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f0208bda96c5ca82475315efd68ab0d43c824e09e12c2b392457156a237e197/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:30:31 np0005476733 podman[231679]: 2025-10-08 15:30:31.893958947 +0000 UTC m=+0.272153635 container init 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:30:31 np0005476733 podman[231679]: 2025-10-08 15:30:31.900657652 +0000 UTC m=+0.278852300 container start 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:30:31 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [NOTICE]   (231700) : New worker (231702) forked
Oct  8 11:30:31 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [NOTICE]   (231700) : Loading success.
Oct  8 11:30:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:32Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:33:a9 192.168.5.60
Oct  8 11:30:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:32Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:33:a9 192.168.5.60
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.600 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937432.599929, 066ef28b-88ac-4f5c-acae-3458c3e19762 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.601 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] VM Started (Lifecycle Event)#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.627 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.632 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937432.6002982, 066ef28b-88ac-4f5c-acae-3458c3e19762 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.633 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.679 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.684 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.699 2 DEBUG nova.network.neutron [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.699 2 DEBUG nova.network.neutron [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.766 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:30:32 np0005476733 nova_compute[192580]: 2025-10-08 15:30:32.767 2 DEBUG oslo_concurrency.lockutils [req-4ef7310e-c072-48df-bcb5-e091044830c1 req-b2ed66b3-58e7-4276-b8ee-31cad456994c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:33 np0005476733 nova_compute[192580]: 2025-10-08 15:30:33.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:34 np0005476733 nova_compute[192580]: 2025-10-08 15:30:34.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:34 np0005476733 nova_compute[192580]: 2025-10-08 15:30:34.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:35 np0005476733 podman[231718]: 2025-10-08 15:30:35.232807731 +0000 UTC m=+0.060390461 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.384 2 DEBUG nova.compute.manager [req-2c2b08d9-a3b9-4b63-b6ce-31258d571138 req-302e4476-3e2a-4fa7-9778-e6f60d8ea680 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.385 2 DEBUG oslo_concurrency.lockutils [req-2c2b08d9-a3b9-4b63-b6ce-31258d571138 req-302e4476-3e2a-4fa7-9778-e6f60d8ea680 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.386 2 DEBUG oslo_concurrency.lockutils [req-2c2b08d9-a3b9-4b63-b6ce-31258d571138 req-302e4476-3e2a-4fa7-9778-e6f60d8ea680 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.386 2 DEBUG oslo_concurrency.lockutils [req-2c2b08d9-a3b9-4b63-b6ce-31258d571138 req-302e4476-3e2a-4fa7-9778-e6f60d8ea680 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.386 2 DEBUG nova.compute.manager [req-2c2b08d9-a3b9-4b63-b6ce-31258d571138 req-302e4476-3e2a-4fa7-9778-e6f60d8ea680 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Processing event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.387 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.393 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937435.3928502, 066ef28b-88ac-4f5c-acae-3458c3e19762 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.393 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.395 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.403 2 INFO nova.virt.libvirt.driver [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Instance spawned successfully.
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.404 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.416 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.421 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.427 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.427 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.428 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.428 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.429 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.429 2 DEBUG nova.virt.libvirt.driver [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.452 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.484 2 INFO nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Took 13.10 seconds to spawn the instance on the hypervisor.
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.484 2 DEBUG nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.556 2 INFO nova.compute.manager [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Took 14.06 seconds to build instance.
Oct  8 11:30:35 np0005476733 nova_compute[192580]: 2025-10-08 15:30:35.578 2 DEBUG oslo_concurrency.lockutils [None req-0a54534d-fa2f-40ec-a9b8-eab914ba169a d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.006 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'name': 'tempest-test_bw_limit_tenant_network-1685300098', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.009 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'user_id': 'c852472017334735b37425ffa8591384', 'hostId': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.011 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'name': 'tempest-test_multicast_east_west-1699367735', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.014 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '93e68db931464f0282500c84d398d8af', 'user_id': '048380879c82439f920961e33c8fc34c', 'hostId': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>]
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>]
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.015 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.028 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/cpu volume: 580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.050 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/cpu volume: 41600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.065 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/cpu volume: 40030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.083 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/cpu volume: 27570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43020e9c-fff9-41d7-867d-33c4527bc56d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 580000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'timestamp': '2025-10-08T15:30:36.016064', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'bc84f234-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.751734282, 'message_signature': 'ba13b3363d692d28be83b0d38cee0d19eded82f98eb9671340007c4377748cf0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41600000000, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 
'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'timestamp': '2025-10-08T15:30:36.016064', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'bc884948-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.773787458, 'message_signature': '9231dde663606b8e6ad869c4b00f4428762dbb48fb2ce662d3e435d14943fe30'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40030000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'timestamp': '2025-10-08T15:30:36.016064', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'bc8a88d4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.788500727, 'message_signature': '7d398380f81819164a6145a8a3058c3e9f04f73f0b15ebb9affc96f42e3adf29'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27570000000, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'timestamp': '2025-10-08T15:30:36.016064', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'bc8d3bf6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.806068509, 'message_signature': '9e1f9690a9db6e9207eb03f264d0d6ecc3d27cee2584d839975431a7bcb3d96b'}]}, 'timestamp': '2025-10-08 15:30:36.084076', '_unique_id': '83b21e433cee460ba46e335c0ba1e1ac'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.087 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.100 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 196768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.101 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.116 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.usage volume: 152436736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.117 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.129 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.usage volume: 152371200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.129 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.142 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.usage volume: 18219008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.143 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b20aea3c-345d-4ac9-91c9-0e8eb2e22ff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196768, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc8fe388-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': '78c8a94a78e6430b60334e10fb953766ee3ecde5d9968b3b224dcb9f9dfbae62'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': 
None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc8fef2c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': 'cdeb425de76452b167e2cf69a405604a9914e9023b10224ddbacd67634a9dedf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152436736, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc92508c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': '1bc69b9a0a5334b5537e6415c220c3b3925ee222e7106385e6500a0e781b24a8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc926108-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': 'cfbc24286aed2d735d707261550da6e55b3b04b320ca6704c0f4a99ae7a6f52c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
152371200, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc943b68-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': 'd5e4563ff6d00c2dbffc6aa63391b06fcd413703015a61eaca8229dfc52c5dbd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'i
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc944a86-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': 'bf22fca3caf69cc3fe30443149b84db4551bd9db34ab741ac8fd254541ef6153'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 18219008, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc964ffc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': '77a7fd1318b25766c207156945968722310ad65fad05d3cfe53fd528feb1b68b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 
'048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.087871', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc965c36-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': 'b830beb485b446561348ae1f91bfe52eb23978c851062c5f4cbf2fa49c1f4fbe'}]}, 'timestamp': '2025-10-08 15:30:36.143643', '_unique_id': 'f9c0bb4786ea40b2b4103e6a2de0e154'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.147 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 066ef28b-88ac-4f5c-acae-3458c3e19762 / tap8f7d5998-03 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.147 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.149 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a7cf9795-ac6e-4d38-8500-755c39931e14 / tap66f32729-1d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.149 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.152 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 27fa9a5a-04a0-4d80-b75d-564df1c974e8 / tap23f6a943-ce inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.152 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.154 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 90f7bb14-f463-4f98-92fc-22c2a06a12cd / tap3ca6fe41-62 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.154 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fac9c56-5223-4ed4-b531-dac3ed4d1824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.145517', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bc970e06-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '7848f29f63960819373dee2c0b0487e6e0b0f0478522e370d6d421b412261557'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.145517', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bc975636-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': 'ccefc7113ee2cc884ddf83f148965c94125b2c04371b1fb18a1637512be810a0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.145517', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': 
'27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bc97b95a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '252d786dda5e8cf188ec45e2637bb035d01a15d108c9513cc5e4a95df24aa252'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.145517', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bc9805cc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '809a5c1904e3d1b7e16b63bdab1ff2c019e24055a7580e3b1e237167e8a82016'}]}, 'timestamp': '2025-10-08 15:30:36.154550', '_unique_id': '486bf801d379464ba063b97861282ae2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.156 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.156 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.incoming.bytes volume: 24223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.157 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.incoming.bytes volume: 2252 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.157 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.incoming.bytes volume: 1825 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba59499b-ed88-4ce8-bf68-a7bcf3b6956d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.156561', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bc985ffe-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': 'e999e99b2278fbc1010dde57da3dca3e77e99ef395d376fc2193bd62d0d98f30'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 24223, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.156561', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bc986bb6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': 'b4ef9bffbe8cb1b7be736a199df3dd4eebfa3af44b003682cdc60433bdab03df'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2252, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.156561', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bc9877dc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '786433be847565e991346789ce762eb8f57c6e344d522fcd35125f958320e367'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1825, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.156561', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bc988272-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '904dbfc0a96eb46a4b78d3805d7bf2bb10de964c6f808722ba67b85a9c96143b'}]}, 'timestamp': '2025-10-08 15:30:36.157746', '_unique_id': '14b9d367a75742ea94aeee7ce21c5c7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.159 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.159 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.160 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.160 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c70aea51-689f-48ab-ad01-2920766512e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.159503', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bc98d2d6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': 'db51698bdc9cb15c7877ec55e965013492c7e3a1edd95e846e1cd94e70ad40ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.159503', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bc98dc90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '773a1962d18801c35e968a85b2678b8321ba8f1fd9befaed8224dfc4a83c6a06'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.159503', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': 
'27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bc98e87a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': 'ff05ce74d1d3550249fed0d624db8815ceac74a07692ac63af34ed46e6d87c25'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.159503', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bc98f34c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': 'f6bf08538e7c8edac37d4bb9cb39413c57990789debb0767cd6b6b5676a7d1f7'}]}, 'timestamp': '2025-10-08 15:30:36.160639', '_unique_id': '7f7fa190a3294361beb56f805808073b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>]
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 066ef28b-88ac-4f5c-acae-3458c3e19762: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/memory.usage volume: 248.5859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.162 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/memory.usage volume: 223.875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.163 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/memory.usage volume: 168.1875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ce7c24-8dc9-43fc-aa94-567f2a9003d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 248.5859375, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'timestamp': '2025-10-08T15:30:36.162422', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'bc994c52-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.773787458, 'message_signature': '08b7fa256c6e76fe58376a13b9b5d231154c49df977bdb40f4158e77b7aae586'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 223.875, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 
'27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'timestamp': '2025-10-08T15:30:36.162422', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'bc995710-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.788500727, 'message_signature': '43af3ba5bae348689f7ae198921a25751dbb039ab906b02a1d046d87280bbc40'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 168.1875, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'timestamp': '2025-10-08T15:30:36.162422', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'bc99621e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.806068509, 'message_signature': '14926a5d53dab87346cafe2f878a4334e3511135c5d50f8053fe47ef57897138'}]}, 'timestamp': '2025-10-08 15:30:36.163459', '_unique_id': '87b6185e5edc45ceb303a4591db42e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.182 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.183 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.198 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.requests volume: 729 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.198 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.213 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.requests volume: 709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.213 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.233 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.requests volume: 116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.233 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db372c3b-f316-4f13-84e3-3aca6a2a573c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc9c6130-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '6d5169bcd0b43a49a68a22a6a9395616d04aa2961664351673ceea72d22ca27a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc9c6f90-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'd9234284b1569f161cb39648e9b23fe0a0747caa32cb014c3eae5c8d34598bb1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 729, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bc9ec628-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '78577ad61147b8f337756d6157d58bd2c86a92b68796c737e81d355f974696d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bc9ecf56-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '26baad6e42b32c244f3678cee7b1912a0b558f88552fbb760aff34e627f8048d'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 709, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca1050a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': 'af791f5176e9b08e00ebdad6e1ef38cf706fbe98f6ef18efaf04d48134bb8624'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca1107c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '638904cc1ba169e735a311640d9246befdade89f060fd75352eef1d28caa9ba2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 116, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca41290-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': 'bb8cc50b27ddff7b244bcaf9830e3af8c9c2d9bf9d95a4200ed8eec9ed4688bb'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.164867', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca4203c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': 'dbe448232bb38ec3c59091fc92d16d4b7709a9ecd1083d8c7c3930801a11c28d'}]}, 'timestamp': '2025-10-08 15:30:36.233919', '_unique_id': '742e1382f2d54252bcb0a59c4c1b9e0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.236 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.236 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.237 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.237 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '252f7b7e-305c-47ec-9d5c-ce053b352217', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.236465', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca492ec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '8e27146fcb5c6e68374ac27b6a4a87f6bc9c61b85088a4e0151e834eced3f90e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.236465', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca4a26e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': 'b471424936aa49f4e99ceaffbb39391733bf06ad65b1ffc7c0122941c23fa253'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.236465', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': 
'27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca4b128-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': 'ac5c2e8f1562ffbd13bf3158f35aa2a8b4876eed51e76f88b6331e5fb2aef111'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.236465', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca4bd12-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '18e2b6dfad753212b5a69ae5fae4019125f356529bf2030c63872cdf323ebd3c'}]}, 'timestamp': '2025-10-08 15:30:36.237925', '_unique_id': 'b37895222e0e4299a4987b6682cc8e63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.240 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.240 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.240 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.241 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e37a0440-de53-4f95-aa0d-22b8524676c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.240235', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca5250e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': 'c44992116afb3e2e663f57455c46d668eca150cea2ed0e8e9104a12fff567622'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.240235', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca53148-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '44577fe6b0c17475ccb7c48243676557b4237cf75d9f7fa95fe773d54e1a18ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.240235', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca53cba-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '9b528d7449031830d0e502aab2fef77b1b5c0fb7a30dedf14db62095988d6b51'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.240235', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca5493a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': 'fefdc4f94ae9230f8168a7077bc7ed701546162f8e546037adc496defb33fa97'}]}, 'timestamp': '2025-10-08 15:30:36.241482', '_unique_id': '764f74c94a3f457fa8f5238e094142d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.243 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.243 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_bw_limit_tenant_network-1685300098>, <NovaLikeServer: server-tempest-MultiVlanTransparencyTest-1225694051-0>, <NovaLikeServer: tempest-test_multicast_east_west-1699367735>, <NovaLikeServer: tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless>]
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.243 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.244 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.incoming.packets volume: 139 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.244 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.244 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a541ed98-9d2a-4359-9536-8018ea7bfcc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.243673', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca5ab14-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '90e032d517d9e20525c50968e61e20be40b1fa753554578dc7d4a4edac4ffa78'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 139, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.243673', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca5b7f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': 'd3bfba1fd3649e5ac2f4fcdf76b3b51cccdcd575d4c3665b630ff51b1721187d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.243673', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 
'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca5c36a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '9d4753debb72c92ebcb1f502418899bb2c6572a086d55b7706018492500728ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.243673', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca5ce8c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '1a72c4a3e329a90a49c9617f40729f146e5c137c22955195cba91d9c1f7db16b'}]}, 'timestamp': '2025-10-08 15:30:36.244897', '_unique_id': '527dbc832588415fa8e31b569b749375'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.246 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.246 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.outgoing.bytes volume: 36817 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.247 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.outgoing.bytes volume: 3852 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.247 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.outgoing.bytes volume: 2260 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41adb6e8-7f80-4b19-b021-18f788d32f1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.246576', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca61c7a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '7a6a20ca2e9668b960427e2f5e4e0815f77839f9f38ed263a0f9eeadcd4cd711'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 36817, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.246576', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca628d2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '7f21e9a2ceb119959cb02227ed9c95b9f4a533886f1a3c5c7fe99440bacfe8f8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3852, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.246576', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca635b6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '2d9b3978f146d50c311dabcf77d1ed91a1633a623905421c29faa78b9018b06b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2260, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.246576', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca64100-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '643a432c50128748cee578b05e7ab1df001f330e5194caec37bec9afdb4f365e'}]}, 'timestamp': '2025-10-08 15:30:36.247826', '_unique_id': 'be37bb7838984c74a22162e6c02c7c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.249 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.249 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.250 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.latency volume: 10877476939 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.250 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.250 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.latency volume: 61074509363 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.250 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.251 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.latency volume: 659575488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.251 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14370089-319d-4518-9dd8-cc9311a736c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca68d40-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '111517341b3dcf28d96a65ae1051feb00860d616cbac3c9bdfe0a571831977b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca69952-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'c0344bae6ba748b684383a57351cdce48eb93a1145da7028eb7de8b813d0b14c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10877476939, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca6a546-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '8c0d762eb85d64a622ab8474537385525a11755eefbc3024510cfa83713a397f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca6afdc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '466d37a2e48037f9a9cedd42b67117e6b99e927240e517bbec93fe70f824ac12'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 61074509363, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca6ba7c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '65dd9ad84fca5075924ca4a727588714b9ea414f76de2c73d84bc8b63d13e279'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'sw
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 1-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca6c526-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '9f602a634483acca31ad8ec4f5752d95ea529e8deb163da7f29a35e85eb70f4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 659575488, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca6d0f2-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '4b7113eb89a5a22a19b3432bd3b487d3cd73dcb443094612f578a312bfbba9ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 
'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.249466', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca6db88-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '50e03d60d9dd62fc97bdb27b952450e9987f79549e55cf11bd9cdd63ccb78238'}]}, 'timestamp': '2025-10-08 15:30:36.251769', '_unique_id': '64a6f92e277040b591728b981e3ebd07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.253 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.254 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.254 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.bytes volume: 135907328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.254 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.255 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.bytes volume: 135500800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.255 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.255 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.bytes volume: 17129472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.255 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8674a117-60b3-456e-b877-fb8b22e32c93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca73c4a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'ca7e94048cda78cbab74c0481276134c8f830fd5c32ac477a3fa4db8038099bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca7488e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '983330cc968bac4de31f2f5ab66bf7bc2757bb3939883b831956e8f1569be84b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135907328, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca7536a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '831782c1a5e75529747c33de267ef57e34b5ca2db6665599d1b18c9966c46738'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca75e14-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '9850ad10dcd026b0804c64dfc2dbe17e47655adecb0f2b99dec136a710f70a28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 
'cumulative', 'counter_unit': 'B', 'counter_volume': 135500800, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca76aee-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '1f89db78e71cb0196e536feba9dfc9be4a7a47421547e7a2de72de049a2b0e41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'ac
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 11'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca7758e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '645e6647ea62dca7015bcf42a657382e801545e7ed3ee33d48e3ba6b7271185e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 17129472, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca77ffc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': 'dc51ae89a2ed5f0acf434f5c154ff280f41398d7d37352bad07e311486eabdc5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 
'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.253892', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca78baa-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': 'abf8d924c0ebc2889ca66bdcddf4ebabc811e7cc8d838eb24827f6b00de9861c'}]}, 'timestamp': '2025-10-08 15:30:36.256283', '_unique_id': 'fb22a606304d42bd99ba1fa1446cead5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.258 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.258 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.259 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.requests volume: 11668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.259 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.259 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.requests volume: 11534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.259 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.259 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.requests volume: 10144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '250297a3-fb83-49b5-bbbc-e784b117d393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 181, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca7ec62-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'e974dd5b9418459cda237d2c28bb3430e640b3ba34044f5cc4ab4cdae08693d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca7f856-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '0f16d23ad3c989175c69b12cc832d421a2d5522b403ca5d1a3ea4a56932c9b33'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11668, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca800ee-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '73bade81dbcbaffa6862d884c22749cb535f720a40c89e9acbaf8c485484e64d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca809fe-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '3511d8d5bb3291d6ba6cc7a132c1aae493d9e46ed005016e2dcaecc5e58426b8'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11534, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca81264-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': 'ced2dee98ac3ec8903d413e94d28f9226442c478394145224bd5e8670e144c3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk':
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: e': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca81bec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '4b0eeb0ca9051517c3a815ee3fbe0b4b95abef28046e96d04690f9a3a0caa633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 10144, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca823bc-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '8f8c665495d1698df0a52efd7b4fd7ac45704a071646de76359d8c22de453437'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.258367', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca82d80-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '532817f1807dc7991e6f0a2fd745a0ce4a72571e887fa212d50401f343448cd2'}]}, 'timestamp': '2025-10-08 15:30:36.260384', '_unique_id': '0ca1dd292750497f9405836bd1cf6206'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.261 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.261 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.262 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.262 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.262 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.262 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.263 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.263 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ad37ff3-1fe4-4214-8168-1e91f13919f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca866ba-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': '573c4a4890e1d2e078ec2f4fc7baa634f798d757acf056bbf3330daf8d224ea8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca870a6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': '82b17114967b4d24862f297bf5a9cb54c2d4eb7073e18cf85099c43a4cdeddf0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca87bf0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': '7e357620e6655384355026e3ac009b2378d4c217340cdacc399e62b3642397a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca885e6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': '17a30ea286e7078b4105c4bba7eef30c48b6d6d8d88988f5698fb369dd24d11c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 
'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca89086-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': '2bbc112869863667545542e067210201e6af5a68c945ef4adac014864c7a9900'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 
'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'run
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca89bd0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': 'b17567381f7aed4c2f9db3393924b69bc6eb479bf879c7df739fb020a8dc6eb1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca8a5f8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': '1d6578c825461772b57f96f9abdd17b815feaf43428dc84d45dc5ac98b6e2c68'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 
'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.261611', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca8ad8c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': '763ca1c826fb625d996f631371a8bdec2a350778ca21b51bb06e99e725928290'}]}, 'timestamp': '2025-10-08 15:30:36.263658', '_unique_id': 'e14bc942e14244a6babc501f9e409249'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.265 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.265 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.265 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.265 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.265 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.266 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.266 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.allocation volume: 19079168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.266 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.allocation volume: 491520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b666570-25eb-406a-9966-eb8becdc8cd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca8eb62-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': 'f5b766fda20f0a1d192b524f06ec6dcc7c4c37f543035f5a0f6f39e67bbc0eba'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca8f31e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.810872762, 'message_signature': 'e200092ae1d4b7aaae04373ead45a7a566f0e44fb7223a1a82cb643d78d4e869'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca8fc24-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': '96198b2948f8b15605ff6794f5572c00ff7700923df92e6c994aae4bde99e065'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca9046c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.824462976, 'message_signature': '90823b767530598e07aaa55f0bd24de63e9c3baf2c06c60143d99682ff1c22d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 
'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca90d4a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': '03700713b82ebe1194312b1cf1b2f316dcaa91f8c6dfa9752afbcf41dd2fec8d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 
'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': '
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: : '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca915ce-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.84054654, 'message_signature': '976b1f1dc63ecfc557e8db162f9ed79429f2928382d2290849dbaee797149815'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 19079168, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bca91de4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': '24ff7b66879a3c40ff9c1c868cba2e5d05c51b6631f6e32fe506e5fe77164031'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
491520, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.265015', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bca925be-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.853093121, 'message_signature': 'b9aa56d3e4cfe4e705cc206c3f3d1fbd5c17c7cbf103b6b429a69025fb1ba407'}]}, 'timestamp': '2025-10-08 15:30:36.266769', '_unique_id': '9ca68b214f86402d82804efefbb032cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.268 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.268 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.268 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.269 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f3ee1c-1da3-4a95-93aa-668819d53f6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.268291', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca96c22-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '5074ff6fc2f4eeb0de35269909871d3c49261f7a03c184ea4f1cf1e5f6429c86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.268291', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca975be-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '861e6af6407c5d63753ad7d4a4768811846a11f758c51cb027976ee777691547'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.268291', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': 
'27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca9802c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': 'd8cfff4bfc1d0598c5c435567105eaf1c9c04b82f698592aa48b280b38157269'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.268291', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca98b1c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '38cd94c2ac9ec12eaec18df8c6425b6341af803d188244c99f12fe19e85d4364'}]}, 'timestamp': '2025-10-08 15:30:36.269375', '_unique_id': '2e76495c97d9433e95596d133aeaec08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.270 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.271 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.outgoing.packets volume: 191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.271 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.outgoing.packets volume: 38 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.271 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.outgoing.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4060050-0e7a-43da-b026-f00c7d2d88b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.270922', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bca9d3d8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '1bdb454d21b7433eeb7a8fdb89df31d026fa7911b76a64130165520f7a42ee6c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 191, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.270922', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bca9dfea-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '3a12ffd6359d572cb2e8f0975506068194fd14dca726bcd6fadcc041cac96b8a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 38, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.270922', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 
'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bca9eb8e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '858a448e5d9340ccf40af9bf104d0bd39bd299e1183366cb8bf577fcada0641a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.270922', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bca9f67e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': '27acd05873afd25bb62f63e794da44ce8d17f026af35a0c3b3e792a704824444'}]}, 'timestamp': '2025-10-08 15:30:36.272144', '_unique_id': '6e5f6dea8bc744b2b0b042e64d67913b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.273 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.273 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.273 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.274 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecf78e51-9832-43e6-a760-8f91deb4744e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:30:36.273522', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': 'bcaa3742-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.868502484, 'message_signature': '0b01b5ed710913be6ccbd69ca5da464e92db60a494890f0e04c1fc30a207a46d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'instance-00000028-a7cf9795-ac6e-4d38-8500-755c39931e14-tap66f32729-1d', 'timestamp': '2025-10-08T15:30:36.273522', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'tap66f32729-1d', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:22:ff:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66f32729-1d'}, 'message_id': 'bcaa40ca-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.871148148, 'message_signature': '92aeaf661eedd9fab4c3942edc4b6649c8166ac27426509dbc33d58af0801bbf'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000027-27fa9a5a-04a0-4d80-b75d-564df1c974e8-tap23f6a943-ce', 'timestamp': '2025-10-08T15:30:36.273522', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'tap23f6a943-ce', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:38:f6:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap23f6a943-ce'}, 'message_id': 'bcaa496c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.872961526, 'message_signature': '68061ce558266147cf8477f5b68ee67701815a5189f183a2702324cc353a8b26'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': 'instance-0000002a-90f7bb14-f463-4f98-92fc-22c2a06a12cd-tap3ca6fe41-62', 'timestamp': '2025-10-08T15:30:36.273522', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'tap3ca6fe41-62', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:03:33:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3ca6fe41-62'}, 'message_id': 'bcaa59de-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.875524357, 'message_signature': 'f594144aef69b6a47b150b8005452ae08a6c1abd1f50717026fac5d9f42bf44e'}]}, 'timestamp': '2025-10-08 15:30:36.274632', '_unique_id': 'a329bfdf4e18496eaa5d7058ec91d894'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.275 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 105974450 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.276 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 3667798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.276 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.latency volume: 8771580992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.276 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.latency volume: 104762157 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.276 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.latency volume: 9206746791 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.277 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.latency volume: 126414636 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.277 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.latency volume: 6665729951 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.277 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.latency volume: 118003773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '083854a7-d42f-4935-8122-005a84650057', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 105974450, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcaa95e8-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'afff24d5889d2a7fda2b1e5bd5500705b242bc04b0de8f33a7a2b1c68605e333'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3667798, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcaaa0ec-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '140649de187d4a4263f41cb5af7b1060c442a2486fcc844a041ddcee55911706'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8771580992, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcaaab32-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '23782f40b322f0dfeb1fa1f9c459fb2fab91ae2241fa951cf4f64324c84879d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 104762157, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcaab55a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': '0c3cf94cc4731c5dcc528d0f89420055e7515ce60fd49ed0248ecf88cc138c62'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9206746791, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcaabef6-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '54cdaff7f7d525ee1779945951416303bbcf9aa1e22c5ef4061a1ee636600751'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 126414636, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 1
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: : {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcaac78e-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': 'd38577f8e247cc44980eccbb4e14674a4ee3a3c58e4a726c8d39a8ad393cba1c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6665729951, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcaad0d0-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '82202a264e00aebf9f9b7ab6658e404124039f4011541aded1f97318115bb049'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 118003773, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.275915', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcaada3a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '9033625fc4c4f9cf8f1104cc44a6c69bf744d9912262a90ed9cead0682e1214b'}]}, 'timestamp': '2025-10-08 15:30:36.277993', '_unique_id': '2830c47abfb04d39aa2496914665e09c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.279 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 2920960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.279 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.280 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.bytes volume: 329291264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.280 12 DEBUG ceilometer.compute.pollsters [-] a7cf9795-ac6e-4d38-8500-755c39931e14/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.280 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.bytes volume: 326993408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.280 12 DEBUG ceilometer.compute.pollsters [-] 27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.281 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.bytes volume: 241239040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.281 12 DEBUG ceilometer.compute.pollsters [-] 90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cb6af3b-eb93-4ae4-8275-651589f41e00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2920960, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcab2b20-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': 'aa42473561785c7c48bbe368725f96b08e79d87b78c7ad17aa05f0031b788275'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcab3340-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.887839461, 'message_signature': '9ecc5012dd3bc73968a2719a6bac3e0fe8d72a37861242edc8525b9b829aadb5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329291264, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-vda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcab3c32-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': 'a837a1c28efd3bfb00585ec78fcf42354449521b4c2a74abfc61cf40db895301'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c852472017334735b37425ffa8591384', 'user_name': None, 'project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'project_name': None, 'resource_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14-sda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'server-tempest-MultiVlanTransparencyTest-1225694051-0', 'name': 'instance-00000028', 'instance_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'instance_type': 'custom_neutron_guest', 'host': '11c1b74997fa0a75f1e47f2c46ab068529ced9fab42b7dfd88d821ba', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcab454c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.906452196, 'message_signature': 'f1a6951c212bf03531a57e6f8f156d1534de091d535a6ea7a80bcac1cb99c0e1'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326993408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-vda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcab4cf4-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '575306c592f623ec564cb9c18e140adbf21727d6fe78a695d61ec4d36158197c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8-sda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-test_multicast_east_west-1699367735', 'name': 'instance-00000027', 'instance_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 11-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcab546a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.921937491, 'message_signature': '8c728f8516a43ba25667e6961905434437580c68aaab812a186e204bd14aeb0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 241239040, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-vda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'bcab5c3a-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': 'd3a2c489ba67d0c05aaffe20ef784ab29a6e7eccd7847809c564dd1c1784b869'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 
'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '048380879c82439f920961e33c8fc34c', 'user_name': None, 'project_id': '93e68db931464f0282500c84d398d8af', 'project_name': None, 'resource_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd-sda', 'timestamp': '2025-10-08T15:30:36.279766', 'resource_metadata': {'display_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'name': 'instance-0000002a', 'instance_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'instance_type': 'custom_neutron_guest', 'host': '9cb2d0e97707b20a40f757b6d876f43d2db36c119fbc5036c93a1064', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'bcab634c-a45b-11f0-9274-fa163ef67048', 'monotonic_time': 4339.936784015, 'message_signature': '84af3b36dfa175f55f2931430784bb79585f52f9862f016856683edd59168bff'}]}, 'timestamp': '2025-10-08 15:30:36.281416', '_unique_id': '5c9ff433b96e4decb92b19cf03cb5b39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.235 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.252 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.257 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.260 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.264 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.267 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.278 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:30:36.282 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:30:36 np0005476733 nova_compute[192580]: 2025-10-08 15:30:36.739 2 INFO nova.compute.manager [None req-a4595e7b-e89d-4b8e-84e4-404d8348bbab d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:30:36 np0005476733 nova_compute[192580]: 2025-10-08 15:30:36.747 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:36 np0005476733 nova_compute[192580]: 2025-10-08 15:30:36.841 2 INFO nova.compute.manager [None req-a73b30af-f50c-4720-b939-774c408b1666 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:36 np0005476733 nova_compute[192580]: 2025-10-08 15:30:36.845 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.538 2 DEBUG nova.compute.manager [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.539 2 DEBUG oslo_concurrency.lockutils [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.540 2 DEBUG oslo_concurrency.lockutils [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.541 2 DEBUG oslo_concurrency.lockutils [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.541 2 DEBUG nova.compute.manager [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:30:37 np0005476733 nova_compute[192580]: 2025-10-08 15:30:37.542 2 WARNING nova.compute.manager [req-8847bd2c-c7ed-46f0-b92c-d73a7d9fd596 req-9ccf7376-3664-41b6-83ed-5303124303cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received unexpected event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e for instance with vm_state active and task_state None.#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.063 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.063 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.186 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:30:38 np0005476733 podman[231738]: 2025-10-08 15:30:38.269463019 +0000 UTC m=+0.078873721 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:30:38 np0005476733 podman[231737]: 2025-10-08 15:30:38.301742521 +0000 UTC m=+0.117327850 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.334 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.335 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.340 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.341 2 INFO nova.compute.claims [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.532 2 DEBUG nova.compute.provider_tree [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:30:38 np0005476733 nova_compute[192580]: 2025-10-08 15:30:38.593 2 DEBUG nova.scheduler.client.report [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.036 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.037 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.174 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.175 2 DEBUG nova.network.neutron [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.270 2 INFO nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.315 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.445 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.446 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.447 2 INFO nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Creating image(s)#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.448 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.448 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.449 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.463 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.523 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.525 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.525 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.536 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.589 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.590 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.629 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.630 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.631 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.698 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.700 2 DEBUG nova.virt.disk.api [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Checking if we can resize image /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.701 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.759 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.761 2 DEBUG nova.virt.disk.api [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Cannot resize image /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.762 2 DEBUG nova.objects.instance [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'migration_context' on Instance uuid 164d69c5-58d2-413e-9b1f-907b5cc12d9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.785 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.786 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Ensure instance console log exists: /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.786 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.787 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.787 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:39 np0005476733 nova_compute[192580]: 2025-10-08 15:30:39.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:40 np0005476733 nova_compute[192580]: 2025-10-08 15:30:40.708 2 DEBUG nova.policy [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b35a1072024b4c6598970391dd8abb59', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:30:41 np0005476733 nova_compute[192580]: 2025-10-08 15:30:41.911 2 INFO nova.compute.manager [None req-22a40d1b-8dae-44f9-811c-ee1678d59001 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:30:41 np0005476733 nova_compute[192580]: 2025-10-08 15:30:41.921 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:42 np0005476733 nova_compute[192580]: 2025-10-08 15:30:42.046 2 INFO nova.compute.manager [None req-488a362f-297a-4c93-849c-480bb92a11b2 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Get console output#033[00m
Oct  8 11:30:42 np0005476733 nova_compute[192580]: 2025-10-08 15:30:42.051 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:42 np0005476733 nova_compute[192580]: 2025-10-08 15:30:42.053 2 INFO nova.virt.libvirt.driver [None req-488a362f-297a-4c93-849c-480bb92a11b2 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Truncated console log returned, 3625 bytes ignored#033[00m
Oct  8 11:30:42 np0005476733 nova_compute[192580]: 2025-10-08 15:30:42.634 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.077 2 DEBUG nova.network.neutron [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Successfully updated port: c864df57-e86e-439c-88f6-198c1e0cf48c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.102 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.103 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquired lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.103 2 DEBUG nova.network.neutron [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.334 2 DEBUG nova.network.neutron [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.849 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.850 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.850 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.851 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:30:43 np0005476733 nova_compute[192580]: 2025-10-08 15:30:43.988 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.055 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.056 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.132 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.140 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.207 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.208 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.264 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.270 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.335 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.337 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.417 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.424 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.518 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.519 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.573 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.628 2 DEBUG nova.compute.manager [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-changed-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.629 2 DEBUG nova.compute.manager [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Refreshing instance network info cache due to event network-changed-c864df57-e86e-439c-88f6-198c1e0cf48c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.629 2 DEBUG oslo_concurrency.lockutils [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.778 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.780 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11218MB free_disk=110.9709358215332GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.780 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.781 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.884 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 27fa9a5a-04a0-4d80-b75d-564df1c974e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.884 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance a7cf9795-ac6e-4d38-8500-755c39931e14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.885 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 90f7bb14-f463-4f98-92fc-22c2a06a12cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.885 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 066ef28b-88ac-4f5c-acae-3458c3e19762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.885 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 164d69c5-58d2-413e-9b1f-907b5cc12d9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.885 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=4736MB phys_disk=119GB used_disk=41GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:30:44 np0005476733 nova_compute[192580]: 2025-10-08 15:30:44.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.043 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.061 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.089 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.090 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:45 np0005476733 podman[231838]: 2025-10-08 15:30:45.229214448 +0000 UTC m=+0.061524007 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.769 2 DEBUG nova.network.neutron [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updating instance_info_cache with network_info: [{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.809 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Releasing lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.809 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Instance network_info: |[{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.810 2 DEBUG oslo_concurrency.lockutils [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.810 2 DEBUG nova.network.neutron [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Refreshing network info cache for port c864df57-e86e-439c-88f6-198c1e0cf48c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.813 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Start _get_guest_xml network_info=[{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.816 2 WARNING nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.820 2 DEBUG nova.virt.libvirt.host [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.821 2 DEBUG nova.virt.libvirt.host [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.823 2 DEBUG nova.virt.libvirt.host [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.824 2 DEBUG nova.virt.libvirt.host [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.824 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.825 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.825 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.826 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.826 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.827 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.827 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.827 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.827 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.828 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.828 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.828 2 DEBUG nova.virt.hardware [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
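The topology search logged above (flavor and image impose no constraints, so limits default to 65536 per dimension, the preference is 0:0:0, and the single valid factorization of 1 vCPU is sockets=1, cores=1, threads=1) can be sketched with simplified logic. This is an illustrative reimplementation, not Nova's `_get_possible_cpu_topologies`; the function name and limit handling are assumptions.

```python
def possible_topologies(vcpus, limits=(65536, 65536, 65536)):
    """Enumerate (sockets, cores, threads) with sockets*cores*threads == vcpus,
    each dimension capped by the corresponding limit. Illustrative only; Nova's
    real code also applies flavor/image preferences when sorting the results."""
    max_sockets, max_cores, max_threads = limits
    results = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        remainder = vcpus // sockets
        for cores in range(1, min(remainder, max_cores) + 1):
            if remainder % cores:
                continue
            threads = remainder // cores
            if threads <= max_threads:
                results.append((sockets, cores, threads))
    return results

# For the 1-vCPU m1.nano flavor above there is exactly one candidate.
print(possible_topologies(1))  # -> [(1, 1, 1)]
```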
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.832 2 DEBUG nova.virt.libvirt.vif [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-924816690',display_name='tempest-server-test-924816690',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-924816690',id=47,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-h6o9b07j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:30:39Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=164d69c5-58d2-413e-9b1f-907b5cc12d9b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.832 2 DEBUG nova.network.os_vif_util [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.833 2 DEBUG nova.network.os_vif_util [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
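The conversion logged above takes the Nova VIF dict and produces a `VIFOpenVSwitch` object carrying only the fields os-vif needs for plugging. A minimal sketch of that mapping, using a trimmed copy of the payload from the log; the dataclass and function names are hypothetical, not os-vif's `nova_to_osvif_vif`.

```python
import json
from dataclasses import dataclass

# Trimmed copy of the VIF payload logged above (network subtree omitted).
VIF_JSON = """{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c",
 "address": "fa:16:3e:11:7d:97",
 "type": "ovs",
 "details": {"port_filter": true, "bridge_name": "br-int"},
 "devname": "tapc864df57-e8",
 "active": false,
 "preserve_on_delete": true}"""

@dataclass
class VIFOpenVSwitchSketch:
    # Mirrors the fields visible in the converted object in the log; illustrative.
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool
    active: bool
    preserve_on_delete: bool

def nova_to_sketch(vif: dict) -> VIFOpenVSwitchSketch:
    # port_filter in the binding details becomes has_traffic_filtering,
    # and devname becomes the tap interface name plugged into br-int.
    return VIFOpenVSwitchSketch(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["details"]["bridge_name"],
        vif_name=vif["devname"],
        has_traffic_filtering=vif["details"].get("port_filter", False),
        active=vif["active"],
        preserve_on_delete=vif["preserve_on_delete"],
    )

print(nova_to_sketch(json.loads(VIF_JSON)))
```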
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.834 2 DEBUG nova.objects.instance [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 164d69c5-58d2-413e-9b1f-907b5cc12d9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.849 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <uuid>164d69c5-58d2-413e-9b1f-907b5cc12d9b</uuid>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <name>instance-0000002f</name>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-924816690</nova:name>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:30:45</nova:creationTime>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:user uuid="b35a1072024b4c6598970391dd8abb59">tempest-PortSecurityTest-26261139-project-member</nova:user>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:project uuid="c98d2b93a2394e67a5e6525145c5bca5">tempest-PortSecurityTest-26261139</nova:project>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        <nova:port uuid="c864df57-e86e-439c-88f6-198c1e0cf48c">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="serial">164d69c5-58d2-413e-9b1f-907b5cc12d9b</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="uuid">164d69c5-58d2-413e-9b1f-907b5cc12d9b</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.config"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:11:7d:97"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <target dev="tapc864df57-e8"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/console.log" append="off"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:30:45 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:30:45 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:30:45 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:30:45 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
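Once the syslog prefixes are stripped from the lines above, the generated domain definition is ordinary libvirt XML and can be inspected with stdlib tooling. A sketch over an abbreviated copy of that XML (the namespaced `<nova:instance>` metadata and most devices are omitted here for brevity):

```python
import xml.etree.ElementTree as ET

# Abbreviated copy of the domain XML logged above.
DOMAIN_XML = """<domain type="kvm">
  <uuid>164d69c5-58d2-413e-9b1f-907b5cc12d9b</uuid>
  <name>instance-0000002f</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="file" device="disk">
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="file" device="cdrom">
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
summary = {
    "uuid": root.findtext("uuid"),
    # libvirt interprets a unitless <memory> as KiB: 131072 KiB = 128 MiB,
    # matching the m1.nano flavor's memory_mb=128.
    "memory_kib": int(root.findtext("memory")),
    "vcpus": int(root.findtext("vcpu")),
    "disks": [d.find("target").get("dev") for d in root.iter("disk")],
}
print(summary)
```

The `sda` cdrom is the config-drive image (`disk.config`), which is why the later BDM-metadata lookup for it finds no block-device mapping.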
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.855 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Preparing to wait for external event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.856 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.856 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.856 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.857 2 DEBUG nova.virt.libvirt.vif [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-924816690',display_name='tempest-server-test-924816690',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-924816690',id=47,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-h6o9b07j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:30:39Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=164d69c5-58d2-413e-9b1f-907b5cc12d9b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.858 2 DEBUG nova.network.os_vif_util [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.858 2 DEBUG nova.network.os_vif_util [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.859 2 DEBUG os_vif [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc864df57-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc864df57-e8, col_values=(('external_ids', {'iface-id': 'c864df57-e86e-439c-88f6-198c1e0cf48c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:7d:97', 'vm-uuid': '164d69c5-58d2-413e-9b1f-907b5cc12d9b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
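The `DbSetCommand` above writes the OVS `Interface.external_ids` that tie the tap device back to the Neutron port and the instance; OVN matches `iface-id` against its logical switch port to bind the port. A sketch of the equivalent mapping (the helper name is hypothetical; os-vif builds this dict internally before issuing the OVSDB transaction):

```python
def build_external_ids(port_id: str, mac: str, instance_uuid: str) -> dict:
    """Same keys the logged DbSetCommand sets on the Interface record."""
    return {
        "iface-id": port_id,        # Neutron port UUID; OVN binds on this
        "iface-status": "active",
        "attached-mac": mac,        # guest-side MAC of the VIF
        "vm-uuid": instance_uuid,   # Nova instance UUID, for operator tooling
    }

ids = build_external_ids(
    "c864df57-e86e-439c-88f6-198c1e0cf48c",
    "fa:16:3e:11:7d:97",
    "164d69c5-58d2-413e-9b1f-907b5cc12d9b",
)
print(ids["iface-id"])
```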
Oct  8 11:30:45 np0005476733 NetworkManager[51699]: <info>  [1759937445.8660] manager: (tapc864df57-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.872 2 INFO os_vif [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8')#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.944 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.944 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.944 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No VIF found with MAC fa:16:3e:11:7d:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:30:45 np0005476733 nova_compute[192580]: 2025-10-08 15:30:45.945 2 INFO nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Using config drive#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.351 2 INFO nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Creating config drive at /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.config#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.358 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7ou0jx0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.483 2 DEBUG oslo_concurrency.processutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq7ou0jx0" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.5695] manager: (tapc864df57-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Oct  8 11:30:46 np0005476733 kernel: tapc864df57-e8: entered promiscuous mode
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00366|binding|INFO|Claiming lport c864df57-e86e-439c-88f6-198c1e0cf48c for this chassis.
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00367|binding|INFO|c864df57-e86e-439c-88f6-198c1e0cf48c: Claiming fa:16:3e:11:7d:97 10.100.0.8
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00368|binding|INFO|c864df57-e86e-439c-88f6-198c1e0cf48c: Claiming unknown
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00369|binding|INFO|Setting lport c864df57-e86e-439c-88f6-198c1e0cf48c ovn-installed in OVS
Oct  8 11:30:46 np0005476733 systemd-udevd[231876]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00370|binding|INFO|Setting lport c864df57-e86e-439c-88f6-198c1e0cf48c up in Southbound
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.597 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7d:97 10.100.0.8', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '164d69c5-58d2-413e-9b1f-907b5cc12d9b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bec80dd1-0f75-4955-b64e-4b7639499c68, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c864df57-e86e-439c-88f6-198c1e0cf48c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.602 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c864df57-e86e-439c-88f6-198c1e0cf48c in datapath 05e23ee7-84d7-47d9-8de9-b53576f6a373 bound to our chassis#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.6122] device (tapc864df57-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.612 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05e23ee7-84d7-47d9-8de9-b53576f6a373#033[00m
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.6133] device (tapc864df57-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:30:46 np0005476733 systemd-machined[152624]: New machine qemu-28-instance-0000002f.
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.636 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8093edd8-f6e1-4b98-b7f8-ce62008d4390]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.637 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap05e23ee7-81 in ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.639 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap05e23ee7-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.639 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[29a239c1-2e7b-4624-a41b-ddf2475c7ce3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.640 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3c621b-c533-49ba-bbbf-18d26d64c65c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 systemd[1]: Started Virtual Machine qemu-28-instance-0000002f.
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.654 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad40618-4ca1-419e-804f-ab69e64cd254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.681 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf2ea79-6cc5-4e59-ab8f-c89b651eb620]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.719 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9502b6-aefa-4f96-89a1-2c7482ffe0e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 systemd-udevd[231880]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.7266] manager: (tap05e23ee7-80): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.725 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[906184af-a848-46ca-9906-ea847c239289]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.769 2 DEBUG nova.compute.manager [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-changed-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.770 2 DEBUG nova.compute.manager [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Refreshing instance network info cache due to event network-changed-3ca6fe41-629a-4c92-9418-834869a48822. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.770 2 DEBUG oslo_concurrency.lockutils [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.770 2 DEBUG oslo_concurrency.lockutils [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.770 2 DEBUG nova.network.neutron [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Refreshing network info cache for port 3ca6fe41-629a-4c92-9418-834869a48822 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.778 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c3383ce2-3edf-4c25-9401-4aeaae9d01c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.782 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[025a83cd-c44f-445e-824e-bf9a2277e92e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.8075] device (tap05e23ee7-80): carrier: link connected
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.816 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae9049d-e685-4667-ba1a-9ae877dfe70f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.834 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[881a20cc-e383-4893-9321-928c4f8c6b0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05e23ee7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:53:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435047, 'reachable_time': 32760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231912, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.852 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c77331-67b2-40a9-91ea-70bcacf8d787]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:535b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435047, 'tstamp': 435047}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231913, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.871 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[70201b17-2123-4100-a7ee-4e9fd9110841]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05e23ee7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:53:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435047, 'reachable_time': 32760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231914, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.907 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a175383b-18bd-4513-8ead-8f539195373c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.966 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f06c4b63-7b90-4779-ba40-72dc2931df08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.968 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05e23ee7-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.968 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.969 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05e23ee7-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 NetworkManager[51699]: <info>  [1759937446.9717] manager: (tap05e23ee7-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Oct  8 11:30:46 np0005476733 kernel: tap05e23ee7-80: entered promiscuous mode
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.975 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05e23ee7-80, col_values=(('external_ids', {'iface-id': 'd2188afb-493c-4705-9ba9-87c4b983c343'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:46 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:46Z|00371|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 nova_compute[192580]: 2025-10-08 15:30:46.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.990 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/05e23ee7-84d7-47d9-8de9-b53576f6a373.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/05e23ee7-84d7-47d9-8de9-b53576f6a373.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.991 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[14f1249a-a6ae-41d6-8547-bfadbc628a6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.992 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-05e23ee7-84d7-47d9-8de9-b53576f6a373
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/05e23ee7-84d7-47d9-8de9-b53576f6a373.pid.haproxy
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 05e23ee7-84d7-47d9-8de9-b53576f6a373
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:30:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:46.992 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'env', 'PROCESS_TAG=haproxy-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/05e23ee7-84d7-47d9-8de9-b53576f6a373.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:30:47 np0005476733 nova_compute[192580]: 2025-10-08 15:30:47.090 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:47 np0005476733 nova_compute[192580]: 2025-10-08 15:30:47.206 2 INFO nova.compute.manager [None req-bab55168-0283-4be9-aad0-8d90daae70e7 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:30:47 np0005476733 podman[231943]: 2025-10-08 15:30:47.382971989 +0000 UTC m=+0.056435234 container create 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:30:47 np0005476733 systemd[1]: Started libpod-conmon-66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b.scope.
Oct  8 11:30:47 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:30:47 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c094283ad176a6c4b75ea8cbd548e98fc48c7a3c15e77e9ebc0be63d3ead5a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:30:47 np0005476733 podman[231943]: 2025-10-08 15:30:47.441372835 +0000 UTC m=+0.114836100 container init 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:30:47 np0005476733 podman[231943]: 2025-10-08 15:30:47.351700149 +0000 UTC m=+0.025163444 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:30:47 np0005476733 podman[231943]: 2025-10-08 15:30:47.449165523 +0000 UTC m=+0.122628768 container start 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:30:47 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [NOTICE]   (231983) : New worker (232002) forked
Oct  8 11:30:47 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [NOTICE]   (231983) : Loading success.
Oct  8 11:30:47 np0005476733 podman[231956]: 2025-10-08 15:30:47.501024081 +0000 UTC m=+0.082859109 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:30:47 np0005476733 podman[231959]: 2025-10-08 15:30:47.515949397 +0000 UTC m=+0.098537670 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:30:47 np0005476733 nova_compute[192580]: 2025-10-08 15:30:47.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.127 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.128 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.128 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.128 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.128 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.129 2 INFO nova.compute.manager [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Terminating instance#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.130 2 DEBUG nova.compute.manager [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:30:48 np0005476733 kernel: tap3ca6fe41-62 (unregistering): left promiscuous mode
Oct  8 11:30:48 np0005476733 NetworkManager[51699]: <info>  [1759937448.1599] device (tap3ca6fe41-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:48Z|00372|binding|INFO|Releasing lport 3ca6fe41-629a-4c92-9418-834869a48822 from this chassis (sb_readonly=0)
Oct  8 11:30:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:48Z|00373|binding|INFO|Setting lport 3ca6fe41-629a-4c92-9418-834869a48822 down in Southbound
Oct  8 11:30:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:30:48Z|00374|binding|INFO|Removing iface tap3ca6fe41-62 ovn-installed in OVS
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.180 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:33:a9 192.168.5.60 2001:5::f816:3eff:fe03:33a9'], port_security=['fa:16:3e:03:33:a9 192.168.5.60 2001:5::f816:3eff:fe03:33a9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:cidrs': '192.168.5.60/24 2001:5::f816:3eff:fe03:33a9/64', 'neutron:device_id': '90f7bb14-f463-4f98-92fc-22c2a06a12cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f872d065-dcdd-4abe-966e-984ec8347cf7', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless', 'neutron:project_id': '93e68db931464f0282500c84d398d8af', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee93d6be-59e3-41c0-a55f-8df79fb9da74', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5057352-eab1-4ec2-8137-06eaee60ec6e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3ca6fe41-629a-4c92-9418-834869a48822) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.181 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3ca6fe41-629a-4c92-9418-834869a48822 in datapath f872d065-dcdd-4abe-966e-984ec8347cf7 unbound from our chassis#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.185 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f872d065-dcdd-4abe-966e-984ec8347cf7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.186 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e38f0a94-28f1-4a13-91d8-6c7d140fbb5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.187 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7 namespace which is not needed anymore#033[00m
Oct  8 11:30:48 np0005476733 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Oct  8 11:30:48 np0005476733 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002a.scope: Consumed 42.571s CPU time.
Oct  8 11:30:48 np0005476733 systemd-machined[152624]: Machine qemu-26-instance-0000002a terminated.
Oct  8 11:30:48 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [NOTICE]   (231373) : haproxy version is 2.8.14-c23fe91
Oct  8 11:30:48 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [NOTICE]   (231373) : path to executable is /usr/sbin/haproxy
Oct  8 11:30:48 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [WARNING]  (231373) : Exiting Master process...
Oct  8 11:30:48 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [ALERT]    (231373) : Current worker (231375) exited with code 143 (Terminated)
Oct  8 11:30:48 np0005476733 neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7[231369]: [WARNING]  (231373) : All workers exited. Exiting... (0)
Oct  8 11:30:48 np0005476733 systemd[1]: libpod-614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75.scope: Deactivated successfully.
Oct  8 11:30:48 np0005476733 podman[232038]: 2025-10-08 15:30:48.320372378 +0000 UTC m=+0.042436767 container died 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:30:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-67894a9509764f65413bd23644ff07169a62b2f20093a30464796983d3188766-merged.mount: Deactivated successfully.
Oct  8 11:30:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75-userdata-shm.mount: Deactivated successfully.
Oct  8 11:30:48 np0005476733 podman[232038]: 2025-10-08 15:30:48.37273172 +0000 UTC m=+0.094796119 container cleanup 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 systemd[1]: libpod-conmon-614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75.scope: Deactivated successfully.
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.436 2 DEBUG nova.network.neutron [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updated VIF entry in instance network info cache for port c864df57-e86e-439c-88f6-198c1e0cf48c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.436 2 DEBUG nova.network.neutron [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updating instance_info_cache with network_info: [{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.458 2 DEBUG oslo_concurrency.lockutils [req-71e7d848-afe1-417f-b72a-616df8c082e8 req-c72d91cb-0e75-4e1e-9c71-d7a7e59f1c93 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.463 2 INFO nova.virt.libvirt.driver [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Instance destroyed successfully.#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.463 2 DEBUG nova.objects.instance [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lazy-loading 'resources' on Instance uuid 90f7bb14-f463-4f98-92fc-22c2a06a12cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.476 2 DEBUG nova.virt.libvirt.vif [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:29:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',display_name='tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-extradhcpoptionstest-1757752636-test-extra-dhcp-opts-ip',id=42,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAROHXDFBirKKfgv1/Q2k8TOz822D2j3GssXLkqqAYkfNmKCLTZPWHL9R3TttvPeVcQM9XeUfcVk0LUjV4/DUc229+mDzz6yKwrgz0g4olEc5cIgAsFC91SZyJ937u9BxA==',key_name='tempest-ExtraDhcpOptionsTest-1757752636',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93e68db931464f0282500c84d398d8af',ramdisk_id='',reservation_id='r-tdjre26z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ExtraDhcpOptionsTest-522093769',owner_user_name='tempest-ExtraDhcpOptionsTest-522093769-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:07Z,user_data=None,user_id='048380879c82439f920961e33c8fc34c',uuid=90f7bb14-f463-4f98-92fc-22c2a06a12cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": 
"tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.477 2 DEBUG nova.network.os_vif_util [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converting VIF {"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.478 2 DEBUG nova.network.os_vif_util [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.478 2 DEBUG os_vif [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.480 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ca6fe41-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 podman[232077]: 2025-10-08 15:30:48.483566192 +0000 UTC m=+0.039062189 container remove 614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.486 2 INFO os_vif [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:33:a9,bridge_name='br-int',has_traffic_filtering=True,id=3ca6fe41-629a-4c92-9418-834869a48822,network=Network(f872d065-dcdd-4abe-966e-984ec8347cf7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3ca6fe41-62')#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.487 2 INFO nova.virt.libvirt.driver [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Deleting instance files /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd_del#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.488 2 INFO nova.virt.libvirt.driver [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Deletion of /var/lib/nova/instances/90f7bb14-f463-4f98-92fc-22c2a06a12cd_del complete#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.489 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[271da8fa-c587-4791-960f-4411c05ded87]: (4, ('Wed Oct  8 03:30:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7 (614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75)\n614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75\nWed Oct  8 03:30:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7 (614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75)\n614f6d2102f50e73e404d7d23869ae6f128afeb3aea535a2a552e13a3aea6d75\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.490 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[760bfa57-c033-4814-90b4-338f74d2e0ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.491 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf872d065-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 kernel: tapf872d065-d0: left promiscuous mode
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.509 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7e96595b-6526-4d04-b26e-f0a4f388d0bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.529 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9f820d2f-1afe-4433-aa30-2cb89f98e592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.530 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[513d5635-15a9-4214-858c-27850956da42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.544 2 INFO nova.compute.manager [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.545 2 DEBUG oslo.service.loopingcall [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.545 2 DEBUG nova.compute.manager [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.546 2 DEBUG nova.network.neutron [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.545 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3338ad42-4d87-4334-8262-ba53921098a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430796, 'reachable_time': 18449, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232097, 'error': None, 'target': 'ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.547 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f872d065-dcdd-4abe-966e-984ec8347cf7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:30:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:30:48.547 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[71b6d865-2bbc-4eac-962c-d48dfdca0fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:30:48 np0005476733 systemd[1]: run-netns-ovnmeta\x2df872d065\x2ddcdd\x2d4abe\x2d966e\x2d984ec8347cf7.mount: Deactivated successfully.
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.562 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937448.561802, 164d69c5-58d2-413e-9b1f-907b5cc12d9b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.562 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] VM Started (Lifecycle Event)#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.595 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.603 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937448.5626001, 164d69c5-58d2-413e-9b1f-907b5cc12d9b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.604 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.630 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.633 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:30:48 np0005476733 nova_compute[192580]: 2025-10-08 15:30:48.658 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.147 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.148 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.149 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.149 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.149 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Processing event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.150 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.150 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.150 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.151 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.151 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] No waiting events found dispatching network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.151 2 WARNING nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received unexpected event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.151 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-vif-unplugged-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.152 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.152 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.152 2 DEBUG oslo_concurrency.lockutils [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.153 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] No waiting events found dispatching network-vif-unplugged-3ca6fe41-629a-4c92-9418-834869a48822 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.153 2 DEBUG nova.compute.manager [req-053e88ca-f082-4abb-a733-29b80b9c67ce req-fd374e21-6104-436f-8f86-24cdd7b7df0c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-vif-unplugged-3ca6fe41-629a-4c92-9418-834869a48822 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.153 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.162 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937449.1626225, 164d69c5-58d2-413e-9b1f-907b5cc12d9b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.163 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.167 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.170 2 INFO nova.virt.libvirt.driver [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Instance spawned successfully.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.171 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.215 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.222 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.226 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.226 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.227 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.227 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.227 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.228 2 DEBUG nova.virt.libvirt.driver [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.262 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.315 2 INFO nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Took 9.87 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.315 2 DEBUG nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.386 2 INFO nova.compute.manager [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Took 11.08 seconds to build instance.#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.409 2 DEBUG oslo_concurrency.lockutils [None req-6484de4a-7574-4b74-a817-25952fccc87d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:49 np0005476733 nova_compute[192580]: 2025-10-08 15:30:49.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:30:50 np0005476733 nova_compute[192580]: 2025-10-08 15:30:50.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.399 2 DEBUG nova.network.neutron [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updated VIF entry in instance network info cache for port 3ca6fe41-629a-4c92-9418-834869a48822. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.400 2 DEBUG nova.network.neutron [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updating instance_info_cache with network_info: [{"id": "3ca6fe41-629a-4c92-9418-834869a48822", "address": "fa:16:3e:03:33:a9", "network": {"id": "f872d065-dcdd-4abe-966e-984ec8347cf7", "bridge": "br-int", "label": "tempest-ExtraDhcpOptionsTest-1757752636-test_extra_dhcp_opts_ipv4_ipv6_stateless", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:5::/64", "dns": [], "gateway": {"address": "2001:5::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:5::f816:3eff:fe03:33a9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "dhcpv6-stateless"}}], "meta": {"injected": false, "tenant_id": "93e68db931464f0282500c84d398d8af", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ca6fe41-62", "ovs_interfaceid": "3ca6fe41-629a-4c92-9418-834869a48822", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.432 2 DEBUG oslo_concurrency.lockutils [req-2e48f0da-21b4-45d0-9b8e-1a25893e9b54 req-7f1658dc-38d3-44e0-a892-4a7c6c4cb2c6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-90f7bb14-f463-4f98-92fc-22c2a06a12cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.520 2 DEBUG nova.compute.manager [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.521 2 DEBUG oslo_concurrency.lockutils [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.521 2 DEBUG oslo_concurrency.lockutils [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.522 2 DEBUG oslo_concurrency.lockutils [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.522 2 DEBUG nova.compute.manager [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] No waiting events found dispatching network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.522 2 WARNING nova.compute.manager [req-99775672-5f79-4bdb-9dff-4814338438e8 req-449f8ca2-327d-48ec-88b0-363d38ca4575 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Received unexpected event network-vif-plugged-3ca6fe41-629a-4c92-9418-834869a48822 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.630 2 INFO nova.compute.manager [None req-70100322-ab07-4d46-89ac-db94683c84aa b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Get console output#033[00m
Oct  8 11:30:51 np0005476733 nova_compute[192580]: 2025-10-08 15:30:51.637 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.344 2 INFO nova.compute.manager [None req-241c7df1-bd5e-4d45-a60d-a5fe2bf0c552 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.350 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.456 2 DEBUG nova.network.neutron [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.479 2 INFO nova.compute.manager [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Took 3.93 seconds to deallocate network for instance.#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.528 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.529 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.635 2 DEBUG nova.compute.provider_tree [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.652 2 DEBUG nova.scheduler.client.report [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.674 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.697 2 INFO nova.scheduler.client.report [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Deleted allocations for instance 90f7bb14-f463-4f98-92fc-22c2a06a12cd#033[00m
Oct  8 11:30:52 np0005476733 nova_compute[192580]: 2025-10-08 15:30:52.769 2 DEBUG oslo_concurrency.lockutils [None req-1dd2cfbe-3d8f-4cec-a0a0-221cd450ca7d 048380879c82439f920961e33c8fc34c 93e68db931464f0282500c84d398d8af - - default default] Lock "90f7bb14-f463-4f98-92fc-22c2a06a12cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:30:53 np0005476733 nova_compute[192580]: 2025-10-08 15:30:53.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.788 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.790 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.791 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:30:54 np0005476733 nova_compute[192580]: 2025-10-08 15:30:54.791 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 27fa9a5a-04a0-4d80-b75d-564df1c974e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:30:56 np0005476733 podman[232103]: 2025-10-08 15:30:56.23428157 +0000 UTC m=+0.064558624 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct  8 11:30:56 np0005476733 podman[232104]: 2025-10-08 15:30:56.234857099 +0000 UTC m=+0.063125339 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:30:56 np0005476733 nova_compute[192580]: 2025-10-08 15:30:56.796 2 INFO nova.compute.manager [None req-c8ce799f-ee00-4bf3-af01-bc34da77a6c7 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Get console output#033[00m
Oct  8 11:30:56 np0005476733 nova_compute[192580]: 2025-10-08 15:30:56.865 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updating instance_info_cache with network_info: [{"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:30:56 np0005476733 nova_compute[192580]: 2025-10-08 15:30:56.888 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-27fa9a5a-04a0-4d80-b75d-564df1c974e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:30:56 np0005476733 nova_compute[192580]: 2025-10-08 15:30:56.889 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:30:56 np0005476733 nova_compute[192580]: 2025-10-08 15:30:56.890 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:30:57 np0005476733 nova_compute[192580]: 2025-10-08 15:30:57.514 2 INFO nova.compute.manager [None req-6306b67e-6e2a-40bb-9c16-057a39b2df63 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:30:57 np0005476733 nova_compute[192580]: 2025-10-08 15:30:57.519 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:30:58 np0005476733 nova_compute[192580]: 2025-10-08 15:30:58.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:30:59 np0005476733 nova_compute[192580]: 2025-10-08 15:30:59.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:00Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:7d:97 10.100.0.8
Oct  8 11:31:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:00Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:7d:97 10.100.0.8
Oct  8 11:31:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:00Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:7d:15 192.168.3.176
Oct  8 11:31:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:00Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:7d:15 192.168.3.176
Oct  8 11:31:01 np0005476733 nova_compute[192580]: 2025-10-08 15:31:01.953 2 INFO nova.compute.manager [None req-777fdfcc-fe8e-4a95-9a77-b54451f53f70 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Get console output#033[00m
Oct  8 11:31:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:02Z|00375|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:31:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:02Z|00376|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:02Z|00377|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:31:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:02Z|00378|binding|INFO|Releasing lport bbd16b0e-af1f-427d-8500-724401e2ed53 from this chassis (sb_readonly=0)
Oct  8 11:31:02 np0005476733 nova_compute[192580]: 2025-10-08 15:31:02.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:02 np0005476733 nova_compute[192580]: 2025-10-08 15:31:02.682 2 INFO nova.compute.manager [None req-bddd611a-8bef-4c62-8ce6-826d0f218f8b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:31:02 np0005476733 nova_compute[192580]: 2025-10-08 15:31:02.687 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:03 np0005476733 nova_compute[192580]: 2025-10-08 15:31:03.462 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937448.4599495, 90f7bb14-f463-4f98-92fc-22c2a06a12cd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:03 np0005476733 nova_compute[192580]: 2025-10-08 15:31:03.464 2 INFO nova.compute.manager [-] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:31:03 np0005476733 nova_compute[192580]: 2025-10-08 15:31:03.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:03 np0005476733 nova_compute[192580]: 2025-10-08 15:31:03.492 2 DEBUG nova.compute.manager [None req-4597a33b-db51-4b09-b1fd-4d5a198907dd - - - - - -] [instance: 90f7bb14-f463-4f98-92fc-22c2a06a12cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.390 2 DEBUG nova.compute.manager [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-changed-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.390 2 DEBUG nova.compute.manager [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Refreshing instance network info cache due to event network-changed-c864df57-e86e-439c-88f6-198c1e0cf48c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.391 2 DEBUG oslo_concurrency.lockutils [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.391 2 DEBUG oslo_concurrency.lockutils [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.392 2 DEBUG nova.network.neutron [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Refreshing network info cache for port c864df57-e86e-439c-88f6-198c1e0cf48c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:31:04 np0005476733 nova_compute[192580]: 2025-10-08 15:31:04.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:05 np0005476733 nova_compute[192580]: 2025-10-08 15:31:05.858 2 DEBUG nova.network.neutron [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updated VIF entry in instance network info cache for port c864df57-e86e-439c-88f6-198c1e0cf48c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:31:05 np0005476733 nova_compute[192580]: 2025-10-08 15:31:05.859 2 DEBUG nova.network.neutron [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updating instance_info_cache with network_info: [{"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:05 np0005476733 nova_compute[192580]: 2025-10-08 15:31:05.879 2 DEBUG oslo_concurrency.lockutils [req-60fbd14a-f445-448e-bdba-3c4fcc61a09c req-a26440a0-a9de-47be-8992-d28c8149b9d5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-164d69c5-58d2-413e-9b1f-907b5cc12d9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:06 np0005476733 podman[232163]: 2025-10-08 15:31:06.232431873 +0000 UTC m=+0.057442097 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.483 2 INFO nova.compute.manager [None req-819c578c-b663-4919-aa55-d125a935e033 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.490 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.495 2 INFO nova.virt.libvirt.driver [None req-819c578c-b663-4919-aa55-d125a935e033 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Truncated console log returned, 3043 bytes ignored#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.761 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.763 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:08Z|00379|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:31:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:08Z|00380|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:08Z|00381|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:31:08 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:08Z|00382|binding|INFO|Releasing lport bbd16b0e-af1f-427d-8500-724401e2ed53 from this chassis (sb_readonly=0)
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.799 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.889 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.889 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.900 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:31:08 np0005476733 nova_compute[192580]: 2025-10-08 15:31:08.900 2 INFO nova.compute.claims [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.108 2 DEBUG nova.compute.provider_tree [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.127 2 DEBUG nova.scheduler.client.report [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.155 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.156 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.220 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.221 2 DEBUG nova.network.neutron [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.253 2 INFO nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:31:09 np0005476733 podman[232184]: 2025-10-08 15:31:09.266636793 +0000 UTC m=+0.084150310 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.280 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:31:09 np0005476733 podman[232183]: 2025-10-08 15:31:09.283310625 +0000 UTC m=+0.101457202 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.396 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.398 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.398 2 INFO nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Creating image(s)#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.399 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.399 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.400 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.416 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.485 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.486 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.487 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.498 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.569 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.570 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.672 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk 1073741824" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.673 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.674 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.740 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.741 2 DEBUG nova.virt.disk.api [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Checking if we can resize image /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.742 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.808 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.809 2 DEBUG nova.virt.disk.api [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Cannot resize image /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.810 2 DEBUG nova.objects.instance [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'migration_context' on Instance uuid 307216ea-7a54-4279-80a4-70a83f9056e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.828 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.829 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Ensure instance console log exists: /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.829 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.830 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:09 np0005476733 nova_compute[192580]: 2025-10-08 15:31:09.830 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:10 np0005476733 nova_compute[192580]: 2025-10-08 15:31:10.513 2 DEBUG nova.policy [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b35a1072024b4c6598970391dd8abb59', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:31:12 np0005476733 nova_compute[192580]: 2025-10-08 15:31:12.091 2 DEBUG nova.network.neutron [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Successfully updated port: e483b131-9a58-498a-941e-6f52029cc1c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:31:12 np0005476733 nova_compute[192580]: 2025-10-08 15:31:12.112 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:12 np0005476733 nova_compute[192580]: 2025-10-08 15:31:12.113 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquired lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:12 np0005476733 nova_compute[192580]: 2025-10-08 15:31:12.113 2 DEBUG nova.network.neutron [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:31:12 np0005476733 nova_compute[192580]: 2025-10-08 15:31:12.279 2 DEBUG nova.network.neutron [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.302 2 DEBUG nova.network.neutron [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updating instance_info_cache with network_info: [{"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.326 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Releasing lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.327 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Instance network_info: |[{"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.329 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Start _get_guest_xml network_info=[{"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.334 2 WARNING nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.339 2 DEBUG nova.virt.libvirt.host [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.340 2 DEBUG nova.virt.libvirt.host [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.345 2 DEBUG nova.virt.libvirt.host [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.346 2 DEBUG nova.virt.libvirt.host [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.346 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.347 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.347 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.348 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.348 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.348 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.349 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.349 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.349 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.350 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.350 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.350 2 DEBUG nova.virt.hardware [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.354 2 DEBUG nova.virt.libvirt.vif [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:31:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1029057004',display_name='tempest-server-test-1029057004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1029057004',id=48,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-9scz6dsc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:31:09Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=307216ea-7a54-4279-80a4-70a83f9056e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.355 2 DEBUG nova.network.os_vif_util [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.356 2 DEBUG nova.network.os_vif_util [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.357 2 DEBUG nova.objects.instance [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 307216ea-7a54-4279-80a4-70a83f9056e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.373 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <uuid>307216ea-7a54-4279-80a4-70a83f9056e4</uuid>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <name>instance-00000030</name>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1029057004</nova:name>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:31:13</nova:creationTime>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:user uuid="b35a1072024b4c6598970391dd8abb59">tempest-PortSecurityTest-26261139-project-member</nova:user>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:project uuid="c98d2b93a2394e67a5e6525145c5bca5">tempest-PortSecurityTest-26261139</nova:project>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        <nova:port uuid="e483b131-9a58-498a-941e-6f52029cc1c5">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="serial">307216ea-7a54-4279-80a4-70a83f9056e4</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="uuid">307216ea-7a54-4279-80a4-70a83f9056e4</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.config"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:74:65:70"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <target dev="tape483b131-9a"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/console.log" append="off"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:31:13 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:31:13 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:31:13 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:31:13 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.380 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Preparing to wait for external event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.380 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.380 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.381 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.382 2 DEBUG nova.virt.libvirt.vif [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:31:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1029057004',display_name='tempest-server-test-1029057004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1029057004',id=48,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-9scz6dsc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:31:09Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=307216ea-7a54-4279-80a4-70a83f9056e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.382 2 DEBUG nova.network.os_vif_util [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.383 2 DEBUG nova.network.os_vif_util [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.383 2 DEBUG os_vif [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.385 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.385 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.389 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape483b131-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.389 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape483b131-9a, col_values=(('external_ids', {'iface-id': 'e483b131-9a58-498a-941e-6f52029cc1c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:65:70', 'vm-uuid': '307216ea-7a54-4279-80a4-70a83f9056e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  8 11:31:13 np0005476733 NetworkManager[51699]: <info>  [1759937473.3945] manager: (tape483b131-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.400 2 INFO os_vif [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a')#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.472 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.473 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.474 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] No VIF found with MAC fa:16:3e:74:65:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.474 2 INFO nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Using config drive#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.674 2 INFO nova.compute.manager [None req-3dc4f3ce-ea0c-4441-b333-35e490b44674 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Get console output#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.679 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.683 2 INFO nova.virt.libvirt.driver [None req-3dc4f3ce-ea0c-4441-b333-35e490b44674 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Truncated console log returned, 3253 bytes ignored#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.738 2 INFO nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Creating config drive at /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.config#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.744 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl59hssef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.875 2 DEBUG oslo_concurrency.processutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl59hssef" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.910 2 DEBUG nova.compute.manager [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-changed-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.912 2 DEBUG nova.compute.manager [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Refreshing instance network info cache due to event network-changed-e483b131-9a58-498a-941e-6f52029cc1c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.912 2 DEBUG oslo_concurrency.lockutils [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.913 2 DEBUG oslo_concurrency.lockutils [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.914 2 DEBUG nova.network.neutron [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Refreshing network info cache for port e483b131-9a58-498a-941e-6f52029cc1c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:31:13 np0005476733 kernel: tape483b131-9a: entered promiscuous mode
Oct  8 11:31:13 np0005476733 NetworkManager[51699]: <info>  [1759937473.9586] manager: (tape483b131-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Oct  8 11:31:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:13Z|00383|binding|INFO|Claiming lport e483b131-9a58-498a-941e-6f52029cc1c5 for this chassis.
Oct  8 11:31:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:13Z|00384|binding|INFO|e483b131-9a58-498a-941e-6f52029cc1c5: Claiming fa:16:3e:74:65:70 10.100.0.14
Oct  8 11:31:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:13Z|00385|binding|INFO|e483b131-9a58-498a-941e-6f52029cc1c5: Claiming unknown
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:13.981 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:65:70 10.100.0.14', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '307216ea-7a54-4279-80a4-70a83f9056e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bec80dd1-0f75-4955-b64e-4b7639499c68, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e483b131-9a58-498a-941e-6f52029cc1c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:13Z|00386|binding|INFO|Setting lport e483b131-9a58-498a-941e-6f52029cc1c5 ovn-installed in OVS
Oct  8 11:31:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:13Z|00387|binding|INFO|Setting lport e483b131-9a58-498a-941e-6f52029cc1c5 up in Southbound
Oct  8 11:31:13 np0005476733 nova_compute[192580]: 2025-10-08 15:31:13.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:13.985 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e483b131-9a58-498a-941e-6f52029cc1c5 in datapath 05e23ee7-84d7-47d9-8de9-b53576f6a373 bound to our chassis#033[00m
Oct  8 11:31:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:13.995 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05e23ee7-84d7-47d9-8de9-b53576f6a373#033[00m
Oct  8 11:31:14 np0005476733 systemd-machined[152624]: New machine qemu-29-instance-00000030.
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.016 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[919fd02f-5058-4c16-b52c-b44cfbf4a5a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 systemd-udevd[232287]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:31:14 np0005476733 NetworkManager[51699]: <info>  [1759937474.0319] device (tape483b131-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:31:14 np0005476733 NetworkManager[51699]: <info>  [1759937474.0328] device (tape483b131-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:31:14 np0005476733 systemd[1]: Started Virtual Machine qemu-29-instance-00000030.
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.049 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9f728e86-cec5-4a19-8eb5-f4f01114b0bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.052 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3be524ef-1041-4d3f-9a52-b527e2929ca9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.081 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[985b1264-a6ff-4226-bf18-d2143a4f6f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.103 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e63aa9f2-7cdc-41dc-b081-23f0eee40c3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05e23ee7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:53:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 5, 'rx_bytes': 1322, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 5, 'rx_bytes': 1322, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435047, 'reachable_time': 32760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 12, 'inoctets': 944, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 12, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 944, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 12, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232298, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.123 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7014875a-beec-4530-bf86-0e4f6666f655]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap05e23ee7-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435059, 'tstamp': 435059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232300, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap05e23ee7-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435062, 'tstamp': 435062}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232300, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.125 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05e23ee7-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.130 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05e23ee7-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.130 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.131 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05e23ee7-80, col_values=(('external_ids', {'iface-id': 'd2188afb-493c-4705-9ba9-87c4b983c343'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:14.131 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.475 2 DEBUG nova.compute.manager [req-c7124461-887f-4bf1-89b3-a9e5fc7fd888 req-532111e6-2d00-4935-a2c4-706cce305ccf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.476 2 DEBUG oslo_concurrency.lockutils [req-c7124461-887f-4bf1-89b3-a9e5fc7fd888 req-532111e6-2d00-4935-a2c4-706cce305ccf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.477 2 DEBUG oslo_concurrency.lockutils [req-c7124461-887f-4bf1-89b3-a9e5fc7fd888 req-532111e6-2d00-4935-a2c4-706cce305ccf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.477 2 DEBUG oslo_concurrency.lockutils [req-c7124461-887f-4bf1-89b3-a9e5fc7fd888 req-532111e6-2d00-4935-a2c4-706cce305ccf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.477 2 DEBUG nova.compute.manager [req-c7124461-887f-4bf1-89b3-a9e5fc7fd888 req-532111e6-2d00-4935-a2c4-706cce305ccf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Processing event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.790 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937474.7905877, 307216ea-7a54-4279-80a4-70a83f9056e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.791 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] VM Started (Lifecycle Event)#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.793 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.797 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.800 2 INFO nova.virt.libvirt.driver [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Instance spawned successfully.#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.801 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.821 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.826 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.830 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.830 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.830 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.831 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.831 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.832 2 DEBUG nova.virt.libvirt.driver [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.865 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.866 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937474.7906988, 307216ea-7a54-4279-80a4-70a83f9056e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.866 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.900 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.905 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937474.7972634, 307216ea-7a54-4279-80a4-70a83f9056e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.905 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.939 2 INFO nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Took 5.54 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.940 2 DEBUG nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.943 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.951 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:31:14 np0005476733 nova_compute[192580]: 2025-10-08 15:31:14.995 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:31:15 np0005476733 nova_compute[192580]: 2025-10-08 15:31:15.039 2 INFO nova.compute.manager [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Took 6.18 seconds to build instance.#033[00m
Oct  8 11:31:15 np0005476733 nova_compute[192580]: 2025-10-08 15:31:15.066 2 DEBUG oslo_concurrency.lockutils [None req-b8bc7449-a485-44ab-b284-f4673d075638 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:16 np0005476733 podman[232309]: 2025-10-08 15:31:16.232041111 +0000 UTC m=+0.060756951 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.480 2 DEBUG nova.network.neutron [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updated VIF entry in instance network info cache for port e483b131-9a58-498a-941e-6f52029cc1c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.481 2 DEBUG nova.network.neutron [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updating instance_info_cache with network_info: [{"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.506 2 DEBUG oslo_concurrency.lockutils [req-d8f8b49e-d08c-4faf-9815-7e653e098c5c req-04c7720f-0745-472d-b9cc-321260c84804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.586 2 INFO nova.compute.manager [None req-9bcb1539-9755-41f9-b6fe-e75efb8756dc b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Get console output#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.591 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.699 2 DEBUG nova.compute.manager [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.699 2 DEBUG oslo_concurrency.lockutils [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.700 2 DEBUG oslo_concurrency.lockutils [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.700 2 DEBUG oslo_concurrency.lockutils [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.700 2 DEBUG nova.compute.manager [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] No waiting events found dispatching network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:16 np0005476733 nova_compute[192580]: 2025-10-08 15:31:16.700 2 WARNING nova.compute.manager [req-5acc1b2c-3562-4fbc-9c07-8afb8c2e1708 req-064f7e4d-ce97-4c39-a3c0-4f67a51b2b80 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received unexpected event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:31:18 np0005476733 podman[232353]: 2025-10-08 15:31:18.229048344 +0000 UTC m=+0.051773235 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:31:18 np0005476733 podman[232352]: 2025-10-08 15:31:18.241995557 +0000 UTC m=+0.066496535 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:31:18 np0005476733 nova_compute[192580]: 2025-10-08 15:31:18.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.240 2 DEBUG nova.compute.manager [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.240 2 DEBUG nova.compute.manager [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.240 2 DEBUG oslo_concurrency.lockutils [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.240 2 DEBUG oslo_concurrency.lockutils [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.241 2 DEBUG nova.network.neutron [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:31:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:19Z|00388|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:31:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:19Z|00389|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:19Z|00390|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:31:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:19Z|00391|binding|INFO|Releasing lport bbd16b0e-af1f-427d-8500-724401e2ed53 from this chassis (sb_readonly=0)
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:19 np0005476733 nova_compute[192580]: 2025-10-08 15:31:19.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:20Z|00392|pinctrl|WARN|Dropped 8511 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:31:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:20Z|00393|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:31:20 np0005476733 nova_compute[192580]: 2025-10-08 15:31:20.598 2 DEBUG nova.network.neutron [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:31:20 np0005476733 nova_compute[192580]: 2025-10-08 15:31:20.599 2 DEBUG nova.network.neutron [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:20 np0005476733 nova_compute[192580]: 2025-10-08 15:31:20.797 2 DEBUG oslo_concurrency.lockutils [req-ad179d58-4e08-4c99-837c-a5a7a8c2dcd0 req-7e76d8dd-fbaf-4777-a104-e9f55033b20b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:21.704 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:21 np0005476733 nova_compute[192580]: 2025-10-08 15:31:21.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:21.706 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:31:21 np0005476733 nova_compute[192580]: 2025-10-08 15:31:21.763 2 INFO nova.compute.manager [None req-d5e2bbde-0719-46ed-9606-1df02a8f5d5d b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Get console output#033[00m
Oct  8 11:31:21 np0005476733 nova_compute[192580]: 2025-10-08 15:31:21.768 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:23 np0005476733 nova_compute[192580]: 2025-10-08 15:31:23.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:24 np0005476733 nova_compute[192580]: 2025-10-08 15:31:24.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:26.313 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:26.315 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:26.316 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:27 np0005476733 nova_compute[192580]: 2025-10-08 15:31:27.105 2 INFO nova.compute.manager [None req-2070190f-8a01-46bb-968b-6ace8dc758a7 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Get console output#033[00m
Oct  8 11:31:27 np0005476733 nova_compute[192580]: 2025-10-08 15:31:27.113 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:31:27 np0005476733 podman[232403]: 2025-10-08 15:31:27.26489067 +0000 UTC m=+0.079073292 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid)
Oct  8 11:31:27 np0005476733 podman[232404]: 2025-10-08 15:31:27.290426071 +0000 UTC m=+0.099718556 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:31:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:27Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:65:70 10.100.0.14
Oct  8 11:31:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:27Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:65:70 10.100.0.14
Oct  8 11:31:28 np0005476733 nova_compute[192580]: 2025-10-08 15:31:28.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:28.709 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:29 np0005476733 nova_compute[192580]: 2025-10-08 15:31:29.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:33 np0005476733 nova_compute[192580]: 2025-10-08 15:31:33.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.388 2 DEBUG nova.compute.manager [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-changed-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.388 2 DEBUG nova.compute.manager [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Refreshing instance network info cache due to event network-changed-e483b131-9a58-498a-941e-6f52029cc1c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.388 2 DEBUG oslo_concurrency.lockutils [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.389 2 DEBUG oslo_concurrency.lockutils [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.389 2 DEBUG nova.network.neutron [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Refreshing network info cache for port e483b131-9a58-498a-941e-6f52029cc1c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:31:34 np0005476733 nova_compute[192580]: 2025-10-08 15:31:34.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.828 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.829 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.829 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.829 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.830 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.831 2 INFO nova.compute.manager [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Terminating instance#033[00m
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.832 2 DEBUG nova.compute.manager [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:31:35 np0005476733 kernel: tap66f32729-1d (unregistering): left promiscuous mode
Oct  8 11:31:35 np0005476733 NetworkManager[51699]: <info>  [1759937495.8576] device (tap66f32729-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:35Z|00394|binding|INFO|Releasing lport 66f32729-1d2a-44d6-b604-29e4c751f95c from this chassis (sb_readonly=0)
Oct  8 11:31:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:35Z|00395|binding|INFO|Setting lport 66f32729-1d2a-44d6-b604-29e4c751f95c down in Southbound
Oct  8 11:31:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:35Z|00396|binding|INFO|Removing iface tap66f32729-1d ovn-installed in OVS
Oct  8 11:31:35 np0005476733 nova_compute[192580]: 2025-10-08 15:31:35.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:35 np0005476733 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000028.scope: Deactivated successfully.
Oct  8 11:31:35 np0005476733 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000028.scope: Consumed 46.535s CPU time.
Oct  8 11:31:35 np0005476733 systemd-machined[152624]: Machine qemu-24-instance-00000028 terminated.
Oct  8 11:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:35.988 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ff:69 10.100.0.13', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'port-tempest-MultiVlanTransparencyTest-1225694051-0', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a7cf9795-ac6e-4d38-8500-755c39931e14', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-858b993e-0613-4d63-983c-94fe95ccca9d', 'neutron:port_capabilities': '', 'neutron:port_name': 'port-tempest-MultiVlanTransparencyTest-1225694051-0', 'neutron:project_id': 'f4f21e712eb24213a38bc89e2b2f44b3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31a08c4d-6969-4ca1-af4c-9d4107c6eb62, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=66f32729-1d2a-44d6-b604-29e4c751f95c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:35.990 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 66f32729-1d2a-44d6-b604-29e4c751f95c in datapath 858b993e-0613-4d63-983c-94fe95ccca9d unbound from our chassis#033[00m
Oct  8 11:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:35.994 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 858b993e-0613-4d63-983c-94fe95ccca9d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:35.996 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33661962-5624-4a2d-ac96-3ebcb2aaab87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:35.996 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d namespace which is not needed anymore#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.120 2 INFO nova.virt.libvirt.driver [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Instance destroyed successfully.#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.121 2 DEBUG nova.objects.instance [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lazy-loading 'resources' on Instance uuid a7cf9795-ac6e-4d38-8500-755c39931e14 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:36 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [NOTICE]   (230695) : haproxy version is 2.8.14-c23fe91
Oct  8 11:31:36 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [NOTICE]   (230695) : path to executable is /usr/sbin/haproxy
Oct  8 11:31:36 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [WARNING]  (230695) : Exiting Master process...
Oct  8 11:31:36 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [ALERT]    (230695) : Current worker (230697) exited with code 143 (Terminated)
Oct  8 11:31:36 np0005476733 neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d[230689]: [WARNING]  (230695) : All workers exited. Exiting... (0)
Oct  8 11:31:36 np0005476733 systemd[1]: libpod-8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61.scope: Deactivated successfully.
Oct  8 11:31:36 np0005476733 podman[232484]: 2025-10-08 15:31:36.172634524 +0000 UTC m=+0.045921357 container died 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.191 2 DEBUG nova.virt.libvirt.vif [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiVlanTransparencyTest-1225694051-0',display_name='server-tempest-MultiVlanTransparencyTest-1225694051-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multivlantransparencytest-1225694051-0',id=40,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvujo5b87uOVbg3RpxHXnqbrVt2ySGYzFVUknwtVv2YR6AYy5RSYfPq4hh/P68Iq/ARCEc1PMbDU99yQi39bUYIlrvmMeOEw4FT/HN0a6mEQB3qyjgogOJ/vPLLZ3a+kQ==',key_name='tempest-MultiVlanTransparencyTest-1225694051',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:29:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f4f21e712eb24213a38bc89e2b2f44b3',ramdisk_id='',reservation_id='r-3f98e63s',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiVlanTransparencyTest-48410347',owner_user_name='tempest-MultiVlanTransparencyTest-48410347-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:29:26Z,user_data=None,user_id='c852472017334735b37425ffa8591384',uuid=a7cf9795-ac6e-4d38-8500-755c39931e14,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": 
"tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.191 2 DEBUG nova.network.os_vif_util [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converting VIF {"id": "66f32729-1d2a-44d6-b604-29e4c751f95c", "address": "fa:16:3e:22:ff:69", "network": {"id": "858b993e-0613-4d63-983c-94fe95ccca9d", "bridge": "br-int", "label": "tempest-MultiVlanTransparencyTest-1225694051", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4f21e712eb24213a38bc89e2b2f44b3", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66f32729-1d", "ovs_interfaceid": "66f32729-1d2a-44d6-b604-29e4c751f95c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.192 2 DEBUG nova.network.os_vif_util [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.192 2 DEBUG os_vif [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.195 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66f32729-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.200 2 INFO os_vif [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:ff:69,bridge_name='br-int',has_traffic_filtering=True,id=66f32729-1d2a-44d6-b604-29e4c751f95c,network=Network(858b993e-0613-4d63-983c-94fe95ccca9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap66f32729-1d')#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.201 2 INFO nova.virt.libvirt.driver [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Deleting instance files /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14_del#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.202 2 INFO nova.virt.libvirt.driver [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Deletion of /var/lib/nova/instances/a7cf9795-ac6e-4d38-8500-755c39931e14_del complete#033[00m
Oct  8 11:31:36 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61-userdata-shm.mount: Deactivated successfully.
Oct  8 11:31:36 np0005476733 systemd[1]: var-lib-containers-storage-overlay-bd17042d803c6aabb4070bfc0b7789bb09aea40c02adf4bc92f076acc0e395fb-merged.mount: Deactivated successfully.
Oct  8 11:31:36 np0005476733 podman[232484]: 2025-10-08 15:31:36.219458588 +0000 UTC m=+0.092745421 container cleanup 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:31:36 np0005476733 systemd[1]: libpod-conmon-8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61.scope: Deactivated successfully.
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.265 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.265 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.265 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.266 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.266 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.267 2 INFO nova.compute.manager [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Terminating instance#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.267 2 DEBUG nova.compute.manager [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:31:36 np0005476733 podman[232514]: 2025-10-08 15:31:36.277429502 +0000 UTC m=+0.037815587 container remove 8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.282 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[abba278e-0062-4ac5-8ce6-4401c4c14dde]: (4, ('Wed Oct  8 03:31:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d (8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61)\n8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61\nWed Oct  8 03:31:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d (8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61)\n8b7ffbbcaff2a34a1b70e2b914c887e0d24c7996e07c13ad227e0c3e6e26ad61\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.284 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ef750339-4208-4b7c-9d35-8e06e7a6b2da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.285 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap858b993e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 kernel: tap858b993e-00: left promiscuous mode
Oct  8 11:31:36 np0005476733 kernel: tape483b131-9a (unregistering): left promiscuous mode
Oct  8 11:31:36 np0005476733 NetworkManager[51699]: <info>  [1759937496.2954] device (tape483b131-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.308 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0fcae9b9-1b6f-4f62-b471-c4eaa16cceb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:36Z|00397|binding|INFO|Releasing lport e483b131-9a58-498a-941e-6f52029cc1c5 from this chassis (sb_readonly=0)
Oct  8 11:31:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:36Z|00398|binding|INFO|Setting lport e483b131-9a58-498a-941e-6f52029cc1c5 down in Southbound
Oct  8 11:31:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:36Z|00399|binding|INFO|Removing iface tape483b131-9a ovn-installed in OVS
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.323 2 INFO nova.compute.manager [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.322 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:65:70 10.100.0.14', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '307216ea-7a54-4279-80a4-70a83f9056e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bec80dd1-0f75-4955-b64e-4b7639499c68, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e483b131-9a58-498a-941e-6f52029cc1c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.324 2 DEBUG oslo.service.loopingcall [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.324 2 DEBUG nova.compute.manager [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.324 2 DEBUG nova.network.neutron [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.340 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94516f1f-6778-4836-b618-c060a1013fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.341 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d2aaa5-0ece-4d4a-b532-571acb200a01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.359 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f4124155-5bed-4b6c-820b-78bcd6d49da5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426800, 'reachable_time': 43882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232542, 'error': None, 'target': 'ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.362 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-858b993e-0613-4d63-983c-94fe95ccca9d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.362 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[20472c9b-e572-4c9f-947f-132abdc94fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.363 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e483b131-9a58-498a-941e-6f52029cc1c5 in datapath 05e23ee7-84d7-47d9-8de9-b53576f6a373 unbound from our chassis#033[00m
Oct  8 11:31:36 np0005476733 systemd[1]: run-netns-ovnmeta\x2d858b993e\x2d0613\x2d4d63\x2d983c\x2d94fe95ccca9d.mount: Deactivated successfully.
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.365 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05e23ee7-84d7-47d9-8de9-b53576f6a373#033[00m
Oct  8 11:31:36 np0005476733 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000030.scope: Deactivated successfully.
Oct  8 11:31:36 np0005476733 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000030.scope: Consumed 12.695s CPU time.
Oct  8 11:31:36 np0005476733 systemd-machined[152624]: Machine qemu-29-instance-00000030 terminated.
Oct  8 11:31:36 np0005476733 podman[232527]: 2025-10-08 15:31:36.378933983 +0000 UTC m=+0.058295044 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.383 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2e56c3d5-66fe-4736-ae89-08f4ecee5dfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.412 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[480c6214-e64c-4160-8a82-59fbc641073f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.416 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[85f15137-8b07-4249-a148-28c900b74c67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.441 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd6f528-5112-4550-9e36-5d816748d922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.460 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[09cb2dc7-b0db-49f2-b696-537e68785e2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05e23ee7-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:53:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 25, 'tx_packets': 7, 'rx_bytes': 1742, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 25, 'tx_packets': 7, 'rx_bytes': 1742, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435047, 'reachable_time': 32760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 15, 'inoctets': 1112, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 15, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1112, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 15, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232560, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.475 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[80dd772f-7bb2-41cb-8f6b-0fb142429a3f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap05e23ee7-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435059, 'tstamp': 435059}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232561, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap05e23ee7-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435062, 'tstamp': 435062}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232561, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.476 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05e23ee7-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.520 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05e23ee7-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.521 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.521 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05e23ee7-80, col_values=(('external_ids', {'iface-id': 'd2188afb-493c-4705-9ba9-87c4b983c343'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:36.522 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.561 2 INFO nova.virt.libvirt.driver [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Instance destroyed successfully.#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.562 2 DEBUG nova.objects.instance [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'resources' on Instance uuid 307216ea-7a54-4279-80a4-70a83f9056e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.581 2 DEBUG nova.virt.libvirt.vif [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:31:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1029057004',display_name='tempest-server-test-1029057004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1029057004',id=48,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:31:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-9scz6dsc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image
_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:31:15Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=307216ea-7a54-4279-80a4-70a83f9056e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.582 2 DEBUG nova.network.os_vif_util [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.582 2 DEBUG nova.network.os_vif_util [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.583 2 DEBUG os_vif [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape483b131-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.592 2 INFO os_vif [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:65:70,bridge_name='br-int',has_traffic_filtering=True,id=e483b131-9a58-498a-941e-6f52029cc1c5,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape483b131-9a')#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.592 2 INFO nova.virt.libvirt.driver [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Deleting instance files /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4_del#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.594 2 INFO nova.virt.libvirt.driver [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Deletion of /var/lib/nova/instances/307216ea-7a54-4279-80a4-70a83f9056e4_del complete#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.663 2 INFO nova.compute.manager [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.663 2 DEBUG oslo.service.loopingcall [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.664 2 DEBUG nova.compute.manager [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:31:36 np0005476733 nova_compute[192580]: 2025-10-08 15:31:36.664 2 DEBUG nova.network.neutron [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:31:37 np0005476733 nova_compute[192580]: 2025-10-08 15:31:37.495 2 DEBUG nova.network.neutron [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updated VIF entry in instance network info cache for port e483b131-9a58-498a-941e-6f52029cc1c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:31:37 np0005476733 nova_compute[192580]: 2025-10-08 15:31:37.496 2 DEBUG nova.network.neutron [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updating instance_info_cache with network_info: [{"id": "e483b131-9a58-498a-941e-6f52029cc1c5", "address": "fa:16:3e:74:65:70", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape483b131-9a", "ovs_interfaceid": "e483b131-9a58-498a-941e-6f52029cc1c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:37 np0005476733 nova_compute[192580]: 2025-10-08 15:31:37.525 2 DEBUG oslo_concurrency.lockutils [req-b79c1d9d-eb9c-4e1f-9983-a3c3097d4658 req-3024683f-c579-40b3-abe9-ad11b7fd96ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-307216ea-7a54-4279-80a4-70a83f9056e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.805 2 DEBUG nova.compute.manager [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-vif-unplugged-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.806 2 DEBUG oslo_concurrency.lockutils [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.807 2 DEBUG oslo_concurrency.lockutils [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.807 2 DEBUG oslo_concurrency.lockutils [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.808 2 DEBUG nova.compute.manager [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] No waiting events found dispatching network-vif-unplugged-e483b131-9a58-498a-941e-6f52029cc1c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:38 np0005476733 nova_compute[192580]: 2025-10-08 15:31:38.808 2 DEBUG nova.compute.manager [req-6581a72b-3537-48e7-9669-9e92c0e2e5ee req-8423aa7a-1a26-41b8-a9fa-c974a6230c88 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-vif-unplugged-e483b131-9a58-498a-941e-6f52029cc1c5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:31:39 np0005476733 nova_compute[192580]: 2025-10-08 15:31:39.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:40 np0005476733 podman[232578]: 2025-10-08 15:31:40.255023074 +0000 UTC m=+0.071167309 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:31:40 np0005476733 podman[232577]: 2025-10-08 15:31:40.289286655 +0000 UTC m=+0.103808897 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.297 2 DEBUG nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-vif-unplugged-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] No waiting events found dispatching network-vif-unplugged-66f32729-1d2a-44d6-b604-29e4c751f95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-vif-unplugged-66f32729-1d2a-44d6-b604-29e4c751f95c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.298 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.299 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.299 2 DEBUG oslo_concurrency.lockutils [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.299 2 DEBUG nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] No waiting events found dispatching network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.299 2 WARNING nova.compute.manager [req-65724da2-ef17-4586-aa66-e3c2ab9f4460 req-0eee6bd3-616c-4664-bdb2-88469c1ac5af 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Received unexpected event network-vif-plugged-66f32729-1d2a-44d6-b604-29e4c751f95c for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00400|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00401|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00402|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00403|binding|INFO|Releasing lport d2188afb-493c-4705-9ba9-87c4b983c343 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00404|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:40Z|00405|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:31:40 np0005476733 nova_compute[192580]: 2025-10-08 15:31:40.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.052 2 DEBUG nova.compute.manager [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.053 2 DEBUG oslo_concurrency.lockutils [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.054 2 DEBUG oslo_concurrency.lockutils [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.054 2 DEBUG oslo_concurrency.lockutils [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.055 2 DEBUG nova.compute.manager [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] No waiting events found dispatching network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.055 2 WARNING nova.compute.manager [req-3d01bc2e-6708-4f1d-af8c-d0bb5e9ec1ab req-f18a7381-f59d-44da-bcbe-5964cd9fed27 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Received unexpected event network-vif-plugged-e483b131-9a58-498a-941e-6f52029cc1c5 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.071 2 DEBUG nova.network.neutron [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.108 2 INFO nova.compute.manager [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Took 4.78 seconds to deallocate network for instance.#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.170 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.171 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.359 2 DEBUG nova.compute.provider_tree [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.387 2 DEBUG nova.scheduler.client.report [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.439 2 DEBUG nova.network.neutron [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.442 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.465 2 INFO nova.compute.manager [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Took 4.80 seconds to deallocate network for instance.#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.478 2 INFO nova.scheduler.client.report [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Deleted allocations for instance a7cf9795-ac6e-4d38-8500-755c39931e14#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.552 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.553 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.588 2 DEBUG oslo_concurrency.lockutils [None req-0d07c2f4-0178-4359-8cd9-b34fa8d164c1 c852472017334735b37425ffa8591384 f4f21e712eb24213a38bc89e2b2f44b3 - - default default] Lock "a7cf9795-ac6e-4d38-8500-755c39931e14" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.692 2 DEBUG nova.compute.provider_tree [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.714 2 DEBUG nova.scheduler.client.report [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.752 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.786 2 INFO nova.scheduler.client.report [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Deleted allocations for instance 307216ea-7a54-4279-80a4-70a83f9056e4#033[00m
Oct  8 11:31:41 np0005476733 nova_compute[192580]: 2025-10-08 15:31:41.897 2 DEBUG oslo_concurrency.lockutils [None req-8eefa3e6-4d5e-4843-b6da-be64f0ab9db3 b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "307216ea-7a54-4279-80a4-70a83f9056e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.683 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.684 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.685 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.685 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.686 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.688 2 INFO nova.compute.manager [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Terminating instance#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.690 2 DEBUG nova.compute.manager [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:31:43 np0005476733 kernel: tapc864df57-e8 (unregistering): left promiscuous mode
Oct  8 11:31:43 np0005476733 NetworkManager[51699]: <info>  [1759937503.7198] device (tapc864df57-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:31:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:43Z|00406|binding|INFO|Releasing lport c864df57-e86e-439c-88f6-198c1e0cf48c from this chassis (sb_readonly=0)
Oct  8 11:31:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:43Z|00407|binding|INFO|Setting lport c864df57-e86e-439c-88f6-198c1e0cf48c down in Southbound
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:43Z|00408|binding|INFO|Removing iface tapc864df57-e8 ovn-installed in OVS
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:43.758 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:7d:97 10.100.0.8', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '164d69c5-58d2-413e-9b1f-907b5cc12d9b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c98d2b93a2394e67a5e6525145c5bca5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bec80dd1-0f75-4955-b64e-4b7639499c68, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c864df57-e86e-439c-88f6-198c1e0cf48c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:43.761 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c864df57-e86e-439c-88f6-198c1e0cf48c in datapath 05e23ee7-84d7-47d9-8de9-b53576f6a373 unbound from our chassis#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:43.769 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05e23ee7-84d7-47d9-8de9-b53576f6a373, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:31:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:43.770 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f3ec38-7182-4062-985f-c50d08e75d22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:43.771 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373 namespace which is not needed anymore#033[00m
Oct  8 11:31:43 np0005476733 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Oct  8 11:31:43 np0005476733 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000002f.scope: Consumed 15.196s CPU time.
Oct  8 11:31:43 np0005476733 systemd-machined[152624]: Machine qemu-28-instance-0000002f terminated.
Oct  8 11:31:43 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [NOTICE]   (231983) : haproxy version is 2.8.14-c23fe91
Oct  8 11:31:43 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [NOTICE]   (231983) : path to executable is /usr/sbin/haproxy
Oct  8 11:31:43 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [WARNING]  (231983) : Exiting Master process...
Oct  8 11:31:43 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [ALERT]    (231983) : Current worker (232002) exited with code 143 (Terminated)
Oct  8 11:31:43 np0005476733 neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373[231960]: [WARNING]  (231983) : All workers exited. Exiting... (0)
Oct  8 11:31:43 np0005476733 systemd[1]: libpod-66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b.scope: Deactivated successfully.
Oct  8 11:31:43 np0005476733 podman[232645]: 2025-10-08 15:31:43.968525387 +0000 UTC m=+0.079062992 container died 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.967 2 INFO nova.virt.libvirt.driver [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Instance destroyed successfully.#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.968 2 DEBUG nova.objects.instance [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lazy-loading 'resources' on Instance uuid 164d69c5-58d2-413e-9b1f-907b5cc12d9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.988 2 DEBUG nova.virt.libvirt.vif [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-924816690',display_name='tempest-server-test-924816690',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-924816690',id=47,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD8eBR+7791zIq3Ca3IMTdsqXttoRnnB98onayUfwxm3wid9Grh+Seb7hbQVJS1apoK+LjhlbIItD35aVZWAg+9RelHofSx/RlM7CwUN9S/EmBk++Oh1cTCh74OhHkN3ow==',key_name='tempest-keypair-test-1341641232',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c98d2b93a2394e67a5e6525145c5bca5',ramdisk_id='',reservation_id='r-h6o9b07j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw
_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-PortSecurityTest-26261139',owner_user_name='tempest-PortSecurityTest-26261139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:49Z,user_data=None,user_id='b35a1072024b4c6598970391dd8abb59',uuid=164d69c5-58d2-413e-9b1f-907b5cc12d9b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.988 2 DEBUG nova.network.os_vif_util [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converting VIF {"id": "c864df57-e86e-439c-88f6-198c1e0cf48c", "address": "fa:16:3e:11:7d:97", "network": {"id": "05e23ee7-84d7-47d9-8de9-b53576f6a373", "bridge": "br-int", "label": "tempest-test-network--1624656132", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c98d2b93a2394e67a5e6525145c5bca5", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc864df57-e8", "ovs_interfaceid": "c864df57-e86e-439c-88f6-198c1e0cf48c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.990 2 DEBUG nova.network.os_vif_util [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.990 2 DEBUG os_vif [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc864df57-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:43 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b-userdata-shm.mount: Deactivated successfully.
Oct  8 11:31:43 np0005476733 nova_compute[192580]: 2025-10-08 15:31:43.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.002 2 INFO os_vif [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:7d:97,bridge_name='br-int',has_traffic_filtering=True,id=c864df57-e86e-439c-88f6-198c1e0cf48c,network=Network(05e23ee7-84d7-47d9-8de9-b53576f6a373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc864df57-e8')#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.003 2 INFO nova.virt.libvirt.driver [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Deleting instance files /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b_del#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.003 2 INFO nova.virt.libvirt.driver [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Deletion of /var/lib/nova/instances/164d69c5-58d2-413e-9b1f-907b5cc12d9b_del complete#033[00m
Oct  8 11:31:44 np0005476733 systemd[1]: var-lib-containers-storage-overlay-0c094283ad176a6c4b75ea8cbd548e98fc48c7a3c15e77e9ebc0be63d3ead5a0-merged.mount: Deactivated successfully.
Oct  8 11:31:44 np0005476733 podman[232645]: 2025-10-08 15:31:44.015786005 +0000 UTC m=+0.126323600 container cleanup 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:31:44 np0005476733 systemd[1]: libpod-conmon-66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b.scope: Deactivated successfully.
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.067 2 INFO nova.compute.manager [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.068 2 DEBUG oslo.service.loopingcall [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.068 2 DEBUG nova.compute.manager [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.068 2 DEBUG nova.network.neutron [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:31:44 np0005476733 podman[232692]: 2025-10-08 15:31:44.079037568 +0000 UTC m=+0.041649280 container remove 66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.085 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6b689b1b-a56a-423a-a635-98eb1d23643c]: (4, ('Wed Oct  8 03:31:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373 (66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b)\n66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b\nWed Oct  8 03:31:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373 (66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b)\n66c2ed38a4fc311308bf8460697bbb32d97d85d7f4e1085cdc4e06e49737cd1b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.086 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[69af6eb4-2eed-4c38-ac68-79af5c45285d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.087 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05e23ee7-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:44 np0005476733 kernel: tap05e23ee7-80: left promiscuous mode
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.104 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8222a8bc-341e-4e8c-9a8f-abc991af799d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.134 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6efeecb3-51e2-444b-9da9-950d435f835e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.136 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4201630f-db7b-454e-a0d2-251d2c59a94d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.151 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[23940b54-30c0-4940-a621-cd44dd6c72c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435037, 'reachable_time': 20086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232707, 'error': None, 'target': 'ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.154 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-05e23ee7-84d7-47d9-8de9-b53576f6a373 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:31:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:44.154 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6dab86-7a1b-4f8f-80aa-644597e51587]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:44 np0005476733 systemd[1]: run-netns-ovnmeta\x2d05e23ee7\x2d84d7\x2d47d9\x2d8de9\x2db53576f6a373.mount: Deactivated successfully.
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.558 2 DEBUG nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-vif-unplugged-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.559 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.559 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.560 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.560 2 DEBUG nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] No waiting events found dispatching network-vif-unplugged-c864df57-e86e-439c-88f6-198c1e0cf48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.561 2 DEBUG nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-vif-unplugged-c864df57-e86e-439c-88f6-198c1e0cf48c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.561 2 DEBUG nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.561 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.562 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.562 2 DEBUG oslo_concurrency.lockutils [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.563 2 DEBUG nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] No waiting events found dispatching network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.563 2 WARNING nova.compute.manager [req-7eaea00f-1a7c-4622-94cd-10053afbfb3e req-e3462a3c-3435-4c23-803a-5c5061b4b00a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Received unexpected event network-vif-plugged-c864df57-e86e-439c-88f6-198c1e0cf48c for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.637 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.637 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.638 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.638 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.747 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.807 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.808 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.863 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.871 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.938 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.939 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:44 np0005476733 nova_compute[192580]: 2025-10-08 15:31:44.999 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.190 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.192 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12183MB free_disk=111.03909683227539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.192 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.192 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.309 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 27fa9a5a-04a0-4d80-b75d-564df1c974e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.310 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 066ef28b-88ac-4f5c-acae-3458c3e19762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.310 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 164d69c5-58d2-413e-9b1f-907b5cc12d9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.310 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.311 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2688MB phys_disk=119GB used_disk=21GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.462 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.491 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.535 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:31:45 np0005476733 nova_compute[192580]: 2025-10-08 15:31:45.535 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.514 2 DEBUG nova.network.neutron [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.547 2 INFO nova.compute.manager [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Took 2.48 seconds to deallocate network for instance.#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.632 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.632 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.753 2 DEBUG nova.compute.provider_tree [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.775 2 DEBUG nova.scheduler.client.report [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.816 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.849 2 INFO nova.scheduler.client.report [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Deleted allocations for instance 164d69c5-58d2-413e-9b1f-907b5cc12d9b#033[00m
Oct  8 11:31:46 np0005476733 nova_compute[192580]: 2025-10-08 15:31:46.976 2 DEBUG oslo_concurrency.lockutils [None req-ae87eb3c-59f4-44b6-8bcc-0963cd98144a b35a1072024b4c6598970391dd8abb59 c98d2b93a2394e67a5e6525145c5bca5 - - default default] Lock "164d69c5-58d2-413e-9b1f-907b5cc12d9b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:47 np0005476733 podman[232721]: 2025-10-08 15:31:47.230935991 +0000 UTC m=+0.060004029 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:31:47 np0005476733 nova_compute[192580]: 2025-10-08 15:31:47.536 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:47 np0005476733 nova_compute[192580]: 2025-10-08 15:31:47.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.685 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.685 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.686 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.686 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.686 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.687 2 INFO nova.compute.manager [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Terminating instance#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.688 2 DEBUG nova.compute.manager [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:31:48 np0005476733 kernel: tap23f6a943-ce (unregistering): left promiscuous mode
Oct  8 11:31:48 np0005476733 NetworkManager[51699]: <info>  [1759937508.7113] device (tap23f6a943-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:48Z|00409|binding|INFO|Releasing lport 23f6a943-ce2f-4958-a0c6-73f789517892 from this chassis (sb_readonly=0)
Oct  8 11:31:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:48Z|00410|binding|INFO|Setting lport 23f6a943-ce2f-4958-a0c6-73f789517892 down in Southbound
Oct  8 11:31:48 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:48Z|00411|binding|INFO|Removing iface tap23f6a943-ce ovn-installed in OVS
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:48.735 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:f6:e1 10.100.0.13'], port_security=['fa:16:3e:38:f6:e1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '27fa9a5a-04a0-4d80-b75d-564df1c974e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=23f6a943-ce2f-4958-a0c6-73f789517892) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:31:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:48.736 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 23f6a943-ce2f-4958-a0c6-73f789517892 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:31:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:48.739 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:31:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:48.740 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f26f9457-1333-4e99-b82b-d767a503223f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:48.741 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace which is not needed anymore#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000027.scope: Deactivated successfully.
Oct  8 11:31:48 np0005476733 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000027.scope: Consumed 45.981s CPU time.
Oct  8 11:31:48 np0005476733 systemd-machined[152624]: Machine qemu-25-instance-00000027 terminated.
Oct  8 11:31:48 np0005476733 podman[232755]: 2025-10-08 15:31:48.811025561 +0000 UTC m=+0.069233265 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:31:48 np0005476733 podman[232752]: 2025-10-08 15:31:48.831140758 +0000 UTC m=+0.092081341 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [NOTICE]   (230816) : haproxy version is 2.8.14-c23fe91
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [NOTICE]   (230816) : path to executable is /usr/sbin/haproxy
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [WARNING]  (230816) : Exiting Master process...
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [WARNING]  (230816) : Exiting Master process...
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [ALERT]    (230816) : Current worker (230818) exited with code 143 (Terminated)
Oct  8 11:31:48 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[230812]: [WARNING]  (230816) : All workers exited. Exiting... (0)
Oct  8 11:31:48 np0005476733 systemd[1]: libpod-13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803.scope: Deactivated successfully.
Oct  8 11:31:48 np0005476733 podman[232813]: 2025-10-08 15:31:48.875645738 +0000 UTC m=+0.047938112 container died 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:31:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803-userdata-shm.mount: Deactivated successfully.
Oct  8 11:31:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-e8472d2d34892162c03e9b5e37746390044b6e365ede262b47cf39b7b04ac757-merged.mount: Deactivated successfully.
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 podman[232813]: 2025-10-08 15:31:48.917579856 +0000 UTC m=+0.089872260 container cleanup 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 systemd[1]: libpod-conmon-13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803.scope: Deactivated successfully.
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.955 2 INFO nova.virt.libvirt.driver [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Instance destroyed successfully.#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.956 2 DEBUG nova.objects.instance [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid 27fa9a5a-04a0-4d80-b75d-564df1c974e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.971 2 DEBUG nova.virt.libvirt.vif [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:29:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_multicast_east_west-1699367735',display_name='tempest-test_multicast_east_west-1699367735',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-east-west-1699367735',id=39,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:29:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-l53p72t9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:29:29Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=27fa9a5a-04a0-4d80-b75d-564df1c974e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.972 2 DEBUG nova.network.os_vif_util [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "23f6a943-ce2f-4958-a0c6-73f789517892", "address": "fa:16:3e:38:f6:e1", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23f6a943-ce", "ovs_interfaceid": "23f6a943-ce2f-4958-a0c6-73f789517892", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.973 2 DEBUG nova.network.os_vif_util [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.973 2 DEBUG os_vif [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.976 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23f6a943-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.983 2 INFO os_vif [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:f6:e1,bridge_name='br-int',has_traffic_filtering=True,id=23f6a943-ce2f-4958-a0c6-73f789517892,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23f6a943-ce')#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.984 2 INFO nova.virt.libvirt.driver [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Deleting instance files /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8_del#033[00m
Oct  8 11:31:48 np0005476733 nova_compute[192580]: 2025-10-08 15:31:48.985 2 INFO nova.virt.libvirt.driver [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Deletion of /var/lib/nova/instances/27fa9a5a-04a0-4d80-b75d-564df1c974e8_del complete#033[00m
Oct  8 11:31:49 np0005476733 podman[232854]: 2025-10-08 15:31:49.002019889 +0000 UTC m=+0.051142764 container remove 13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.009 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[19343663-2821-4b2f-adde-2caeac978ddf]: (4, ('Wed Oct  8 03:31:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803)\n13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803\nWed Oct  8 03:31:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803)\n13a46010a3529927506e2994b6e30e9d88632466e005671e6955a4114e297803\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.010 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4832583f-5a88-4d7b-a09b-9a13f444fa40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.011 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:49 np0005476733 kernel: tap30cdfb1e-70: left promiscuous mode
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.018 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[10839be6-9e9f-40f1-939d-9e9e5647a401]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.051 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3307dbd9-c4e5-47e7-966a-26926733c2d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.052 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4c4447-a728-45fc-8a50-03565c7c3278]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.063 2 INFO nova.compute.manager [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.064 2 DEBUG oslo.service.loopingcall [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.064 2 DEBUG nova.compute.manager [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.064 2 DEBUG nova.network.neutron [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.073 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32361450-7d2b-4ebb-83db-d7f656f704da]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427153, 'reachable_time': 34274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232872, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.075 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:31:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:31:49.076 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[b192f938-79cf-416b-9876-b55528e88f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:31:49 np0005476733 systemd[1]: run-netns-ovnmeta\x2d30cdfb1e\x2d750a\x2d4d0e\x2d9e9c\x2d321b06b371b9.mount: Deactivated successfully.
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.821 2 DEBUG nova.compute.manager [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-unplugged-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.822 2 DEBUG oslo_concurrency.lockutils [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.822 2 DEBUG oslo_concurrency.lockutils [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.822 2 DEBUG oslo_concurrency.lockutils [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.822 2 DEBUG nova.compute.manager [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] No waiting events found dispatching network-vif-unplugged-23f6a943-ce2f-4958-a0c6-73f789517892 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:49 np0005476733 nova_compute[192580]: 2025-10-08 15:31:49.823 2 DEBUG nova.compute.manager [req-5c08c9d4-39d3-459c-a556-40d207a8740f req-986d4255-d3d5-4f78-a976-b9e9efe749b8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-unplugged-23f6a943-ce2f-4958-a0c6-73f789517892 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:31:50 np0005476733 nova_compute[192580]: 2025-10-08 15:31:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:50 np0005476733 nova_compute[192580]: 2025-10-08 15:31:50.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.119 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937496.1174448, a7cf9795-ac6e-4d38-8500-755c39931e14 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.119 2 INFO nova.compute.manager [-] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.146 2 DEBUG nova.compute.manager [None req-454f743f-a428-4493-8912-fbd0d394b148 - - - - - -] [instance: a7cf9795-ac6e-4d38-8500-755c39931e14] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.560 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937496.5595539, 307216ea-7a54-4279-80a4-70a83f9056e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.560 2 INFO nova.compute.manager [-] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.585 2 DEBUG nova.compute.manager [None req-8f37d4b1-21e7-44dd-839e-a6e5006b8c94 - - - - - -] [instance: 307216ea-7a54-4279-80a4-70a83f9056e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.928 2 DEBUG nova.compute.manager [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.929 2 DEBUG oslo_concurrency.lockutils [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.929 2 DEBUG oslo_concurrency.lockutils [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.930 2 DEBUG oslo_concurrency.lockutils [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.930 2 DEBUG nova.compute.manager [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] No waiting events found dispatching network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.931 2 WARNING nova.compute.manager [req-4829f260-6aeb-4235-87c3-101e38a80d04 req-b1ddfe39-94e4-42f0-8730-9c8b36756c1c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received unexpected event network-vif-plugged-23f6a943-ce2f-4958-a0c6-73f789517892 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.933 2 DEBUG nova.network.neutron [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:51 np0005476733 nova_compute[192580]: 2025-10-08 15:31:51.958 2 INFO nova.compute.manager [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Took 2.89 seconds to deallocate network for instance.#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.004 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.005 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.129 2 DEBUG nova.compute.provider_tree [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.154 2 DEBUG nova.scheduler.client.report [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.184 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.219 2 INFO nova.scheduler.client.report [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance 27fa9a5a-04a0-4d80-b75d-564df1c974e8#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.317 2 DEBUG oslo_concurrency.lockutils [None req-8a180b1d-f572-49d0-a113-beac09b10594 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "27fa9a5a-04a0-4d80-b75d-564df1c974e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:52Z|00412|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:52 np0005476733 nova_compute[192580]: 2025-10-08 15:31:52.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:53 np0005476733 nova_compute[192580]: 2025-10-08 15:31:53.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:54 np0005476733 nova_compute[192580]: 2025-10-08 15:31:54.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:54 np0005476733 nova_compute[192580]: 2025-10-08 15:31:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:54 np0005476733 nova_compute[192580]: 2025-10-08 15:31:54.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:31:54 np0005476733 nova_compute[192580]: 2025-10-08 15:31:54.610 2 DEBUG nova.compute.manager [req-85c4dfa1-8576-4801-804b-33e0c241aef9 req-c224f542-9e17-4f0b-a7c9-29355424e9c5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Received event network-vif-deleted-23f6a943-ce2f-4958-a0c6-73f789517892 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.133 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.133 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.133 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.728 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.729 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.752 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.837 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.837 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.847 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:31:55 np0005476733 nova_compute[192580]: 2025-10-08 15:31:55.848 2 INFO nova.compute.claims [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.048 2 DEBUG nova.compute.provider_tree [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.076 2 DEBUG nova.scheduler.client.report [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.100 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.101 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.154 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.155 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.175 2 INFO nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.194 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.308 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.310 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.311 2 INFO nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Creating image(s)#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.312 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.313 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.314 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.330 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.402 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.403 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.403 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.416 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.492 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.493 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.520 2 DEBUG nova.policy [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.537 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.539 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.539 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.624 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.625 2 DEBUG nova.virt.disk.api [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Checking if we can resize image /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.626 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.698 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.700 2 DEBUG nova.virt.disk.api [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Cannot resize image /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.700 2 DEBUG nova.objects.instance [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lazy-loading 'migration_context' on Instance uuid e7b170f9-efdc-458b-a2e6-04c7f2072900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.745 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.745 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Ensure instance console log exists: /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.746 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.746 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:31:56 np0005476733 nova_compute[192580]: 2025-10-08 15:31:56.746 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.130 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.135 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Successfully created port: 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.165 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.166 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.168 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:58 np0005476733 podman[232889]: 2025-10-08 15:31:58.24423307 +0000 UTC m=+0.069744872 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:31:58 np0005476733 podman[232888]: 2025-10-08 15:31:58.251914108 +0000 UTC m=+0.077253064 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.966 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937503.9654417, 164d69c5-58d2-413e-9b1f-907b5cc12d9b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.967 2 INFO nova.compute.manager [-] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:58 np0005476733 nova_compute[192580]: 2025-10-08 15:31:58.995 2 DEBUG nova.compute.manager [None req-eb2146af-de27-4332-a513-5a690eed4452 - - - - - -] [instance: 164d69c5-58d2-413e-9b1f-907b5cc12d9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.160 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:31:59Z|00413|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.818 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Successfully updated port: 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.832 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.832 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquired lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:31:59 np0005476733 nova_compute[192580]: 2025-10-08 15:31:59.832 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:32:00 np0005476733 nova_compute[192580]: 2025-10-08 15:32:00.186 2 DEBUG nova.compute.manager [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-changed-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:00 np0005476733 nova_compute[192580]: 2025-10-08 15:32:00.186 2 DEBUG nova.compute.manager [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Refreshing instance network info cache due to event network-changed-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:32:00 np0005476733 nova_compute[192580]: 2025-10-08 15:32:00.186 2 DEBUG oslo_concurrency.lockutils [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:32:00 np0005476733 nova_compute[192580]: 2025-10-08 15:32:00.338 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.440 2 DEBUG nova.network.neutron [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updating instance_info_cache with network_info: [{"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.462 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Releasing lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.462 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Instance network_info: |[{"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.463 2 DEBUG oslo_concurrency.lockutils [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.464 2 DEBUG nova.network.neutron [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Refreshing network info cache for port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.466 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Start _get_guest_xml network_info=[{"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.472 2 WARNING nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.477 2 DEBUG nova.virt.libvirt.host [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.477 2 DEBUG nova.virt.libvirt.host [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.481 2 DEBUG nova.virt.libvirt.host [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.482 2 DEBUG nova.virt.libvirt.host [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.482 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.483 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.483 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.483 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.483 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.484 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.484 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.484 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.484 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.485 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.485 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.485 2 DEBUG nova.virt.hardware [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.490 2 DEBUG nova.virt.libvirt.vif [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:31:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-976377571',display_name='tempest-server-test-976377571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-976377571',id=51,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFGOBkGCcnIhZLnHKhYm7hKKuHgkmXVeeupmTvZ0MNWUPxIG3Vb+h0B4+BO+V7hH0rIdgfIY/0I7XuFYkR1QQ36NlsKH4DmMxnH65ozcPPxhe9fqB9OYl0FmItPfvXCDQ==',key_name='tempest-keypair-test-2773249',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0ff332fd7f14bd0831aa78a16065653',ramdisk_id='',reservation_id='r-4nxuos03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio'
,image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1600801023',owner_user_name='tempest-NetworkDefaultSecGroupTest-1600801023-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:31:56Z,user_data=None,user_id='c45a2b13dbdc4134a7829d83659d4dfd',uuid=e7b170f9-efdc-458b-a2e6-04c7f2072900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.490 2 DEBUG nova.network.os_vif_util [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converting VIF {"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.491 2 DEBUG nova.network.os_vif_util [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.493 2 DEBUG nova.objects.instance [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lazy-loading 'pci_devices' on Instance uuid e7b170f9-efdc-458b-a2e6-04c7f2072900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.511 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <uuid>e7b170f9-efdc-458b-a2e6-04c7f2072900</uuid>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <name>instance-00000033</name>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-976377571</nova:name>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:32:02</nova:creationTime>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:user uuid="c45a2b13dbdc4134a7829d83659d4dfd">tempest-NetworkDefaultSecGroupTest-1600801023-project-member</nova:user>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:project uuid="c0ff332fd7f14bd0831aa78a16065653">tempest-NetworkDefaultSecGroupTest-1600801023</nova:project>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        <nova:port uuid="6a82b4a7-2453-4ee1-866e-a6fe2175b5c4">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="serial">e7b170f9-efdc-458b-a2e6-04c7f2072900</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="uuid">e7b170f9-efdc-458b-a2e6-04c7f2072900</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.config"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:0a:fc:11"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <target dev="tap6a82b4a7-24"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/console.log" append="off"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:32:02 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:32:02 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:32:02 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:32:02 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.513 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Preparing to wait for external event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.513 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.513 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.514 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.514 2 DEBUG nova.virt.libvirt.vif [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:31:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-976377571',display_name='tempest-server-test-976377571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-976377571',id=51,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFGOBkGCcnIhZLnHKhYm7hKKuHgkmXVeeupmTvZ0MNWUPxIG3Vb+h0B4+BO+V7hH0rIdgfIY/0I7XuFYkR1QQ36NlsKH4DmMxnH65ozcPPxhe9fqB9OYl0FmItPfvXCDQ==',key_name='tempest-keypair-test-2773249',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c0ff332fd7f14bd0831aa78a16065653',ramdisk_id='',reservation_id='r-4nxuos03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_mode
l='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NetworkDefaultSecGroupTest-1600801023',owner_user_name='tempest-NetworkDefaultSecGroupTest-1600801023-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:31:56Z,user_data=None,user_id='c45a2b13dbdc4134a7829d83659d4dfd',uuid=e7b170f9-efdc-458b-a2e6-04c7f2072900,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.515 2 DEBUG nova.network.os_vif_util [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converting VIF {"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.516 2 DEBUG nova.network.os_vif_util [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.516 2 DEBUG os_vif [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a82b4a7-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a82b4a7-24, col_values=(('external_ids', {'iface-id': '6a82b4a7-2453-4ee1-866e-a6fe2175b5c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:fc:11', 'vm-uuid': 'e7b170f9-efdc-458b-a2e6-04c7f2072900'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:02 np0005476733 NetworkManager[51699]: <info>  [1759937522.5273] manager: (tap6a82b4a7-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.539 2 INFO os_vif [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24')#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.605 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.605 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.605 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] No VIF found with MAC fa:16:3e:0a:fc:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:32:02 np0005476733 nova_compute[192580]: 2025-10-08 15:32:02.606 2 INFO nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Using config drive#033[00m
Oct  8 11:32:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:03.322 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:32:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:03.323 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:32:03 np0005476733 nova_compute[192580]: 2025-10-08 15:32:03.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:03 np0005476733 nova_compute[192580]: 2025-10-08 15:32:03.954 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937508.9540708, 27fa9a5a-04a0-4d80-b75d-564df1c974e8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:32:03 np0005476733 nova_compute[192580]: 2025-10-08 15:32:03.955 2 INFO nova.compute.manager [-] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:32:03 np0005476733 nova_compute[192580]: 2025-10-08 15:32:03.972 2 DEBUG nova.compute.manager [None req-a76f180d-2471-47c6-b300-b01e0ae39ee5 - - - - - -] [instance: 27fa9a5a-04a0-4d80-b75d-564df1c974e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.370 2 INFO nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Creating config drive at /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.config#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.376 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mi9reqq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.508 2 DEBUG oslo_concurrency.processutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mi9reqq" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 kernel: tap6a82b4a7-24: entered promiscuous mode
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.5934] manager: (tap6a82b4a7-24): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Oct  8 11:32:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:04Z|00414|binding|INFO|Claiming lport 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 for this chassis.
Oct  8 11:32:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:04Z|00415|binding|INFO|6a82b4a7-2453-4ee1-866e-a6fe2175b5c4: Claiming fa:16:3e:0a:fc:11 10.100.0.13
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.619 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:fc:11 10.100.0.13'], port_security=['fa:16:3e:0a:fc:11 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7fe4641-81c3-446f-bec0-114221bc2533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d602ec4-c136-4188-b9a6-e7299f4a8d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=812093c9-6f12-436a-826a-1a6fb93b9ea7, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.620 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 in datapath d7fe4641-81c3-446f-bec0-114221bc2533 bound to our chassis#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.623 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7fe4641-81c3-446f-bec0-114221bc2533#033[00m
Oct  8 11:32:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:04Z|00416|binding|INFO|Setting lport 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 ovn-installed in OVS
Oct  8 11:32:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:04Z|00417|binding|INFO|Setting lport 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 up in Southbound
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.636 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[efffb283-55b3-4060-80d6-3eb8e6025413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.638 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7fe4641-81 in ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:32:04 np0005476733 systemd-udevd[232953]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.641 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7fe4641-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.642 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b9dbdc60-5c1e-4acc-a86c-fe4217b0aaca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.643 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3de7ed7b-553f-4e5c-b12b-fea9b94f3e5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.6542] device (tap6a82b4a7-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.6554] device (tap6a82b4a7-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:32:04 np0005476733 systemd-machined[152624]: New machine qemu-30-instance-00000033.
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.658 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[233b4e94-2ba3-4478-8717-b7a3dbe91a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 systemd[1]: Started Virtual Machine qemu-30-instance-00000033.
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.678 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f4316792-de6a-4299-a4be-73c95131bac0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.713 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1e932ed6-5818-4c90-96ee-5b1689ebfe26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.7195] manager: (tapd7fe4641-80): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.718 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[745e270f-f5ef-4b5c-9f7a-aadc173a24b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 systemd-udevd[232957]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.758 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[cd9d62f6-9d2e-45cd-b0f2-0a813132310f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.764 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a8547342-806e-4548-928c-2ec9e7a31b67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.7936] device (tapd7fe4641-80): carrier: link connected
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.800 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8c3deb-8a47-4161-8f16-6b643ee81c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.822 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49b9a3d0-37d6-49fc-acb9-714aac27ddb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7fe4641-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:2a:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442845, 'reachable_time': 18640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232986, 'error': None, 'target': 'ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.842 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c393a3-fafa-449e-a2a9-95487ef9c418]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:2a18'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442845, 'tstamp': 442845}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232987, 'error': None, 'target': 'ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.866 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[39ed369e-5ff2-4ec6-bab5-d03226eec903]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7fe4641-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:2a:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442845, 'reachable_time': 18640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232988, 'error': None, 'target': 'ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.904 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bde421bf-b724-4b2c-b4cf-ed1ebdc9c5bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.971 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86b6db01-d702-4e15-85e9-8ffdb89f051d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.973 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7fe4641-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.973 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.974 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7fe4641-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:04 np0005476733 NetworkManager[51699]: <info>  [1759937524.9765] manager: (tapd7fe4641-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Oct  8 11:32:04 np0005476733 kernel: tapd7fe4641-80: entered promiscuous mode
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.979 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7fe4641-80, col_values=(('external_ids', {'iface-id': '273f6d13-1643-4fbf-8405-0ca2e0049b96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:04Z|00418|binding|INFO|Releasing lport 273f6d13-1643-4fbf-8405-0ca2e0049b96 from this chassis (sb_readonly=0)
Oct  8 11:32:04 np0005476733 nova_compute[192580]: 2025-10-08 15:32:04.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.994 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7fe4641-81c3-446f-bec0-114221bc2533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7fe4641-81c3-446f-bec0-114221bc2533.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.995 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9b313f41-cb28-4b7b-a405-19b54ea6bd13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.996 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-d7fe4641-81c3-446f-bec0-114221bc2533
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/d7fe4641-81c3-446f-bec0-114221bc2533.pid.haproxy
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID d7fe4641-81c3-446f-bec0-114221bc2533
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:32:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:04.996 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533', 'env', 'PROCESS_TAG=haproxy-d7fe4641-81c3-446f-bec0-114221bc2533', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7fe4641-81c3-446f-bec0-114221bc2533.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:32:05 np0005476733 podman[233025]: 2025-10-08 15:32:05.399517114 +0000 UTC m=+0.061210468 container create 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:32:05 np0005476733 systemd[1]: Started libpod-conmon-6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7.scope.
Oct  8 11:32:05 np0005476733 podman[233025]: 2025-10-08 15:32:05.364711716 +0000 UTC m=+0.026405090 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:32:05 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:32:05 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8d2ac4e2071f1e4f11f384f7b701759bd7d59e43cab4d8e8b1428b0b4c31455/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:32:05 np0005476733 podman[233025]: 2025-10-08 15:32:05.507545376 +0000 UTC m=+0.169238750 container init 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:32:05 np0005476733 podman[233025]: 2025-10-08 15:32:05.513031922 +0000 UTC m=+0.174725296 container start 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:32:05 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [NOTICE]   (233044) : New worker (233046) forked
Oct  8 11:32:05 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [NOTICE]   (233044) : Loading success.
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.676 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937525.6758451, e7b170f9-efdc-458b-a2e6-04c7f2072900 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.677 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] VM Started (Lifecycle Event)#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.711 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.716 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937525.677489, e7b170f9-efdc-458b-a2e6-04c7f2072900 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.716 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.734 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.741 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:32:05 np0005476733 nova_compute[192580]: 2025-10-08 15:32:05.763 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.929 2 DEBUG nova.compute.manager [req-e3d70eca-1453-4b2c-9460-1d0917bc58de req-3e38fc52-a9c9-47f3-b7ae-90493244ada5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.929 2 DEBUG oslo_concurrency.lockutils [req-e3d70eca-1453-4b2c-9460-1d0917bc58de req-3e38fc52-a9c9-47f3-b7ae-90493244ada5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.929 2 DEBUG oslo_concurrency.lockutils [req-e3d70eca-1453-4b2c-9460-1d0917bc58de req-3e38fc52-a9c9-47f3-b7ae-90493244ada5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.930 2 DEBUG oslo_concurrency.lockutils [req-e3d70eca-1453-4b2c-9460-1d0917bc58de req-3e38fc52-a9c9-47f3-b7ae-90493244ada5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.930 2 DEBUG nova.compute.manager [req-e3d70eca-1453-4b2c-9460-1d0917bc58de req-3e38fc52-a9c9-47f3-b7ae-90493244ada5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Processing event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.930 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.934 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937526.9341574, e7b170f9-efdc-458b-a2e6-04c7f2072900 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.934 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.936 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.939 2 INFO nova.virt.libvirt.driver [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Instance spawned successfully.#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.939 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.965 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.969 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.982 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.982 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.983 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.984 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.984 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.985 2 DEBUG nova.virt.libvirt.driver [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:32:06 np0005476733 nova_compute[192580]: 2025-10-08 15:32:06.990 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.057 2 INFO nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Took 10.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.058 2 DEBUG nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.149 2 INFO nova.compute.manager [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Took 11.34 seconds to build instance.#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.179 2 DEBUG oslo_concurrency.lockutils [None req-fa0fc673-acf3-4044-93ad-b5a5efb435c5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.225 2 DEBUG nova.network.neutron [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updated VIF entry in instance network info cache for port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.226 2 DEBUG nova.network.neutron [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updating instance_info_cache with network_info: [{"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:07 np0005476733 podman[233055]: 2025-10-08 15:32:07.24240612 +0000 UTC m=+0.060339839 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.244 2 DEBUG oslo_concurrency.lockutils [req-a8605719-cee1-4047-b99a-44c725643e41 req-d138507a-3980-4e94-bc56-aec6f8ff8644 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:07 np0005476733 nova_compute[192580]: 2025-10-08 15:32:07.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:08.326 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:08 np0005476733 nova_compute[192580]: 2025-10-08 15:32:08.737 2 INFO nova.compute.manager [None req-252accf6-06a7-4c89-8857-962ed6ed8976 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Get console output#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.018 2 DEBUG nova.compute.manager [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.019 2 DEBUG oslo_concurrency.lockutils [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.020 2 DEBUG oslo_concurrency.lockutils [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.020 2 DEBUG oslo_concurrency.lockutils [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.020 2 DEBUG nova.compute.manager [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] No waiting events found dispatching network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.021 2 WARNING nova.compute.manager [req-3989c1d0-e9c7-43a2-bd4a-51193b87d931 req-6c702409-8e94-4d65-8780-69d2323fa78c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received unexpected event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:32:09 np0005476733 nova_compute[192580]: 2025-10-08 15:32:09.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:10Z|00419|binding|INFO|Releasing lport 273f6d13-1643-4fbf-8405-0ca2e0049b96 from this chassis (sb_readonly=0)
Oct  8 11:32:10 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:10Z|00420|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:32:10 np0005476733 nova_compute[192580]: 2025-10-08 15:32:10.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:11 np0005476733 podman[233077]: 2025-10-08 15:32:11.327793885 +0000 UTC m=+0.141635043 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:32:11 np0005476733 podman[233078]: 2025-10-08 15:32:11.339484681 +0000 UTC m=+0.147825283 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:32:11 np0005476733 systemd-logind[827]: New session 38 of user zuul.
Oct  8 11:32:11 np0005476733 systemd[1]: Started Session 38 of User zuul.
Oct  8 11:32:11 np0005476733 systemd[1]: session-38.scope: Deactivated successfully.
Oct  8 11:32:11 np0005476733 systemd-logind[827]: Session 38 logged out. Waiting for processes to exit.
Oct  8 11:32:11 np0005476733 systemd-logind[827]: Removed session 38.
Oct  8 11:32:12 np0005476733 nova_compute[192580]: 2025-10-08 15:32:12.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:13 np0005476733 nova_compute[192580]: 2025-10-08 15:32:13.884 2 INFO nova.compute.manager [None req-7e9f7d49-f6ab-475a-9f18-79e0d3d2c156 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Get console output#033[00m
Oct  8 11:32:14 np0005476733 nova_compute[192580]: 2025-10-08 15:32:14.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:17 np0005476733 nova_compute[192580]: 2025-10-08 15:32:17.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:18 np0005476733 podman[233162]: 2025-10-08 15:32:18.26724045 +0000 UTC m=+0.091770100 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:32:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:18Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:fc:11 10.100.0.13
Oct  8 11:32:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:18Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:fc:11 10.100.0.13
Oct  8 11:32:19 np0005476733 nova_compute[192580]: 2025-10-08 15:32:19.039 2 INFO nova.compute.manager [None req-c6b51b6c-135d-4b73-89fe-49523a4f03d5 c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Get console output#033[00m
Oct  8 11:32:19 np0005476733 podman[233182]: 2025-10-08 15:32:19.238077341 +0000 UTC m=+0.066940903 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd)
Oct  8 11:32:19 np0005476733 podman[233183]: 2025-10-08 15:32:19.269441179 +0000 UTC m=+0.092016328 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:32:19 np0005476733 nova_compute[192580]: 2025-10-08 15:32:19.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:20Z|00421|pinctrl|WARN|Dropped 8077 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:32:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:20Z|00422|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:32:21 np0005476733 nova_compute[192580]: 2025-10-08 15:32:21.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:22 np0005476733 nova_compute[192580]: 2025-10-08 15:32:22.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:24 np0005476733 nova_compute[192580]: 2025-10-08 15:32:24.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:25 np0005476733 nova_compute[192580]: 2025-10-08 15:32:25.680 2 DEBUG nova.compute.manager [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-changed-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:25 np0005476733 nova_compute[192580]: 2025-10-08 15:32:25.680 2 DEBUG nova.compute.manager [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Refreshing instance network info cache due to event network-changed-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:32:25 np0005476733 nova_compute[192580]: 2025-10-08 15:32:25.680 2 DEBUG oslo_concurrency.lockutils [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:32:25 np0005476733 nova_compute[192580]: 2025-10-08 15:32:25.681 2 DEBUG oslo_concurrency.lockutils [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:32:25 np0005476733 nova_compute[192580]: 2025-10-08 15:32:25.681 2 DEBUG nova.network.neutron [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Refreshing network info cache for port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:26.315 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:26.317 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:26.318 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:27 np0005476733 nova_compute[192580]: 2025-10-08 15:32:27.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:27 np0005476733 nova_compute[192580]: 2025-10-08 15:32:27.844 2 DEBUG nova.network.neutron [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updated VIF entry in instance network info cache for port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:32:27 np0005476733 nova_compute[192580]: 2025-10-08 15:32:27.845 2 DEBUG nova.network.neutron [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updating instance_info_cache with network_info: [{"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:27 np0005476733 nova_compute[192580]: 2025-10-08 15:32:27.870 2 DEBUG oslo_concurrency.lockutils [req-be96ead8-0d26-4492-8bc7-50c66c361da9 req-7549bd07-596f-4d2d-a180-f194586c6d6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e7b170f9-efdc-458b-a2e6-04c7f2072900" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:29 np0005476733 podman[233239]: 2025-10-08 15:32:29.239658005 +0000 UTC m=+0.058831702 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:32:29 np0005476733 podman[233238]: 2025-10-08 15:32:29.242887159 +0000 UTC m=+0.064985320 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:32:29 np0005476733 nova_compute[192580]: 2025-10-08 15:32:29.597 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "interface-066ef28b-88ac-4f5c-acae-3458c3e19762-50d486c7-b030-4d82-8b22-2f71cd277074" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:29 np0005476733 nova_compute[192580]: 2025-10-08 15:32:29.598 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-066ef28b-88ac-4f5c-acae-3458c3e19762-50d486c7-b030-4d82-8b22-2f71cd277074" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:29 np0005476733 nova_compute[192580]: 2025-10-08 15:32:29.598 2 DEBUG nova.objects.instance [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'flavor' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:32:29 np0005476733 nova_compute[192580]: 2025-10-08 15:32:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:30 np0005476733 nova_compute[192580]: 2025-10-08 15:32:30.093 2 DEBUG nova.objects.instance [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:32:30 np0005476733 nova_compute[192580]: 2025-10-08 15:32:30.115 2 DEBUG nova.network.neutron [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:32:30 np0005476733 nova_compute[192580]: 2025-10-08 15:32:30.530 2 DEBUG nova.policy [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.776 2 DEBUG nova.network.neutron [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Successfully updated port: 50d486c7-b030-4d82-8b22-2f71cd277074 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.789 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.790 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.790 2 DEBUG nova.network.neutron [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.875 2 DEBUG nova.compute.manager [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.876 2 DEBUG nova.compute.manager [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-50d486c7-b030-4d82-8b22-2f71cd277074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:32:31 np0005476733 nova_compute[192580]: 2025-10-08 15:32:31.876 2 DEBUG oslo_concurrency.lockutils [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:32:32 np0005476733 nova_compute[192580]: 2025-10-08 15:32:32.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:33 np0005476733 nova_compute[192580]: 2025-10-08 15:32:33.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.014 2 DEBUG nova.network.neutron [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.056 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.057 2 DEBUG oslo_concurrency.lockutils [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.058 2 DEBUG nova.network.neutron [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 50d486c7-b030-4d82-8b22-2f71cd277074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.061 2 DEBUG nova.virt.libvirt.vif [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:35Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.061 2 DEBUG nova.network.os_vif_util [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.062 2 DEBUG nova.network.os_vif_util [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.062 2 DEBUG os_vif [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50d486c7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50d486c7-b0, col_values=(('external_ids', {'iface-id': '50d486c7-b030-4d82-8b22-2f71cd277074', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:e2:37', 'vm-uuid': '066ef28b-88ac-4f5c-acae-3458c3e19762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.1151] manager: (tap50d486c7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.125 2 INFO os_vif [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0')#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.126 2 DEBUG nova.virt.libvirt.vif [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:35Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.127 2 DEBUG nova.network.os_vif_util [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.128 2 DEBUG nova.network.os_vif_util [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.131 2 DEBUG nova.virt.libvirt.guest [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] attach device xml: <interface type="ethernet">
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:c9:e2:37"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <target dev="tap50d486c7-b0"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:32:34 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 11:32:34 np0005476733 kernel: tap50d486c7-b0: entered promiscuous mode
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.1510] manager: (tap50d486c7-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00423|binding|INFO|Claiming lport 50d486c7-b030-4d82-8b22-2f71cd277074 for this chassis.
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00424|binding|INFO|50d486c7-b030-4d82-8b22-2f71cd277074: Claiming fa:16:3e:c9:e2:37 10.100.0.12
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.163 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:e2:37 10.100.0.12'], port_security=['fa:16:3e:c9:e2:37 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=50d486c7-b030-4d82-8b22-2f71cd277074) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.165 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 50d486c7-b030-4d82-8b22-2f71cd277074 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 bound to our chassis#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.169 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00425|binding|INFO|Setting lport 50d486c7-b030-4d82-8b22-2f71cd277074 ovn-installed in OVS
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00426|binding|INFO|Setting lport 50d486c7-b030-4d82-8b22-2f71cd277074 up in Southbound
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.181 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1fb1a0-3a54-44ac-8709-e84a3c832cf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.182 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:32:34 np0005476733 systemd-udevd[233289]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.184 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.184 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8bf56a-a739-4b57-9eb2-aa178499012d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.185 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[40f7d83a-145a-4d75-8b29-64062573f16b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.201 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[ee1ebbb3-27e6-4bf0-a26c-278103a1841c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.2071] device (tap50d486c7-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.2079] device (tap50d486c7-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.223 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[29c08072-7b51-4c21-ad30-acd1ce3b3e4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.257 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b25c0509-9f3b-430d-b418-fd1b8e2f6aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.260 2 DEBUG nova.virt.libvirt.driver [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.261 2 DEBUG nova.virt.libvirt.driver [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.261 2 DEBUG nova.virt.libvirt.driver [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:85:7d:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.262 2 DEBUG nova.virt.libvirt.driver [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:c9:e2:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:32:34 np0005476733 systemd-udevd[233293]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.2656] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/143)
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.265 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e543144c-d18c-4cc6-8841-b7aac8b8d1a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.297 2 DEBUG nova.virt.libvirt.guest [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:name>tempest-test_bw_limit_tenant_network-1685300098</nova:name>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 15:32:34</nova:creationTime>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:port uuid="8f7d5998-037f-4a70-98a0-8482a8043a7e">
Oct  8 11:32:34 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="192.168.3.176" ipVersion="4"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    <nova:port uuid="50d486c7-b030-4d82-8b22-2f71cd277074">
Oct  8 11:32:34 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:32:34 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 11:32:34 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 11:32:34 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.318 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b9af49-d96e-4898-8d42-bc18c75b6d1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.322 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[58aeed1e-e4cb-44f4-ba58-70f98834e3ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.327 2 DEBUG oslo_concurrency.lockutils [None req-9e833d2d-53a8-430c-86db-778ad33543a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-066ef28b-88ac-4f5c-acae-3458c3e19762-50d486c7-b030-4d82-8b22-2f71cd277074" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:e2:37 10.100.0.12
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.3503] device (tap58a69152-b0): carrier: link connected
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:e2:37 10.100.0.12
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.359 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8e7d2b-a92b-419b-aed8-93fb37976051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.380 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e9f8a5-7dfc-4e9a-95ab-a35ebf8ed07d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445801, 'reachable_time': 40870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233316, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.401 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eef14068-0ac5-4e42-a1e4-7c987738a607]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445801, 'tstamp': 445801}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233317, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.424 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e85cb3ba-0e86-4164-a18d-3315a716a645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445801, 'reachable_time': 40870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233318, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.463 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c6fa8828-26b6-4065-a29c-908f23a16be4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.544 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7caad070-9056-4ce3-8668-9ff0fb4b7b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.547 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.548 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.548 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 NetworkManager[51699]: <info>  [1759937554.5515] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Oct  8 11:32:34 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.555 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:32:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:34Z|00427|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.559 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.560 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[be78dc15-7f9b-4d16-9247-f42eff866097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.562 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:32:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:32:34.563 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 nova_compute[192580]: 2025-10-08 15:32:34.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:34 np0005476733 podman[233350]: 2025-10-08 15:32:34.989268623 +0000 UTC m=+0.057520749 container create 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 11:32:35 np0005476733 systemd[1]: Started libpod-conmon-22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206.scope.
Oct  8 11:32:35 np0005476733 podman[233350]: 2025-10-08 15:32:34.95962879 +0000 UTC m=+0.027880946 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:32:35 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:32:35 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d02dda42a5c32845ea92e289f2a16305f47b8302a7cbf41f14af9b770be60d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:32:35 np0005476733 podman[233350]: 2025-10-08 15:32:35.075764912 +0000 UTC m=+0.144017038 container init 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 11:32:35 np0005476733 podman[233350]: 2025-10-08 15:32:35.082055865 +0000 UTC m=+0.150307971 container start 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:32:35 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [NOTICE]   (233370) : New worker (233372) forked
Oct  8 11:32:35 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [NOTICE]   (233370) : Loading success.
Oct  8 11:32:35 np0005476733 nova_compute[192580]: 2025-10-08 15:32:35.517 2 DEBUG nova.network.neutron [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 50d486c7-b030-4d82-8b22-2f71cd277074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:32:35 np0005476733 nova_compute[192580]: 2025-10-08 15:32:35.517 2 DEBUG nova.network.neutron [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:35 np0005476733 nova_compute[192580]: 2025-10-08 15:32:35.546 2 DEBUG oslo_concurrency.lockutils [req-a7c92f38-6919-4ffd-8deb-2377f6b3676e req-e57e497d-256b-48dc-b613-0613871e7184 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.008 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'name': 'tempest-test_bw_limit_tenant_network-1685300098', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.012 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'name': 'tempest-server-test-976377571', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000033', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c0ff332fd7f14bd0831aa78a16065653', 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'hostId': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.020 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 066ef28b-88ac-4f5c-acae-3458c3e19762 / tap50d486c7-b0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.020 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes.delta volume: 6077 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.022 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.026 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e7b170f9-efdc-458b-a2e6-04c7f2072900 / tap6a82b4a7-24 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.027 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25e6a9a5-4344-4d03-9107-34937f4d9af8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6077, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.013021', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '040a5414-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '1e96a0c566ccd6d3462f807e69aa6570ba1dfdc0f4aba34a361d2983df32d50b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.013021', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '040a6c74-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '66101014362dbcf30e7c449348c77479d882c4842e004e6bd038cb555e407d45'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.013021', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': 
'6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '040b42d4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': 'ff26b6d5bd806f357c50c6ff63abc4d004a0b242fbe5ff28f3b266e1f39b9298'}]}, 'timestamp': '2025-10-08 15:32:36.028239', '_unique_id': 'cfbb8c4d129840b588a29aa30e4da799'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.030 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.058 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 152675840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.059 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.087 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.bytes volume: 73007104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.088 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f12e7ce6-6cfb-4cd0-a4bd-03ff123d12fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 152675840, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.031760', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '04100f30-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': 'f655f1d573c057c15e428993f30750879d138cf1ef472e5a2e3cbf7beafec622'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.031760', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '04102074-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '96f3505b6a0e2af7ebe4174d279cf5b9fb06aa09ab842a1da412cf919d286065'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73007104, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.031760', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04146bc0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '53f4bee05478d58d303d5f67ac87d52ef2ddb5cc429b61517b8e7f70b61c2723'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.031760', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04147e6c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '5d0b40124e1d25d9a9f5569818a2eb05040f11361491a21d8e2cbabfc09a5baa'}]}, 'timestamp': '2025-10-08 15:32:36.088551', '_unique_id': '4cd8125228954a4ebf519581f07511ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.091 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 10410832285 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.091 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.091 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.latency volume: 2947043503 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.092 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c42ba4d-f091-4af0-9850-c685aa9ce0fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10410832285, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.091263', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0414f6c6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': 'a8705e1d6b3cb51edd921808726925fdfa5835f0880feb951625a619281819ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.091263', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0415030a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '59e7a44692972024697fd521d1c8d6e23087d52753a05472a48419d03a8ac47c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2947043503, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.091263', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '041510b6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '3a68f51df96195371e7a63ece21fe34b542e5259a1ab8b71d1706bd2bd1b0e60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.091263', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04151bb0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '47c4fe8f6519f8d7eabbf3dee7aeb1dbe7af996d999fd84342fcf5ac9fca7f84'}]}, 'timestamp': '2025-10-08 15:32:36.092560', '_unique_id': '8d530e2d294a4ae0b4033c15ccd3891e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.094 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes volume: 6187 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.095 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes volume: 1472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.095 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.incoming.bytes volume: 8766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8c8b4a3-c247-46c1-9060-87305d26f4ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6187, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.094762', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '04157f56-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '34349dccfdcc175e82e07e1e0a630a008384a18ead9ba91087174f7d6ee0b970'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1472, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.094762', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '04158c8a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'eb9fb643163961346f9d78851e58923b2550bace39a17d50e947e44fa2e9273f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8766, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.094762', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': 
'6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '041596e4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': 'f4e42fafb0989747796dce36d05ff23eb1f9334c8b8855c8b7fe361a81944a5a'}]}, 'timestamp': '2025-10-08 15:32:36.095708', '_unique_id': 'fb03bfa4fb3949c88304e440712814f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.097 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.112 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/memory.usage volume: 229.609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.127 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/memory.usage volume: 42.89453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1078f836-0d4e-41d4-b862-e87cebeb5a4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 229.609375, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'timestamp': '2025-10-08T15:32:36.097800', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '04183336-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.835174713, 'message_signature': '7bbf46848b8e280e5362e0f4f020e7ec301c5efc9aee609870c5ed526ead63ae'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.89453125, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 
'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'timestamp': '2025-10-08T15:32:36.097800', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '041a9bb2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.850836796, 'message_signature': 'a1bb9b2fcab7e37ef6a2ec508806fc9b6212303045820acf9b15666de2fd065e'}]}, 'timestamp': '2025-10-08 15:32:36.128664', '_unique_id': '170aa4d8f7274e3a90965ed4d8cec1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.131 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.131 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-976377571>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-976377571>]
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.144 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.145 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.155 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.155 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03525a96-334a-4db5-b176-faa8d7787427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.131638', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '041d2788-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': 'cb63734f4e8894b0c7cd31b988b7fbdddeecb2936b683e943ce055ffb4277587'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.131638', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '041d32e6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': '54962e994aa97759620f9f6ac87256c592aab009e5c24c7edfa0032a1f2a482e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.131638', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '041ebc10-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': 'c044808c117c82a5fa3fb42a409526e93a5cefe3b162df817a84f0f54e643b85'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.131638', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '041ec5c0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': 'acd105011914de9e36b31f287229e8313d4faaca533b851420ff7e2da540cf3b'}]}, 'timestamp': '2025-10-08 15:32:36.155858', '_unique_id': '4ed80b98594b4b4fb97f468f1c0f9ae8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.157 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.157 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49d8bab8-9e8e-4fb9-92d5-19ff923633ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.157700', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '041f178c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '5d531f8f07100dfb98994ab009b70a2a84de06cd3e759171843835e43620f4ff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.157700', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '041f2088-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'd255b0bb4414e4c321f47c213edc8d1f2a671595687326a873c30568dfc1a529'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.157700', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 
'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '041f2aba-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '5f8dcbd9beb24567be9ed3313b32033aa7fb266ec91fb6c94d4c8a8d68f266e3'}]}, 'timestamp': '2025-10-08 15:32:36.158431', '_unique_id': '71bf32f54d2e4bc3911ee5334cc39533'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.159 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/cpu volume: 41790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.159 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/cpu volume: 10380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ebd1df2-f096-4e92-853a-287e117b3d1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41790000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'timestamp': '2025-10-08T15:32:36.159606', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '041f623c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.835174713, 'message_signature': '1895e9e17a9955e7f043bfe248d7cd4c4e069981b08f27e7e92ffeeab6c93b8b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10380000000, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 
'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'timestamp': '2025-10-08T15:32:36.159606', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '041f6ba6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.850836796, 'message_signature': '8aab924f62eaa389bc58575e7182f9bda8b6dda6bfcb7848887d6c526bc8a23f'}]}, 'timestamp': '2025-10-08 15:32:36.160179', '_unique_id': '8f82ea35469f4ce1a48e686b60870e52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.161 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.161 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.161 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bde0c7d8-b087-47f5-be8a-5089d986cc17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.161517', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '041fac42-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'a5fedd5a207b7a90e437b943896aad25c51ea40060aa67b22cd326a58d24468a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.161517', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '041fb552-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '321858f366b059bee036bb5088798a2cb6f62f7c0531da1cfde939ccb95ad1e1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.161517', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 
'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '041fbe6c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': 'a44daf8493c9640777c10cdb38306dba190b1f175df9a357e81e0e0a6d41cace'}]}, 'timestamp': '2025-10-08 15:32:36.162213', '_unique_id': '27b173ecf4a44e46acad30020fe4231c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.163 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 773 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.163 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.163 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78a4dfc3-be08-4079-a273-ad932873c02d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 773, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.163433', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '041ff710-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '69b8f977a08b69dab5172319fc043575d91cbed1783b6ddf417eb025cdb69ccb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.163433', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '042000b6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': 'e621c0435ef684ac4e34afe8a1c79722e14ab1b32bf3c117f13c6f3777e6b51b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.163433', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '042008ae-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': 'd71f17e350a12ab9393e1ef811417a3fd4e70450370747f9c3ee55fa9cacbba8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.163433', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '042012cc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '534b50d2decf4f9eb80dd82f29a13581f0e11f4cebe210132c3737df1a9e1f4c'}]}, 'timestamp': '2025-10-08 15:32:36.164364', '_unique_id': '01d0e23990954d5a8850784d4437d7af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.165 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 169873408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.165 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.165 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c483392-e6d1-4f09-afce-7a73e246180e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169873408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.165514', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '04204814-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': 'cd4e0c116ac353e4b07ab51d517e45913c03e03816ffc02a6b2e6af352a9eddf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.165514', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '04205016-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': '55920c84a9fee26ea0e311770a5d9289cea2a8801e8597dd9080558df11ea9ad'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.165514', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '042057e6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': '2562106971a6f2c37f9e1ac2722480b1f83ecc0269409e0c45483f730715a80f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.165514', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04206128-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': 'da1b0acf054916e60a64c16e94b43d7b9de41072a6df33054a4ff658706b4aa0'}]}, 'timestamp': '2025-10-08 15:32:36.166369', '_unique_id': 'f8fe4388578648c9a005e803d5271d82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.167 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.167 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.incoming.packets volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5b278da-e372-4f04-adad-1074eb0c2746', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.167582', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '04209a26-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '7f7cb0901884f0eba047daf0a7dac876bd7efe8bf58c20976ae9178c1052e1c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 9, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.167582', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '0420a4b2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '93cb3294a25718d790ff48ccd58228dc6ffeb91befbc3587f8d42719d8f3afc9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 47, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.167582', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 
'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '0420aeb2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '0bf50dd36c2ee0465ae04a7434ea04aa1233dd5b9ce74fbe2c1ea6c42e510ff1'}]}, 'timestamp': '2025-10-08 15:32:36.168367', '_unique_id': '6fc693733fcf4d1a9ac567fd25607afb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.169 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes.delta volume: 10285 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.169 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c33a9d84-af08-4dfd-8f04-e3435b08989e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10285, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.169559', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '0420e67a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '2641a94233112c0035f5f419532f9cc897b9727f2cf7371f10e47725ecb2be4c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.169559', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '0420eeb8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '2732f01a86c392c64b7a482ea8b24bbd9a349d88915fe8e1a8b21bffb691f983'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.169559', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': 
'6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '0420f782-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '2a8034ea3cae7ea6d1982ce664c80a556a2c3f78a490d4e7dfbcaec3b00bf9bd'}]}, 'timestamp': '2025-10-08 15:32:36.170227', '_unique_id': 'c1b964ca680e4fcfb0c0aa575aa2f6a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-976377571>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-976377571>]
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.171 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.outgoing.packets volume: 72 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7a36049-c421-4ade-95d2-3b54e7bba65c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.171671', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '0421388c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'b12b32da0240f2772c02e076d84aaf967ab064ecff91a45e19b4fe3d1ab9ac59'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 7, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.171671', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '04214110-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'f207987308f73fab2d7ac368baf80b41cbd1889eed57361cc00e4d695d1164d9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 72, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.171671', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 
'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '04214a7a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': 'b51caf7f8d6442fcdb13fadb3f8e97f8228d7bb106e6c42bf84c8282a7105b75'}]}, 'timestamp': '2025-10-08 15:32:36.172348', '_unique_id': 'b2d5ea5abd634b6fa90fc0c440984311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.173 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 169213952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.173 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.173 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10fd3beb-39c2-4659-b985-2d69d13520a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169213952, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.173463', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '04217e82-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': '6945536ec2d96b7bb5606e8f38692a5b628670299ff2b7303a9d9504a6e84ca6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.173463', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '04218710-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.854608827, 'message_signature': '7947a3797db25b3ce031dfb33ba4d9160e58965fd42867b2ac1968bb6c2863ec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.173463', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04218ed6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': '83916be06c61aa98a26877fe01ed6bc40a8bc887b8056dc8abb762a7123ec1ea'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.173463', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04219714-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.868513824, 'message_signature': 'a2efaa30d6829e4d1f1fb634c43c3602d35c056c1fe15fb342c8a8fca986ed58'}]}, 'timestamp': '2025-10-08 15:32:36.174303', '_unique_id': '1c405a562e78461a89deac0cc1455265'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.175 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes volume: 10285 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.175 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes volume: 1144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.175 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.outgoing.bytes volume: 9424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4b6d4f6-c76d-4d43-b380-08c9c3070a41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10285, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.175435', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '0421cb8a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '197c5b7e6ee6b3883977fbd46af84d1785b6d11dc1f4d6ac0d819e5d055cf1a5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1144, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.175435', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '0421d396-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'dbad0798c3b67a4409500c653eefe5ba0580aa8d6ae74decee2e306a603d6071'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9424, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.175435', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': 
'6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '0421dc92-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '658fdbf944a0446dadb00799ae254a501160903ce536fd7d8ba34279fddc38db'}]}, 'timestamp': '2025-10-08 15:32:36.176105', '_unique_id': '0546e5a0e2d7454387b4a63bb0d7d5c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.177 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 9061539075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.177 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 48138693 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.177 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.latency volume: 476977258 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.177 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.latency volume: 50707269 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a459b1bd-1f09-4d72-8762-c4f59ba7259c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9061539075, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.177246', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '04221234-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '65380e63d27e6c932a948630721ffe5d62726bb93153abdca0a3f9eb44a36a9f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48138693, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.177246', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '042219d2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '30ca0cdb0cdaaa7b22f6d07ddb366cde6a7e19277343d499dff7c8c7c3bf2b35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 476977258, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.177246', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04222148-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '89250a8eb42f39cb85100fb64c9afeda6996fc2d55e7435b6a3c94ff8c5291bd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50707269, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.177246', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '042229b8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '095e941f9d6419795dfaf9b75167abb01dc5934cc2f6faef646427fa0b61bb3e'}]}, 'timestamp': '2025-10-08 15:32:36.178058', '_unique_id': 'ee679def0856493fa809cbc5403bea12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.179 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 328451584 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.179 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.179 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.bytes volume: 29465088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.179 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '776837bc-4783-44fb-842d-74ce46f1e885', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328451584, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.179241', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0422602c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '50ce97d0f27a6b9be9fe4c5dd8db28fbf720ff67c374db2afe29e68c61e038e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.179241', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '042267f2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': 'c855a25c854738429e75f838e4707dc7d6a6d06336143e8c8779cbeb77a47e82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29465088, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.179241', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04226f5e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '8e4c1fdd108891f5323fa01c202d3b12db38528cb74d3e8e8f348a91a7628e45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.179241', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '042276b6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': 'a721848d64eedd464baa9f270bd060f3b13056ffca75e4da28bfec76db82e3a5'}]}, 'timestamp': '2025-10-08 15:32:36.180045', '_unique_id': '99e5b28d222543cfa5cb6e7b4372d4fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.181 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.181 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.181 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f59ee2af-1e0a-4811-95b8-f68eae268da8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.181217', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '0422ada2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'c03b016e0a4234aedb569bb5cc33e089c634d122b901f3db95a014e62786a852'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.181217', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '0422b6b2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '6f8e029132da79afb49c2eb180ef62ff877a7c5519a5d226dd0d969a7f7fc262'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.181217', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 
'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '0422c062-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '225acef0c6c630b82ab31dc84c0b5ccaacc8009cde10450c4878b631881d505d'}]}, 'timestamp': '2025-10-08 15:32:36.181926', '_unique_id': '94c8d783e0e5401f93619a61e185e5fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.183 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.183 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.183 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '907462e8-78e6-4cd0-befe-33268428fc23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:32:36.183058', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '0422f744-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': '3bdb6c9ee54a3f7f9607ac30bd0d58204a5f840fae3038cbbcc677be771df7ab'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:32:36.183058', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '0422ff82-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.736031056, 'message_signature': 'fdbef7e46eab614790ca0e8329870514bec5a42c78ba8dcadca0612e1d4c9345'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'instance-00000033-e7b170f9-efdc-458b-a2e6-04c7f2072900-tap6a82b4a7-24', 'timestamp': '2025-10-08T15:32:36.183058', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'tap6a82b4a7-24', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 
'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:fc:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a82b4a7-24'}, 'message_id': '04230752-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.745620735, 'message_signature': '074fca36b7bda54c337cdb64fe9a8c4642f1442c25939ce57c06a4b0709cdddc'}]}, 'timestamp': '2025-10-08 15:32:36.183735', '_unique_id': '4fed734ac4d2406cbb51463e89e9ba7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.184 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-976377571>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-976377571>]
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 11653 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.requests volume: 1065 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.185 12 DEBUG ceilometer.compute.pollsters [-] e7b170f9-efdc-458b-a2e6-04c7f2072900/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb82e224-9d81-4553-99ff-2aa47580111a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11653, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:32:36.185246', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '04234af0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': 'cc8da72456514f47147e21ce4794f4ebd249d691089d1d761e44ba40afe102cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:32:36.185246', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '042352ac-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.75479068, 'message_signature': '4105e4b5e2f8a663de3a370d5738442755a4ece4c5043cd702e9411a24515a89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1065, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-vda', 'timestamp': '2025-10-08T15:32:36.185246', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04235ab8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': '0b197be4862dc0f8ff10fa47eac0165cb823c11cb25d91cd8515ab41a6866172'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'c45a2b13dbdc4134a7829d83659d4dfd', 'user_name': None, 'project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'project_name': None, 'resource_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900-sda', 'timestamp': '2025-10-08T15:32:36.185246', 'resource_metadata': {'display_name': 'tempest-server-test-976377571', 'name': 'instance-00000033', 'instance_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'instance_type': 'm1.nano', 'host': '6b7aece4e8acc15fa10187289c5882aed4ea133b4b323c53037bbf0f', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04236242-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4459.783051958, 'message_signature': 'b4d551e62319b359c270cf950db4ae779e1a65a7cc33790dd7ca1d8e7d1854ff'}]}, 'timestamp': '2025-10-08 15:32:36.186066', '_unique_id': 'b549d622c99f40c89913a0e233f0f494'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.187 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:32:36.187 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-976377571>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-976377571>]
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.780 2 DEBUG nova.compute.manager [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.781 2 DEBUG oslo_concurrency.lockutils [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.781 2 DEBUG oslo_concurrency.lockutils [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.781 2 DEBUG oslo_concurrency.lockutils [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.782 2 DEBUG nova.compute.manager [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.782 2 WARNING nova.compute.manager [req-8f9b1e2f-f7a9-4dec-a6a5-14e972fa54c0 req-13c95d46-7b9b-4a9f-8cc9-07cb091dff09 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received unexpected event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:32:36 np0005476733 nova_compute[192580]: 2025-10-08 15:32:36.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:38 np0005476733 podman[233381]: 2025-10-08 15:32:38.263616252 +0000 UTC m=+0.089310391 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.884 2 DEBUG nova.compute.manager [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.884 2 DEBUG oslo_concurrency.lockutils [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.885 2 DEBUG oslo_concurrency.lockutils [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.885 2 DEBUG oslo_concurrency.lockutils [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.886 2 DEBUG nova.compute.manager [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:32:38 np0005476733 nova_compute[192580]: 2025-10-08 15:32:38.886 2 WARNING nova.compute.manager [req-2264ea25-be16-4016-ac8d-db6efb1b5c8e req-0f567ab7-d279-4967-adca-7ef63d20d6a1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received unexpected event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:32:39 np0005476733 nova_compute[192580]: 2025-10-08 15:32:39.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:39Z|00428|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:32:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:39Z|00429|binding|INFO|Releasing lport 273f6d13-1643-4fbf-8405-0ca2e0049b96 from this chassis (sb_readonly=0)
Oct  8 11:32:39 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:39Z|00430|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:32:39 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 11:32:39 np0005476733 nova_compute[192580]: 2025-10-08 15:32:39.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:39 np0005476733 nova_compute[192580]: 2025-10-08 15:32:39.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:42 np0005476733 podman[233402]: 2025-10-08 15:32:42.251919257 +0000 UTC m=+0.070121345 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:32:42 np0005476733 podman[233401]: 2025-10-08 15:32:42.281346172 +0000 UTC m=+0.101021538 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:32:43 np0005476733 nova_compute[192580]: 2025-10-08 15:32:43.529 2 DEBUG nova.compute.manager [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:32:43 np0005476733 nova_compute[192580]: 2025-10-08 15:32:43.529 2 DEBUG nova.compute.manager [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-50d486c7-b030-4d82-8b22-2f71cd277074. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:32:43 np0005476733 nova_compute[192580]: 2025-10-08 15:32:43.530 2 DEBUG oslo_concurrency.lockutils [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:32:43 np0005476733 nova_compute[192580]: 2025-10-08 15:32:43.530 2 DEBUG oslo_concurrency.lockutils [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:32:43 np0005476733 nova_compute[192580]: 2025-10-08 15:32:43.530 2 DEBUG nova.network.neutron [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 50d486c7-b030-4d82-8b22-2f71cd277074 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:32:44 np0005476733 nova_compute[192580]: 2025-10-08 15:32:44.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:44 np0005476733 nova_compute[192580]: 2025-10-08 15:32:44.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:44 np0005476733 nova_compute[192580]: 2025-10-08 15:32:44.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.866 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.867 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.867 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.867 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:32:46 np0005476733 nova_compute[192580]: 2025-10-08 15:32:46.961 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.028 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.030 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.095 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.105 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.171 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.172 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.237 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.444 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.446 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12750MB free_disk=111.1533317565918GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.446 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.447 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.496 2 DEBUG nova.network.neutron [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 50d486c7-b030-4d82-8b22-2f71cd277074. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.497 2 DEBUG nova.network.neutron [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.516 2 DEBUG oslo_concurrency.lockutils [req-1ddc0507-cd77-4bc8-b166-1d80b3cf6f1c req-71d3c8a5-b134-4590-ab78-78be4dc68970 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.539 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 066ef28b-88ac-4f5c-acae-3458c3e19762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.540 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e7b170f9-efdc-458b-a2e6-04c7f2072900 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.540 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.540 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1664MB phys_disk=119GB used_disk=11GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.636 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.662 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.691 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:32:47 np0005476733 nova_compute[192580]: 2025-10-08 15:32:47.691 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:32:49 np0005476733 nova_compute[192580]: 2025-10-08 15:32:49.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:49 np0005476733 podman[233461]: 2025-10-08 15:32:49.269432084 +0000 UTC m=+0.070173287 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Oct  8 11:32:49 np0005476733 podman[233483]: 2025-10-08 15:32:49.363955481 +0000 UTC m=+0.057796908 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:32:49 np0005476733 podman[233482]: 2025-10-08 15:32:49.368761616 +0000 UTC m=+0.067933034 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:32:49 np0005476733 nova_compute[192580]: 2025-10-08 15:32:49.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:50 np0005476733 nova_compute[192580]: 2025-10-08 15:32:50.684 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:50 np0005476733 nova_compute[192580]: 2025-10-08 15:32:50.685 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:50 np0005476733 nova_compute[192580]: 2025-10-08 15:32:50.685 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:50 np0005476733 nova_compute[192580]: 2025-10-08 15:32:50.685 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:32:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:52Z|00431|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:32:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:52Z|00432|binding|INFO|Releasing lport 273f6d13-1643-4fbf-8405-0ca2e0049b96 from this chassis (sb_readonly=0)
Oct  8 11:32:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:32:52Z|00433|binding|INFO|Releasing lport f67773e8-4408-425a-8438-2209ddc36987 from this chassis (sb_readonly=0)
Oct  8 11:32:52 np0005476733 nova_compute[192580]: 2025-10-08 15:32:52.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:53 np0005476733 nova_compute[192580]: 2025-10-08 15:32:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:54 np0005476733 nova_compute[192580]: 2025-10-08 15:32:54.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:54 np0005476733 nova_compute[192580]: 2025-10-08 15:32:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:54 np0005476733 nova_compute[192580]: 2025-10-08 15:32:54.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:32:54 np0005476733 nova_compute[192580]: 2025-10-08 15:32:54.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:54 np0005476733 nova_compute[192580]: 2025-10-08 15:32:54.622 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:32:57 np0005476733 nova_compute[192580]: 2025-10-08 15:32:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:32:59 np0005476733 nova_compute[192580]: 2025-10-08 15:32:59.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:32:59 np0005476733 nova_compute[192580]: 2025-10-08 15:32:59.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:00 np0005476733 podman[233529]: 2025-10-08 15:33:00.235353451 +0000 UTC m=+0.060486565 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:33:00 np0005476733 podman[233528]: 2025-10-08 15:33:00.253440402 +0000 UTC m=+0.075625691 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid)
Oct  8 11:33:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:01.359 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:33:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:01.360 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:33:01 np0005476733 nova_compute[192580]: 2025-10-08 15:33:01.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.637 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.638 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.663 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.759 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.760 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.773 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.774 2 INFO nova.compute.claims [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.923 2 DEBUG nova.compute.provider_tree [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.942 2 DEBUG nova.scheduler.client.report [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.966 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:03 np0005476733 nova_compute[192580]: 2025-10-08 15:33:03.967 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.014 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.015 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.034 2 INFO nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.055 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.143 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.145 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.146 2 INFO nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Creating image(s)#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.147 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.147 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.148 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.166 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.244 2 DEBUG nova.policy [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.254 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.254 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.255 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.265 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.359 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.360 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.398 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk 10737418240" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.399 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.399 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.475 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.476 2 DEBUG nova.objects.instance [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid d73d8a2e-011b-4f41-9734-d2bb2b068986 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.490 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.490 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Ensure instance console log exists: /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.491 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.491 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.492 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:04 np0005476733 nova_compute[192580]: 2025-10-08 15:33:04.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:05 np0005476733 nova_compute[192580]: 2025-10-08 15:33:05.325 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Successfully created port: bc705fd7-4e51-4032-817d-a3554b18a7d9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.744 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Successfully updated port: bc705fd7-4e51-4032-817d-a3554b18a7d9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.769 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.770 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.770 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.851 2 DEBUG nova.compute.manager [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-changed-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.852 2 DEBUG nova.compute.manager [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Refreshing instance network info cache due to event network-changed-bc705fd7-4e51-4032-817d-a3554b18a7d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.852 2 DEBUG oslo_concurrency.lockutils [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:33:06 np0005476733 nova_compute[192580]: 2025-10-08 15:33:06.972 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  8 11:33:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:07.363 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.079 2 DEBUG nova.network.neutron [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updating instance_info_cache with network_info: [{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.104 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.105 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Instance network_info: |[{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.105 2 DEBUG oslo_concurrency.lockutils [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.106 2 DEBUG nova.network.neutron [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Refreshing network info cache for port bc705fd7-4e51-4032-817d-a3554b18a7d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.108 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Start _get_guest_xml network_info=[{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.113 2 WARNING nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.118 2 DEBUG nova.virt.libvirt.host [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.119 2 DEBUG nova.virt.libvirt.host [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.128 2 DEBUG nova.virt.libvirt.host [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.129 2 DEBUG nova.virt.libvirt.host [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.129 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.129 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.130 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.130 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.130 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.131 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.131 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.131 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.131 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.132 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.132 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.132 2 DEBUG nova.virt.hardware [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.136 2 DEBUG nova.virt.libvirt.vif [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:33:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-1395375184',display_name='tempest-test_multicast_north_south-1395375184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-1395375184',id=53,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-a0huy3g9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:33:04Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=d73d8a2e-011b-4f41-9734-d2bb2b068986,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.136 2 DEBUG nova.network.os_vif_util [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.137 2 DEBUG nova.network.os_vif_util [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.138 2 DEBUG nova.objects.instance [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid d73d8a2e-011b-4f41-9734-d2bb2b068986 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.290 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <uuid>d73d8a2e-011b-4f41-9734-d2bb2b068986</uuid>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <name>instance-00000035</name>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_multicast_north_south-1395375184</nova:name>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:33:08</nova:creationTime>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        <nova:port uuid="bc705fd7-4e51-4032-817d-a3554b18a7d9">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="serial">d73d8a2e-011b-4f41-9734-d2bb2b068986</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="uuid">d73d8a2e-011b-4f41-9734-d2bb2b068986</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.config"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:00:d6:7c"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <target dev="tapbc705fd7-4e"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/console.log" append="off"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:33:08 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:33:08 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:33:08 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:33:08 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.292 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Preparing to wait for external event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.293 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.294 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.294 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.295 2 DEBUG nova.virt.libvirt.vif [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:33:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-1395375184',display_name='tempest-test_multicast_north_south-1395375184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-1395375184',id=53,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-a0huy3g9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:33:04Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=d73d8a2e-011b-4f41-9734-d2bb2b068986,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.295 2 DEBUG nova.network.os_vif_util [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.296 2 DEBUG nova.network.os_vif_util [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.296 2 DEBUG os_vif [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.298 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc705fd7-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc705fd7-4e, col_values=(('external_ids', {'iface-id': 'bc705fd7-4e51-4032-817d-a3554b18a7d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:d6:7c', 'vm-uuid': 'd73d8a2e-011b-4f41-9734-d2bb2b068986'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:08 np0005476733 NetworkManager[51699]: <info>  [1759937588.3059] manager: (tapbc705fd7-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.315 2 INFO os_vif [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e')#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.365 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.366 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.366 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:00:d6:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.367 2 INFO nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Using config drive#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.933 2 INFO nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Creating config drive at /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.config#033[00m
Oct  8 11:33:08 np0005476733 nova_compute[192580]: 2025-10-08 15:33:08.938 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9vbq2n37 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.068 2 DEBUG oslo_concurrency.processutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9vbq2n37" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:09 np0005476733 kernel: tapbc705fd7-4e: entered promiscuous mode
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.1529] manager: (tapbc705fd7-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Oct  8 11:33:09 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:09Z|00434|binding|INFO|Claiming lport bc705fd7-4e51-4032-817d-a3554b18a7d9 for this chassis.
Oct  8 11:33:09 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:09Z|00435|binding|INFO|bc705fd7-4e51-4032-817d-a3554b18a7d9: Claiming fa:16:3e:00:d6:7c 10.100.0.10
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.163 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:d6:7c 10.100.0.10'], port_security=['fa:16:3e:00:d6:7c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=bc705fd7-4e51-4032-817d-a3554b18a7d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.164 103739 INFO neutron.agent.ovn.metadata.agent [-] Port bc705fd7-4e51-4032-817d-a3554b18a7d9 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.166 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:33:09 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:09Z|00436|binding|INFO|Setting lport bc705fd7-4e51-4032-817d-a3554b18a7d9 ovn-installed in OVS
Oct  8 11:33:09 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:09Z|00437|binding|INFO|Setting lport bc705fd7-4e51-4032-817d-a3554b18a7d9 up in Southbound
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.182 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b559df-d00e-40fe-9b0b-85450698cd90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.184 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap30cdfb1e-71 in ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.186 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap30cdfb1e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.187 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[326a1d15-28c8-4fbb-abaf-9a997b9d777c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.188 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[887f65c4-7fe9-4520-a531-a5f85c93aa16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.200 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[352845db-2373-4eec-bd86-8b640e4a4609]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 systemd-udevd[233618]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.218 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fea43408-2198-461c-9198-afb438d22637]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 systemd-machined[152624]: New machine qemu-31-instance-00000035.
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.2306] device (tapbc705fd7-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.2315] device (tapbc705fd7-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:33:09 np0005476733 systemd[1]: Started Virtual Machine qemu-31-instance-00000035.
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.255 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[037eac8c-287c-4f0a-b5b9-45260e478166]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.2639] manager: (tap30cdfb1e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Oct  8 11:33:09 np0005476733 systemd-udevd[233627]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.265 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f2949063-3d77-48de-8f15-400eee350f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 podman[233596]: 2025-10-08 15:33:09.285272411 +0000 UTC m=+0.138276655 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.309 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6bc76dc0-db46-4f1b-acb5-333eb64840dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.322 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2987bd27-62be-4d85-8750-18180f6f3e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.3713] device (tap30cdfb1e-70): carrier: link connected
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.377 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[78bec801-5471-4f4d-88f5-086ee18812c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.396 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[05538c26-e1d8-492a-8607-c2fdcb1d6c2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449303, 'reachable_time': 20489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233655, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.416 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6cc992-6ab8-4ef0-a4bd-dd51ee7e0f80]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:3ea4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449303, 'tstamp': 449303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233656, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.426 2 DEBUG nova.compute.manager [req-479a1d57-1c5b-45f2-a9a4-1abc567d6f69 req-601b230b-f2f4-4c85-bf03-5df927ee8fc3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.427 2 DEBUG oslo_concurrency.lockutils [req-479a1d57-1c5b-45f2-a9a4-1abc567d6f69 req-601b230b-f2f4-4c85-bf03-5df927ee8fc3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.427 2 DEBUG oslo_concurrency.lockutils [req-479a1d57-1c5b-45f2-a9a4-1abc567d6f69 req-601b230b-f2f4-4c85-bf03-5df927ee8fc3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.428 2 DEBUG oslo_concurrency.lockutils [req-479a1d57-1c5b-45f2-a9a4-1abc567d6f69 req-601b230b-f2f4-4c85-bf03-5df927ee8fc3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.428 2 DEBUG nova.compute.manager [req-479a1d57-1c5b-45f2-a9a4-1abc567d6f69 req-601b230b-f2f4-4c85-bf03-5df927ee8fc3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Processing event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[483832f9-8953-4694-a182-b89e63d86538]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449303, 'reachable_time': 20489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233657, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.473 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b75a33f9-c02b-4792-a5a5-a15bb0d55926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.534 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[07678f7c-381c-4832-9390-d3d784cca02d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.538 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.538 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.539 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 NetworkManager[51699]: <info>  [1759937589.5414] manager: (tap30cdfb1e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Oct  8 11:33:09 np0005476733 kernel: tap30cdfb1e-70: entered promiscuous mode
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.548 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:09Z|00438|binding|INFO|Releasing lport 76302563-91ae-48df-adce-3edec8d5a578 from this chassis (sb_readonly=0)
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.554 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.555 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[214bed11-7a02-4245-ad1d-ad2f6414d152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.556 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.pid.haproxy
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 30cdfb1e-750a-4d0e-9e9c-321b06b371b9
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:33:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:09.557 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'env', 'PROCESS_TAG=haproxy-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/30cdfb1e-750a-4d0e-9e9c-321b06b371b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 nova_compute[192580]: 2025-10-08 15:33:09.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:09 np0005476733 podman[233696]: 2025-10-08 15:33:09.949973842 +0000 UTC m=+0.067685066 container create bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  8 11:33:10 np0005476733 systemd[1]: Started libpod-conmon-bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1.scope.
Oct  8 11:33:10 np0005476733 podman[233696]: 2025-10-08 15:33:09.916118894 +0000 UTC m=+0.033830118 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:33:10 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:33:10 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65124f7a483a3b3183b8c70e44a7c847783d99276680556f3f641efadd99ce8f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:33:10 np0005476733 podman[233696]: 2025-10-08 15:33:10.056296669 +0000 UTC m=+0.174007923 container init bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 11:33:10 np0005476733 podman[233696]: 2025-10-08 15:33:10.062171508 +0000 UTC m=+0.179882732 container start bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:33:10 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [NOTICE]   (233715) : New worker (233717) forked
Oct  8 11:33:10 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [NOTICE]   (233715) : Loading success.
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.121 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937590.120961, d73d8a2e-011b-4f41-9734-d2bb2b068986 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.122 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] VM Started (Lifecycle Event)#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.125 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.130 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.134 2 INFO nova.virt.libvirt.driver [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Instance spawned successfully.#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.135 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.144 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.146 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.158 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.158 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.159 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.159 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.159 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.160 2 DEBUG nova.virt.libvirt.driver [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.168 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.168 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937590.121236, d73d8a2e-011b-4f41-9734-d2bb2b068986 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.168 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.189 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.192 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937590.128037, d73d8a2e-011b-4f41-9734-d2bb2b068986 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.192 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.234 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.240 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.248 2 INFO nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Took 6.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.249 2 DEBUG nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.314 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.355 2 INFO nova.compute.manager [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Took 6.63 seconds to build instance.#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.388 2 DEBUG oslo_concurrency.lockutils [None req-ff8bc97e-404a-4712-82c1-11a387784410 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.414 2 DEBUG nova.network.neutron [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updated VIF entry in instance network info cache for port bc705fd7-4e51-4032-817d-a3554b18a7d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.415 2 DEBUG nova.network.neutron [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updating instance_info_cache with network_info: [{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:33:10 np0005476733 nova_compute[192580]: 2025-10-08 15:33:10.437 2 DEBUG oslo_concurrency.lockutils [req-ee4cd7f5-a0fc-4380-a56d-50b87ca741b8 req-e8eb78af-58a2-430e-b618-cf604342b691 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.497 2 DEBUG nova.compute.manager [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.497 2 DEBUG oslo_concurrency.lockutils [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.498 2 DEBUG oslo_concurrency.lockutils [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.498 2 DEBUG oslo_concurrency.lockutils [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.498 2 DEBUG nova.compute.manager [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] No waiting events found dispatching network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 11:33:11 np0005476733 nova_compute[192580]: 2025-10-08 15:33:11.499 2 WARNING nova.compute.manager [req-bb9e7f51-741c-43b7-a40d-6d0c371577a7 req-7933b7a3-0cac-4066-b6bc-d783c7e3d200 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received unexpected event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 for instance with vm_state active and task_state None.
Oct  8 11:33:12 np0005476733 nova_compute[192580]: 2025-10-08 15:33:12.626 2 INFO nova.compute.manager [None req-93e5dc56-8465-49b7-823e-299e96427d58 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:12 np0005476733 nova_compute[192580]: 2025-10-08 15:33:12.633 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:33:13 np0005476733 podman[233727]: 2025-10-08 15:33:13.245577675 +0000 UTC m=+0.071186959 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:33:13 np0005476733 podman[233726]: 2025-10-08 15:33:13.26814753 +0000 UTC m=+0.097869636 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_controller)
Oct  8 11:33:13 np0005476733 nova_compute[192580]: 2025-10-08 15:33:13.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:14 np0005476733 nova_compute[192580]: 2025-10-08 15:33:14.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:15 np0005476733 nova_compute[192580]: 2025-10-08 15:33:15.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:17 np0005476733 nova_compute[192580]: 2025-10-08 15:33:17.818 2 INFO nova.compute.manager [None req-a27f4898-7e7d-426f-9cb2-d925170512fa c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:17 np0005476733 nova_compute[192580]: 2025-10-08 15:33:17.824 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:33:18 np0005476733 nova_compute[192580]: 2025-10-08 15:33:18.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:19 np0005476733 nova_compute[192580]: 2025-10-08 15:33:19.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:20 np0005476733 podman[233774]: 2025-10-08 15:33:20.246154115 +0000 UTC m=+0.072366807 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9)
Oct  8 11:33:20 np0005476733 podman[233773]: 2025-10-08 15:33:20.251480376 +0000 UTC m=+0.076960634 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:33:20 np0005476733 podman[233772]: 2025-10-08 15:33:20.260977192 +0000 UTC m=+0.083703961 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:33:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:21Z|00439|pinctrl|WARN|Dropped 3695 log messages in last 61 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 11:33:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:21Z|00440|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:33:22 np0005476733 nova_compute[192580]: 2025-10-08 15:33:22.992 2 INFO nova.compute.manager [None req-3d516965-e24b-4d6a-90da-79c5c0557a2d c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:23 np0005476733 nova_compute[192580]: 2025-10-08 15:33:23.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:24 np0005476733 nova_compute[192580]: 2025-10-08 15:33:24.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:26.317 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:26.318 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:26.320 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:33:28 np0005476733 nova_compute[192580]: 2025-10-08 15:33:28.165 2 INFO nova.compute.manager [None req-0489be8c-38ed-438e-b6a4-46ef7c499b0c c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:28 np0005476733 nova_compute[192580]: 2025-10-08 15:33:28.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:29 np0005476733 nova_compute[192580]: 2025-10-08 15:33:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:31 np0005476733 podman[233843]: 2025-10-08 15:33:31.285387888 +0000 UTC m=+0.083220205 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:33:31 np0005476733 podman[233844]: 2025-10-08 15:33:31.299394758 +0000 UTC m=+0.101488402 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:33:33 np0005476733 nova_compute[192580]: 2025-10-08 15:33:33.301 2 INFO nova.compute.manager [None req-aeab33a7-d40e-446d-9769-5307a979c3eb c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:33 np0005476733 nova_compute[192580]: 2025-10-08 15:33:33.309 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:33:33 np0005476733 nova_compute[192580]: 2025-10-08 15:33:33.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:33Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:d6:7c 10.100.0.10
Oct  8 11:33:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:33Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:d6:7c 10.100.0.10
Oct  8 11:33:34 np0005476733 nova_compute[192580]: 2025-10-08 15:33:34.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.601 2 INFO nova.compute.manager [None req-8bc6032f-4809-4e8f-8d0c-88af2bb288f9 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.606 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.609 2 INFO nova.virt.libvirt.driver [None req-8bc6032f-4809-4e8f-8d0c-88af2bb288f9 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Truncated console log returned, 1359 bytes ignored
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.728 2 DEBUG nova.compute.manager [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.729 2 DEBUG nova.compute.manager [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.729 2 DEBUG oslo_concurrency.lockutils [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.729 2 DEBUG oslo_concurrency.lockutils [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:33:38 np0005476733 nova_compute[192580]: 2025-10-08 15:33:38.730 2 DEBUG nova.network.neutron [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 11:33:39 np0005476733 nova_compute[192580]: 2025-10-08 15:33:39.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:40 np0005476733 podman[233888]: 2025-10-08 15:33:40.274504136 +0000 UTC m=+0.084512786 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  8 11:33:40 np0005476733 nova_compute[192580]: 2025-10-08 15:33:40.935 2 DEBUG nova.network.neutron [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  8 11:33:40 np0005476733 nova_compute[192580]: 2025-10-08 15:33:40.935 2 DEBUG nova.network.neutron [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 11:33:40 np0005476733 nova_compute[192580]: 2025-10-08 15:33:40.959 2 DEBUG oslo_concurrency.lockutils [req-1fe8ab3e-de6a-4ebc-9938-53d1a1873f54 req-d767af62-6194-4207-80b3-ed74007a52f7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:33:43 np0005476733 nova_compute[192580]: 2025-10-08 15:33:43.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:33:43 np0005476733 nova_compute[192580]: 2025-10-08 15:33:43.754 2 INFO nova.compute.manager [None req-2ab0aa14-b6b8-4f9a-b3b0-f1168dd57055 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Get console output
Oct  8 11:33:43 np0005476733 nova_compute[192580]: 2025-10-08 15:33:43.761 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:33:43 np0005476733 nova_compute[192580]: 2025-10-08 15:33:43.765 2 INFO nova.virt.libvirt.driver [None req-2ab0aa14-b6b8-4f9a-b3b0-f1168dd57055 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Truncated console log returned, 3244 bytes ignored
Oct  8 11:33:44 np0005476733 podman[233908]: 2025-10-08 15:33:44.265755736 +0000 UTC m=+0.076189260 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:33:44 np0005476733 podman[233907]: 2025-10-08 15:33:44.291530573 +0000 UTC m=+0.108120565 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:33:44 np0005476733 nova_compute[192580]: 2025-10-08 15:33:44.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:45 np0005476733 nova_compute[192580]: 2025-10-08 15:33:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.717 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.789 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.790 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.848 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.854 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.917 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.918 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.983 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:46 np0005476733 nova_compute[192580]: 2025-10-08 15:33:46.990 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.054 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.056 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.118 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.330 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.332 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12077MB free_disk=111.07153701782227GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.332 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.333 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.441 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 066ef28b-88ac-4f5c-acae-3458c3e19762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.442 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e7b170f9-efdc-458b-a2e6-04c7f2072900 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.442 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d73d8a2e-011b-4f41-9734-d2bb2b068986 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.443 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.444 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2688MB phys_disk=119GB used_disk=21GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.545 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.571 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.608 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.728 2 DEBUG nova.compute.manager [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-changed-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.729 2 DEBUG nova.compute.manager [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Refreshing instance network info cache due to event network-changed-bc705fd7-4e51-4032-817d-a3554b18a7d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.729 2 DEBUG oslo_concurrency.lockutils [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.729 2 DEBUG oslo_concurrency.lockutils [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:33:47 np0005476733 nova_compute[192580]: 2025-10-08 15:33:47.730 2 DEBUG nova.network.neutron [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Refreshing network info cache for port bc705fd7-4e51-4032-817d-a3554b18a7d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:33:48 np0005476733 nova_compute[192580]: 2025-10-08 15:33:48.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:48 np0005476733 nova_compute[192580]: 2025-10-08 15:33:48.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.160 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.160 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.161 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.161 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.162 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.163 2 INFO nova.compute.manager [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Terminating instance#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.164 2 DEBUG nova.compute.manager [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:33:49 np0005476733 kernel: tap6a82b4a7-24 (unregistering): left promiscuous mode
Oct  8 11:33:49 np0005476733 NetworkManager[51699]: <info>  [1759937629.1863] device (tap6a82b4a7-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:33:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:49Z|00441|binding|INFO|Releasing lport 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 from this chassis (sb_readonly=0)
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:49Z|00442|binding|INFO|Setting lport 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 down in Southbound
Oct  8 11:33:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:49Z|00443|binding|INFO|Removing iface tap6a82b4a7-24 ovn-installed in OVS
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.228 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:fc:11 10.100.0.13'], port_security=['fa:16:3e:0a:fc:11 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e7b170f9-efdc-458b-a2e6-04c7f2072900', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7fe4641-81c3-446f-bec0-114221bc2533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0ff332fd7f14bd0831aa78a16065653', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d602ec4-c136-4188-b9a6-e7299f4a8d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=812093c9-6f12-436a-826a-1a6fb93b9ea7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.230 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 in datapath d7fe4641-81c3-446f-bec0-114221bc2533 unbound from our chassis#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.233 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7fe4641-81c3-446f-bec0-114221bc2533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.236 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73df07bc-7f89-40c0-a27f-b5b98a7d38bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.237 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533 namespace which is not needed anymore#033[00m
Oct  8 11:33:49 np0005476733 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000033.scope: Deactivated successfully.
Oct  8 11:33:49 np0005476733 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000033.scope: Consumed 16.174s CPU time.
Oct  8 11:33:49 np0005476733 systemd-machined[152624]: Machine qemu-30-instance-00000033 terminated.
Oct  8 11:33:49 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [NOTICE]   (233044) : haproxy version is 2.8.14-c23fe91
Oct  8 11:33:49 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [NOTICE]   (233044) : path to executable is /usr/sbin/haproxy
Oct  8 11:33:49 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [WARNING]  (233044) : Exiting Master process...
Oct  8 11:33:49 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [ALERT]    (233044) : Current worker (233046) exited with code 143 (Terminated)
Oct  8 11:33:49 np0005476733 neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533[233040]: [WARNING]  (233044) : All workers exited. Exiting... (0)
Oct  8 11:33:49 np0005476733 systemd[1]: libpod-6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7.scope: Deactivated successfully.
Oct  8 11:33:49 np0005476733 podman[233994]: 2025-10-08 15:33:49.398789768 +0000 UTC m=+0.057385314 container died 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 11:33:49 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7-userdata-shm.mount: Deactivated successfully.
Oct  8 11:33:49 np0005476733 systemd[1]: var-lib-containers-storage-overlay-e8d2ac4e2071f1e4f11f384f7b701759bd7d59e43cab4d8e8b1428b0b4c31455-merged.mount: Deactivated successfully.
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.448 2 INFO nova.virt.libvirt.driver [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Instance destroyed successfully.#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.449 2 DEBUG nova.objects.instance [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lazy-loading 'resources' on Instance uuid e7b170f9-efdc-458b-a2e6-04c7f2072900 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:33:49 np0005476733 podman[233994]: 2025-10-08 15:33:49.454869561 +0000 UTC m=+0.113465087 container cleanup 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:33:49 np0005476733 systemd[1]: libpod-conmon-6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7.scope: Deactivated successfully.
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.469 2 DEBUG nova.virt.libvirt.vif [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:31:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-976377571',display_name='tempest-server-test-976377571',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-976377571',id=51,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLFGOBkGCcnIhZLnHKhYm7hKKuHgkmXVeeupmTvZ0MNWUPxIG3Vb+h0B4+BO+V7hH0rIdgfIY/0I7XuFYkR1QQ36NlsKH4DmMxnH65ozcPPxhe9fqB9OYl0FmItPfvXCDQ==',key_name='tempest-keypair-test-2773249',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:32:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c0ff332fd7f14bd0831aa78a16065653',ramdisk_id='',reservation_id='r-4nxuos03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NetworkDefaultSecGroupTest-1600801023',owner_user_name='tempest-NetworkDefaultSecGroupTest-1600801023-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:32:07Z,user_data=None,user_id='c45a2b13dbdc4134a7829d83659d4dfd',uuid=e7b170f9-efdc-458b-a2e6-04c7f2072900,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.470 2 DEBUG nova.network.os_vif_util [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converting VIF {"id": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "address": "fa:16:3e:0a:fc:11", "network": {"id": "d7fe4641-81c3-446f-bec0-114221bc2533", "bridge": "br-int", "label": "tempest-test-network--1217232881", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ff332fd7f14bd0831aa78a16065653", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a82b4a7-24", "ovs_interfaceid": "6a82b4a7-2453-4ee1-866e-a6fe2175b5c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.471 2 DEBUG nova.network.os_vif_util [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.471 2 DEBUG os_vif [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a82b4a7-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.488 2 INFO os_vif [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:fc:11,bridge_name='br-int',has_traffic_filtering=True,id=6a82b4a7-2453-4ee1-866e-a6fe2175b5c4,network=Network(d7fe4641-81c3-446f-bec0-114221bc2533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a82b4a7-24')#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.488 2 INFO nova.virt.libvirt.driver [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Deleting instance files /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900_del#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.489 2 INFO nova.virt.libvirt.driver [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Deletion of /var/lib/nova/instances/e7b170f9-efdc-458b-a2e6-04c7f2072900_del complete#033[00m
Oct  8 11:33:49 np0005476733 podman[234039]: 2025-10-08 15:33:49.541689711 +0000 UTC m=+0.057509779 container remove 6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.548 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4855add7-6f19-4938-98c5-0752fed7c079]: (4, ('Wed Oct  8 03:33:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533 (6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7)\n6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7\nWed Oct  8 03:33:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533 (6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7)\n6c5a649430b81f27fc11badc8b282e92a1c0c2aa36503a3e0edcf48c18bf6bb7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.550 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f36a9b44-941f-4fb6-8248-04261825b896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.551 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7fe4641-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:49 np0005476733 kernel: tapd7fe4641-80: left promiscuous mode
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.563 2 INFO nova.compute.manager [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.564 2 DEBUG oslo.service.loopingcall [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.565 2 DEBUG nova.compute.manager [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.565 2 DEBUG nova.network.neutron [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.573 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[faac10c9-0616-470b-a808-727b82b1db16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.607 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[24c0896f-984f-4f36-9b83-0c9305a3c0e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.609 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1287ebb9-2a9c-4752-a2c1-6237adddc05d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.625 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e37764a3-5766-49c5-af89-06ac692ba77e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442837, 'reachable_time': 38168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234054, 'error': None, 'target': 'ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.629 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7fe4641-81c3-446f-bec0-114221bc2533 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:33:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:49.630 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c97d8a70-473c-459c-a09c-b7a0b20f808d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:49 np0005476733 systemd[1]: run-netns-ovnmeta\x2dd7fe4641\x2d81c3\x2d446f\x2dbec0\x2d114221bc2533.mount: Deactivated successfully.
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.862 2 DEBUG nova.network.neutron [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updated VIF entry in instance network info cache for port bc705fd7-4e51-4032-817d-a3554b18a7d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.863 2 DEBUG nova.network.neutron [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updating instance_info_cache with network_info: [{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.886 2 DEBUG oslo_concurrency.lockutils [req-2cba5fc8-326e-4a71-90ed-e44f20007a40 req-5f1d00eb-a483-4c16-be48-3e74015d3cae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.900 2 DEBUG nova.compute.manager [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-unplugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.900 2 DEBUG oslo_concurrency.lockutils [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.901 2 DEBUG oslo_concurrency.lockutils [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.901 2 DEBUG oslo_concurrency.lockutils [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.901 2 DEBUG nova.compute.manager [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] No waiting events found dispatching network-vif-unplugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:33:49 np0005476733 nova_compute[192580]: 2025-10-08 15:33:49.902 2 DEBUG nova.compute.manager [req-d0aba604-5caf-4c13-817c-b0efe32f3798 req-8adde207-3e7c-4c6a-8537-3072c2c0895d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-unplugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.618 2 DEBUG nova.network.neutron [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.646 2 INFO nova.compute.manager [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Took 1.08 seconds to deallocate network for instance.#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.684 2 DEBUG nova.compute.manager [req-7b07e1c5-ac5f-448a-a479-c92bdf961da2 req-5bba8394-b4ab-41c4-a421-a0058852b9cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-deleted-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.728 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.729 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.846 2 DEBUG nova.compute.provider_tree [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.865 2 DEBUG nova.scheduler.client.report [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.895 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:50 np0005476733 nova_compute[192580]: 2025-10-08 15:33:50.938 2 INFO nova.scheduler.client.report [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Deleted allocations for instance e7b170f9-efdc-458b-a2e6-04c7f2072900#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.016 2 DEBUG oslo_concurrency.lockutils [None req-1465cde5-a5d1-4c17-83ef-60341966a41f c45a2b13dbdc4134a7829d83659d4dfd c0ff332fd7f14bd0831aa78a16065653 - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:51 np0005476733 podman[234055]: 2025-10-08 15:33:51.267290848 +0000 UTC m=+0.080521130 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:33:51 np0005476733 podman[234056]: 2025-10-08 15:33:51.291247407 +0000 UTC m=+0.094736245 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:33:51 np0005476733 podman[234057]: 2025-10-08 15:33:51.307871491 +0000 UTC m=+0.110353937 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.516 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.517 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.546 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.651 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.652 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.660 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.661 2 INFO nova.compute.claims [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.819 2 DEBUG nova.compute.provider_tree [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.844 2 DEBUG nova.scheduler.client.report [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.866 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.867 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.934 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.934 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.956 2 INFO nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.982 2 DEBUG nova.compute.manager [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.983 2 DEBUG oslo_concurrency.lockutils [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.984 2 DEBUG oslo_concurrency.lockutils [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.984 2 DEBUG oslo_concurrency.lockutils [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e7b170f9-efdc-458b-a2e6-04c7f2072900-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.985 2 DEBUG nova.compute.manager [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] No waiting events found dispatching network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.985 2 WARNING nova.compute.manager [req-5804f1b9-a918-40b5-b345-46931f3f918c req-446b6441-4884-4a07-8475-fa657272e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Received unexpected event network-vif-plugged-6a82b4a7-2453-4ee1-866e-a6fe2175b5c4 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:33:51 np0005476733 nova_compute[192580]: 2025-10-08 15:33:51.991 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.108 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.111 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.111 2 INFO nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Creating image(s)#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.113 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.114 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.115 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.139 2 DEBUG nova.policy [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.144 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.218 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.219 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.220 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.229 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.294 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.295 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.329 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk 10737418240" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.330 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.331 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.386 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.388 2 DEBUG nova.objects.instance [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'migration_context' on Instance uuid 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.406 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.407 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Ensure instance console log exists: /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.408 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.408 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:52 np0005476733 nova_compute[192580]: 2025-10-08 15:33:52.409 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:53 np0005476733 nova_compute[192580]: 2025-10-08 15:33:53.809 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Successfully created port: cfb829d2-b09f-4c87-8adf-76c33a6a438b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.950 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Successfully updated port: cfb829d2-b09f-4c87-8adf-76c33a6a438b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.974 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.974 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquired lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:33:54 np0005476733 nova_compute[192580]: 2025-10-08 15:33:54.974 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:33:55 np0005476733 nova_compute[192580]: 2025-10-08 15:33:55.134 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:33:55 np0005476733 nova_compute[192580]: 2025-10-08 15:33:55.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.181 2 DEBUG nova.compute.manager [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-changed-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.181 2 DEBUG nova.compute.manager [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Refreshing instance network info cache due to event network-changed-cfb829d2-b09f-4c87-8adf-76c33a6a438b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.182 2 DEBUG oslo_concurrency.lockutils [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.637 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.861 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.861 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.862 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:33:56 np0005476733 nova_compute[192580]: 2025-10-08 15:33:56.862 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.578 2 DEBUG nova.network.neutron [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updating instance_info_cache with network_info: [{"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.608 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Releasing lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.609 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Instance network_info: |[{"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.609 2 DEBUG oslo_concurrency.lockutils [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.610 2 DEBUG nova.network.neutron [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Refreshing network info cache for port cfb829d2-b09f-4c87-8adf-76c33a6a438b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.612 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Start _get_guest_xml network_info=[{"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.616 2 WARNING nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.621 2 DEBUG nova.virt.libvirt.host [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.622 2 DEBUG nova.virt.libvirt.host [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.630 2 DEBUG nova.virt.libvirt.host [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.630 2 DEBUG nova.virt.libvirt.host [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.631 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.631 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.631 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.632 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.632 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.632 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.632 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.633 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.633 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.633 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.633 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.634 2 DEBUG nova.virt.hardware [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.637 2 DEBUG nova.virt.libvirt.vif [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:33:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-39687703',display_name='tempest-test_multicast_north_south-39687703',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-39687703',id=55,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-08rfza5j',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:33:52Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6700973e-9d22-4d4a-8d39-ae92bc3bd6e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.637 2 DEBUG nova.network.os_vif_util [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.638 2 DEBUG nova.network.os_vif_util [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.639 2 DEBUG nova.objects.instance [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'pci_devices' on Instance uuid 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.653 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <uuid>6700973e-9d22-4d4a-8d39-ae92bc3bd6e2</uuid>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <name>instance-00000037</name>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_multicast_north_south-39687703</nova:name>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:33:57</nova:creationTime>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:user uuid="c0c7c5c2dab54695b1cc0a34bdc4ee47">tempest-MulticastTestIPv4Ovn-1993668591-project-member</nova:user>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:project uuid="496a37645ecf47b496dcf02c696ca64a">tempest-MulticastTestIPv4Ovn-1993668591</nova:project>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        <nova:port uuid="cfb829d2-b09f-4c87-8adf-76c33a6a438b">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="serial">6700973e-9d22-4d4a-8d39-ae92bc3bd6e2</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="uuid">6700973e-9d22-4d4a-8d39-ae92bc3bd6e2</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.config"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:b3:15:9d"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <target dev="tapcfb829d2-b0"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/console.log" append="off"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:33:57 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:33:57 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:33:57 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:33:57 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.654 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Preparing to wait for external event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.655 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.656 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.656 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.658 2 DEBUG nova.virt.libvirt.vif [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:33:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_multicast_north_south-39687703',display_name='tempest-test_multicast_north_south-39687703',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-39687703',id=55,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-08rfza5j',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:33:52Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6700973e-9d22-4d4a-8d39-ae92bc3bd6e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.658 2 DEBUG nova.network.os_vif_util [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.659 2 DEBUG nova.network.os_vif_util [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.660 2 DEBUG os_vif [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.663 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.668 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfb829d2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.669 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcfb829d2-b0, col_values=(('external_ids', {'iface-id': 'cfb829d2-b09f-4c87-8adf-76c33a6a438b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:15:9d', 'vm-uuid': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:57 np0005476733 NetworkManager[51699]: <info>  [1759937637.6729] manager: (tapcfb829d2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.678 2 INFO os_vif [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0')#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.747 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.748 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.748 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] No VIF found with MAC fa:16:3e:b3:15:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:33:57 np0005476733 nova_compute[192580]: 2025-10-08 15:33:57.749 2 INFO nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Using config drive#033[00m
Oct  8 11:33:58 np0005476733 nova_compute[192580]: 2025-10-08 15:33:58.868 2 INFO nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Creating config drive at /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.config#033[00m
Oct  8 11:33:58 np0005476733 nova_compute[192580]: 2025-10-08 15:33:58.875 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsr1ae3sa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:33:58 np0005476733 nova_compute[192580]: 2025-10-08 15:33:58.958 2 DEBUG nova.compute.manager [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:33:58 np0005476733 nova_compute[192580]: 2025-10-08 15:33:58.958 2 DEBUG nova.compute.manager [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:33:58 np0005476733 nova_compute[192580]: 2025-10-08 15:33:58.959 2 DEBUG oslo_concurrency.lockutils [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.001 2 DEBUG oslo_concurrency.processutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsr1ae3sa" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:33:59 np0005476733 kernel: tapcfb829d2-b0: entered promiscuous mode
Oct  8 11:33:59 np0005476733 NetworkManager[51699]: <info>  [1759937639.0791] manager: (tapcfb829d2-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Oct  8 11:33:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:59Z|00444|binding|INFO|Claiming lport cfb829d2-b09f-4c87-8adf-76c33a6a438b for this chassis.
Oct  8 11:33:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:59Z|00445|binding|INFO|cfb829d2-b09f-4c87-8adf-76c33a6a438b: Claiming fa:16:3e:b3:15:9d 10.100.0.7
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.094 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:15:9d 10.100.0.7'], port_security=['fa:16:3e:b3:15:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cfb829d2-b09f-4c87-8adf-76c33a6a438b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.097 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cfb829d2-b09f-4c87-8adf-76c33a6a438b in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 bound to our chassis#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.102 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:33:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:59Z|00446|binding|INFO|Setting lport cfb829d2-b09f-4c87-8adf-76c33a6a438b ovn-installed in OVS
Oct  8 11:33:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:33:59Z|00447|binding|INFO|Setting lport cfb829d2-b09f-4c87-8adf-76c33a6a438b up in Southbound
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.130 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7b25e85e-f864-463c-988f-4a1c05a3c0ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 systemd-udevd[234149]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:33:59 np0005476733 NetworkManager[51699]: <info>  [1759937639.1541] device (tapcfb829d2-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:33:59 np0005476733 NetworkManager[51699]: <info>  [1759937639.1558] device (tapcfb829d2-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:33:59 np0005476733 systemd-machined[152624]: New machine qemu-32-instance-00000037.
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.182 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9a5b28-e912-400b-aa23-e86232dfa6f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 systemd[1]: Started Virtual Machine qemu-32-instance-00000037.
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.187 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5b20bb5e-7e7b-4521-b09e-1a5b053c4d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.219 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[704b0c54-0c46-4c9d-bfbf-efc51b784ff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.241 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8691a877-ee3c-4779-834c-bc3ea651020e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449303, 'reachable_time': 20489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234161, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.259 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[586b2816-218b-49b9-80bc-0bf6ace9b5b0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449316, 'tstamp': 449316}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234164, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449319, 'tstamp': 449319}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234164, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.260 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.264 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.264 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.265 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:33:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:33:59.265 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:33:59 np0005476733 nova_compute[192580]: 2025-10-08 15:33:59.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.173 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937640.172845, 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.175 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] VM Started (Lifecycle Event)#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.197 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.203 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937640.1742094, 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.203 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.222 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.225 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:34:00 np0005476733 nova_compute[192580]: 2025-10-08 15:34:00.245 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.781 2 DEBUG nova.compute.manager [req-04f1b537-9f1b-4c85-90b4-a7716b5a8588 req-48a16111-451d-4763-ac39-6be90539d262 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.782 2 DEBUG oslo_concurrency.lockutils [req-04f1b537-9f1b-4c85-90b4-a7716b5a8588 req-48a16111-451d-4763-ac39-6be90539d262 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.783 2 DEBUG oslo_concurrency.lockutils [req-04f1b537-9f1b-4c85-90b4-a7716b5a8588 req-48a16111-451d-4763-ac39-6be90539d262 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.784 2 DEBUG oslo_concurrency.lockutils [req-04f1b537-9f1b-4c85-90b4-a7716b5a8588 req-48a16111-451d-4763-ac39-6be90539d262 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.784 2 DEBUG nova.compute.manager [req-04f1b537-9f1b-4c85-90b4-a7716b5a8588 req-48a16111-451d-4763-ac39-6be90539d262 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Processing event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.786 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.790 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937641.7902863, 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.791 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.794 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.799 2 INFO nova.virt.libvirt.driver [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Instance spawned successfully.#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.800 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.823 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.833 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.841 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.842 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.844 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.845 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.846 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.847 2 DEBUG nova.virt.libvirt.driver [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.879 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.919 2 INFO nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Took 9.81 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.920 2 DEBUG nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.945 2 DEBUG nova.network.neutron [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updated VIF entry in instance network info cache for port cfb829d2-b09f-4c87-8adf-76c33a6a438b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.946 2 DEBUG nova.network.neutron [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updating instance_info_cache with network_info: [{"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.975 2 DEBUG oslo_concurrency.lockutils [req-ecaf9f58-9cfa-4451-bfd2-edfb533e9451 req-fd5480da-78c5-40da-97cd-b8e38de25ef8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:01 np0005476733 nova_compute[192580]: 2025-10-08 15:34:01.998 2 INFO nova.compute.manager [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Took 10.37 seconds to build instance.#033[00m
Oct  8 11:34:02 np0005476733 nova_compute[192580]: 2025-10-08 15:34:02.022 2 DEBUG oslo_concurrency.lockutils [None req-3252bcee-c576-4716-941b-9974d95cd777 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:02 np0005476733 podman[234205]: 2025-10-08 15:34:02.262734304 +0000 UTC m=+0.084055483 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:34:02 np0005476733 podman[234211]: 2025-10-08 15:34:02.300128306 +0000 UTC m=+0.092493244 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:34:02 np0005476733 nova_compute[192580]: 2025-10-08 15:34:02.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.356 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.383 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.383 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.384 2 DEBUG oslo_concurrency.lockutils [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.384 2 DEBUG nova.network.neutron [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.385 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.412 2 INFO nova.compute.manager [None req-24dbaccd-a7bd-49b8-9443-fddcf48cb70b c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.418 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.879 2 DEBUG nova.compute.manager [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.880 2 DEBUG oslo_concurrency.lockutils [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.880 2 DEBUG oslo_concurrency.lockutils [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.881 2 DEBUG oslo_concurrency.lockutils [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.881 2 DEBUG nova.compute.manager [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] No waiting events found dispatching network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:34:03 np0005476733 nova_compute[192580]: 2025-10-08 15:34:03.882 2 WARNING nova.compute.manager [req-4d1bdcb6-3a5b-4e33-8795-c7c0d75f8899 req-8a2ecdbc-5cab-4677-8188-5ad3a792861d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received unexpected event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b for instance with vm_state active and task_state None.#033[00m
Oct  8 11:34:04 np0005476733 nova_compute[192580]: 2025-10-08 15:34:04.437 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937629.4363055, e7b170f9-efdc-458b-a2e6-04c7f2072900 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:04 np0005476733 nova_compute[192580]: 2025-10-08 15:34:04.439 2 INFO nova.compute.manager [-] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:34:04 np0005476733 nova_compute[192580]: 2025-10-08 15:34:04.460 2 DEBUG nova.compute.manager [None req-8212e89d-f946-4d0e-b276-20dda489fd24 - - - - - -] [instance: e7b170f9-efdc-458b-a2e6-04c7f2072900] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:04 np0005476733 nova_compute[192580]: 2025-10-08 15:34:04.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:05 np0005476733 nova_compute[192580]: 2025-10-08 15:34:05.039 2 DEBUG nova.network.neutron [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:34:05 np0005476733 nova_compute[192580]: 2025-10-08 15:34:05.039 2 DEBUG nova.network.neutron [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:05 np0005476733 nova_compute[192580]: 2025-10-08 15:34:05.059 2 DEBUG oslo_concurrency.lockutils [req-b58f19db-ae1a-48ad-a9bf-820c86ae4244 req-27afeaba-bd4f-4152-b1d8-5d9b1e807a41 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:07 np0005476733 nova_compute[192580]: 2025-10-08 15:34:07.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:08 np0005476733 nova_compute[192580]: 2025-10-08 15:34:08.544 2 INFO nova.compute.manager [None req-5f85e415-b3b1-40d2-b402-e53df13d1d93 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:08 np0005476733 nova_compute[192580]: 2025-10-08 15:34:08.552 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:09 np0005476733 nova_compute[192580]: 2025-10-08 15:34:09.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:11 np0005476733 podman[234247]: 2025-10-08 15:34:11.257213434 +0000 UTC m=+0.081615323 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:34:12 np0005476733 nova_compute[192580]: 2025-10-08 15:34:12.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:13 np0005476733 nova_compute[192580]: 2025-10-08 15:34:13.668 2 INFO nova.compute.manager [None req-a953c88b-e2e1-498e-959a-41b216735016 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:14.178 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:34:14 np0005476733 nova_compute[192580]: 2025-10-08 15:34:14.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:14.180 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:34:14 np0005476733 nova_compute[192580]: 2025-10-08 15:34:14.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:15 np0005476733 podman[234268]: 2025-10-08 15:34:15.288792409 +0000 UTC m=+0.096322376 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:34:15 np0005476733 podman[234267]: 2025-10-08 15:34:15.314780335 +0000 UTC m=+0.140568869 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:34:17 np0005476733 nova_compute[192580]: 2025-10-08 15:34:17.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:18 np0005476733 nova_compute[192580]: 2025-10-08 15:34:18.799 2 INFO nova.compute.manager [None req-70ab0756-91b8-4cd5-ac38-01615a82072d c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:19 np0005476733 systemd-logind[827]: New session 39 of user zuul.
Oct  8 11:34:19 np0005476733 systemd[1]: Started Session 39 of User zuul.
Oct  8 11:34:19 np0005476733 systemd[1]: session-39.scope: Deactivated successfully.
Oct  8 11:34:19 np0005476733 systemd-logind[827]: Session 39 logged out. Waiting for processes to exit.
Oct  8 11:34:19 np0005476733 systemd-logind[827]: Removed session 39.
Oct  8 11:34:19 np0005476733 nova_compute[192580]: 2025-10-08 15:34:19.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:20Z|00448|pinctrl|WARN|Dropped 1739 log messages in last 59 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 11:34:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:20Z|00449|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:34:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:21.182 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:22 np0005476733 podman[234346]: 2025-10-08 15:34:22.241811341 +0000 UTC m=+0.064550896 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:34:22 np0005476733 podman[234345]: 2025-10-08 15:34:22.244711124 +0000 UTC m=+0.067164479 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:34:22 np0005476733 podman[234347]: 2025-10-08 15:34:22.252157903 +0000 UTC m=+0.069274577 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Oct  8 11:34:22 np0005476733 nova_compute[192580]: 2025-10-08 15:34:22.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:23 np0005476733 nova_compute[192580]: 2025-10-08 15:34:23.962 2 INFO nova.compute.manager [None req-28088aab-67d1-422c-92b2-53e32ab4cd86 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:23 np0005476733 nova_compute[192580]: 2025-10-08 15:34:23.969 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:24 np0005476733 nova_compute[192580]: 2025-10-08 15:34:24.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:25Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:15:9d 10.100.0.7
Oct  8 11:34:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:25Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:15:9d 10.100.0.7
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.748 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.749 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.773 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.852 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.853 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.861 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:34:25 np0005476733 nova_compute[192580]: 2025-10-08 15:34:25.862 2 INFO nova.compute.claims [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.036 2 DEBUG nova.compute.provider_tree [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.058 2 DEBUG nova.scheduler.client.report [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.099 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.100 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.155 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.156 2 DEBUG nova.network.neutron [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.264 2 INFO nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.307 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:26.319 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:26.320 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:26.322 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.423 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.425 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.426 2 INFO nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Creating image(s)#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.427 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.427 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.428 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.445 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.510 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.512 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.512 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.523 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.582 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.583 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.621 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk 10737418240" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.623 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.624 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.685 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.687 2 DEBUG nova.objects.instance [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a310a2e-17af-42b8-a212-cf0a278e20c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.811 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.812 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Ensure instance console log exists: /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.812 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.813 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.813 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:26 np0005476733 nova_compute[192580]: 2025-10-08 15:34:26.942 2 DEBUG nova.policy [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:34:27 np0005476733 nova_compute[192580]: 2025-10-08 15:34:27.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.088 2 DEBUG nova.network.neutron [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Successfully updated port: 832212ef-772b-4b36-b486-7b4131fc3ab5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.110 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.111 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquired lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.111 2 DEBUG nova.network.neutron [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.153 2 INFO nova.compute.manager [None req-b879469a-7393-47ad-ae05-67a7a08b03a2 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.159 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.271 2 DEBUG nova.compute.manager [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-changed-832212ef-772b-4b36-b486-7b4131fc3ab5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.271 2 DEBUG nova.compute.manager [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Refreshing instance network info cache due to event network-changed-832212ef-772b-4b36-b486-7b4131fc3ab5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.272 2 DEBUG oslo_concurrency.lockutils [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.327 2 DEBUG nova.network.neutron [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:34:29 np0005476733 nova_compute[192580]: 2025-10-08 15:34:29.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.585 2 DEBUG nova.network.neutron [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updating instance_info_cache with network_info: [{"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.626 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Releasing lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.626 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Instance network_info: |[{"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.627 2 DEBUG oslo_concurrency.lockutils [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.627 2 DEBUG nova.network.neutron [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Refreshing network info cache for port 832212ef-772b-4b36-b486-7b4131fc3ab5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.634 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Start _get_guest_xml network_info=[{"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.640 2 WARNING nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.649 2 DEBUG nova.virt.libvirt.host [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.652 2 DEBUG nova.virt.libvirt.host [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.661 2 DEBUG nova.virt.libvirt.host [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.664 2 DEBUG nova.virt.libvirt.host [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.665 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.667 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.668 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.668 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.669 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.671 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.672 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.674 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.674 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.674 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.674 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.675 2 DEBUG nova.virt.hardware [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.679 2 DEBUG nova.virt.libvirt.vif [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm2',display_name='vm2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm2',id=57,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-sbfgkeja',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_
stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:34:26Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=8a310a2e-17af-42b8-a212-cf0a278e20c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.679 2 DEBUG nova.network.os_vif_util [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.680 2 DEBUG nova.network.os_vif_util [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.681 2 DEBUG nova.objects.instance [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a310a2e-17af-42b8-a212-cf0a278e20c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.717 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <uuid>8a310a2e-17af-42b8-a212-cf0a278e20c7</uuid>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <name>instance-00000039</name>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:name>vm2</nova:name>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:34:30</nova:creationTime>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:user uuid="7dd1826c89b24382854eb7979b65ba87">tempest-VrrpTest-336353520-project-member</nova:user>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:project uuid="8c96d22c99734f059343a5340cc6f287">tempest-VrrpTest-336353520</nova:project>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        <nova:port uuid="832212ef-772b-4b36-b486-7b4131fc3ab5">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.100.73" ipVersion="4"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="serial">8a310a2e-17af-42b8-a212-cf0a278e20c7</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="uuid">8a310a2e-17af-42b8-a212-cf0a278e20c7</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.config"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:26:8e:11"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <target dev="tap832212ef-77"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/console.log" append="off"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:34:30 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:34:30 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:34:30 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:34:30 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.719 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Preparing to wait for external event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.719 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.720 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.720 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.721 2 DEBUG nova.virt.libvirt.vif [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm2',display_name='vm2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm2',id=57,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-sbfgkeja',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_imp
orting_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:34:26Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=8a310a2e-17af-42b8-a212-cf0a278e20c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.721 2 DEBUG nova.network.os_vif_util [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.722 2 DEBUG nova.network.os_vif_util [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.723 2 DEBUG os_vif [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.731 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap832212ef-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.732 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap832212ef-77, col_values=(('external_ids', {'iface-id': '832212ef-772b-4b36-b486-7b4131fc3ab5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:8e:11', 'vm-uuid': '8a310a2e-17af-42b8-a212-cf0a278e20c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:30 np0005476733 NetworkManager[51699]: <info>  [1759937670.7360] manager: (tap832212ef-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.748 2 INFO os_vif [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77')#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.823 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.824 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.824 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No VIF found with MAC fa:16:3e:26:8e:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:34:30 np0005476733 nova_compute[192580]: 2025-10-08 15:34:30.825 2 INFO nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Using config drive#033[00m
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.618 2 INFO nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Creating config drive at /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.config#033[00m
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.623 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jbtr666 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.752 2 DEBUG oslo_concurrency.processutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jbtr666" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:31 np0005476733 NetworkManager[51699]: <info>  [1759937671.8174] manager: (tap832212ef-77): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Oct  8 11:34:31 np0005476733 kernel: tap832212ef-77: entered promiscuous mode
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:31Z|00450|binding|INFO|Claiming lport 832212ef-772b-4b36-b486-7b4131fc3ab5 for this chassis.
Oct  8 11:34:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:31Z|00451|binding|INFO|832212ef-772b-4b36-b486-7b4131fc3ab5: Claiming fa:16:3e:26:8e:11 192.168.100.73
Oct  8 11:34:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:31Z|00452|binding|INFO|Setting lport 832212ef-772b-4b36-b486-7b4131fc3ab5 ovn-installed in OVS
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:31 np0005476733 nova_compute[192580]: 2025-10-08 15:34:31.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:31 np0005476733 systemd-udevd[234439]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:34:31 np0005476733 systemd-machined[152624]: New machine qemu-33-instance-00000039.
Oct  8 11:34:31 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:31Z|00453|binding|INFO|Setting lport 832212ef-772b-4b36-b486-7b4131fc3ab5 up in Southbound
Oct  8 11:34:31 np0005476733 systemd[1]: Started Virtual Machine qemu-33-instance-00000039.
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.871 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:8e:11 192.168.100.73'], port_security=['fa:16:3e:26:8e:11 192.168.100.73 192.168.100.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.100.73/24', 'neutron:device_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c96d22c99734f059343a5340cc6f287', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2d0efdcb-fc9f-4ff6-ac01-106f25450adb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d1675938-de61-482c-b526-990e293aed89, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=832212ef-772b-4b36-b486-7b4131fc3ab5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.875 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 832212ef-772b-4b36-b486-7b4131fc3ab5 in datapath 3556a570-1234-4dd3-a7d3-e2cf3097a776 bound to our chassis#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.877 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3556a570-1234-4dd3-a7d3-e2cf3097a776#033[00m
Oct  8 11:34:31 np0005476733 NetworkManager[51699]: <info>  [1759937671.8795] device (tap832212ef-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:34:31 np0005476733 NetworkManager[51699]: <info>  [1759937671.8802] device (tap832212ef-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.889 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9728e70d-bb5d-472d-bc8d-db4cddd57ad6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.892 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3556a570-11 in ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.894 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3556a570-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.894 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[24b53444-cc7a-4f7b-ac08-07992d0fa9f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.895 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0643c308-d3b0-45b1-aede-fd91114c9894]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.907 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[8326d1fc-6b6d-4223-aa89-6593bcbfd44a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.932 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fb460685-d7c6-42ae-b4ad-5a00e417085c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.964 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5675fde7-8728-4762-93e1-27ab27ba89cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:31.969 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1e99e8-5991-446a-8cc3-03cd43494578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:31 np0005476733 NetworkManager[51699]: <info>  [1759937671.9727] manager: (tap3556a570-10): new Veth device (/org/freedesktop/NetworkManager/Devices/153)
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.010 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2c95f047-5d2c-4ab2-a7b9-90f14687f49b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.014 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d0836a28-4432-414d-b7f8-078b5fa997ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 NetworkManager[51699]: <info>  [1759937672.0466] device (tap3556a570-10): carrier: link connected
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.052 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[35ef6ebc-4c52-476b-a9ef-967088a54ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.068 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d61ab787-5a7c-4c4e-9705-9bfc8ff5e3e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3556a570-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:26:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457571, 'reachable_time': 19436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234473, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.081 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[06770fc6-c701-44b7-b2bb-78919b282886]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:26e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457571, 'tstamp': 457571}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234474, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.097 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4dbc3b-30a9-4d6a-8168-5d7cd1b4f71b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3556a570-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:26:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457571, 'reachable_time': 19436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234475, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.124 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db1ac871-b77f-49fb-bdda-767a56f001e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.188 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[82984956-0a8f-406a-b3ba-bab8449dc317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.189 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3556a570-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.190 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.190 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3556a570-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:32 np0005476733 kernel: tap3556a570-10: entered promiscuous mode
Oct  8 11:34:32 np0005476733 NetworkManager[51699]: <info>  [1759937672.1946] manager: (tap3556a570-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.203 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3556a570-10, col_values=(('external_ids', {'iface-id': 'edc27bfb-7622-457f-b7d3-480bac0b8693'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:32Z|00454|binding|INFO|Releasing lport edc27bfb-7622-457f-b7d3-480bac0b8693 from this chassis (sb_readonly=0)
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.230 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3556a570-1234-4dd3-a7d3-e2cf3097a776.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3556a570-1234-4dd3-a7d3-e2cf3097a776.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.231 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[09e0b037-56d5-48df-96d7-64719c8bbaca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.232 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-3556a570-1234-4dd3-a7d3-e2cf3097a776
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/3556a570-1234-4dd3-a7d3-e2cf3097a776.pid.haproxy
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 3556a570-1234-4dd3-a7d3-e2cf3097a776
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:34:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:32.232 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'env', 'PROCESS_TAG=haproxy-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3556a570-1234-4dd3-a7d3-e2cf3097a776.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:34:32 np0005476733 podman[234514]: 2025-10-08 15:34:32.617791138 +0000 UTC m=+0.050962738 container create b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.638 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937672.637343, 8a310a2e-17af-42b8-a212-cf0a278e20c7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.639 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] VM Started (Lifecycle Event)#033[00m
Oct  8 11:34:32 np0005476733 systemd[1]: Started libpod-conmon-b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec.scope.
Oct  8 11:34:32 np0005476733 podman[234514]: 2025-10-08 15:34:32.588300131 +0000 UTC m=+0.021471751 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:34:32 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:34:32 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcd379ff73a8d13ae336ef0397bd9e98a5a70a58b3db034011a1f1aa8c8d8b12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:34:32 np0005476733 podman[234528]: 2025-10-08 15:34:32.722139472 +0000 UTC m=+0.075225078 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:34:32 np0005476733 podman[234514]: 2025-10-08 15:34:32.722359699 +0000 UTC m=+0.155531359 container init b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:34:32 np0005476733 podman[234514]: 2025-10-08 15:34:32.727219195 +0000 UTC m=+0.160390805 container start b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:34:32 np0005476733 podman[234527]: 2025-10-08 15:34:32.735002456 +0000 UTC m=+0.087550895 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid)
Oct  8 11:34:32 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [NOTICE]   (234574) : New worker (234576) forked
Oct  8 11:34:32 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [NOTICE]   (234574) : Loading success.
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.761 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.767 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937672.6386487, 8a310a2e-17af-42b8-a212-cf0a278e20c7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.768 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.814 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.817 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:34:32 np0005476733 nova_compute[192580]: 2025-10-08 15:34:32.972 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:34:33 np0005476733 nova_compute[192580]: 2025-10-08 15:34:33.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:33 np0005476733 nova_compute[192580]: 2025-10-08 15:34:33.570 2 DEBUG nova.network.neutron [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updated VIF entry in instance network info cache for port 832212ef-772b-4b36-b486-7b4131fc3ab5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:34:33 np0005476733 nova_compute[192580]: 2025-10-08 15:34:33.571 2 DEBUG nova.network.neutron [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updating instance_info_cache with network_info: [{"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:33 np0005476733 nova_compute[192580]: 2025-10-08 15:34:33.655 2 DEBUG oslo_concurrency.lockutils [req-30773e9b-94be-4cad-b627-8ab63df4edaa req-63e58138-4ff5-4b92-928b-e0c7809ac68d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:34 np0005476733 nova_compute[192580]: 2025-10-08 15:34:34.657 2 INFO nova.compute.manager [None req-b5aed49f-3e7e-48dd-b87e-6f4e05dadf28 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Get console output#033[00m
Oct  8 11:34:34 np0005476733 nova_compute[192580]: 2025-10-08 15:34:34.665 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:34 np0005476733 nova_compute[192580]: 2025-10-08 15:34:34.669 2 INFO nova.virt.libvirt.driver [None req-b5aed49f-3e7e-48dd-b87e-6f4e05dadf28 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Truncated console log returned, 3166 bytes ignored#033[00m
Oct  8 11:34:34 np0005476733 nova_compute[192580]: 2025-10-08 15:34:34.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:35 np0005476733 nova_compute[192580]: 2025-10-08 15:34:35.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.008 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'name': 'tempest-test_bw_limit_tenant_network-1685300098', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.012 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'name': 'tempest-test_multicast_north_south-1395375184', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000035', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.014 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'name': 'tempest-test_multicast_north_south-39687703', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '496a37645ecf47b496dcf02c696ca64a', 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'hostId': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.016 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'name': 'vm2', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000039', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '8c96d22c99734f059343a5340cc6f287', 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'hostId': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.042 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 9083710695 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.043 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.latency volume: 48138693 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.069 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.latency volume: 6687005181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.069 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.latency volume: 49403415 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.091 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.latency volume: 6616621668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.092 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.latency volume: 52597478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.119 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.120 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3024c962-ca83-444f-913d-89c708466e0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9083710695, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4b94339a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': 'acfbbb3d827075ee8fff3fbc13f88823ea7979d49034f7c8c9830fb91b9bd7c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48138693, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4b944600-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '4b36148b93f0dfc2b9b64977cf08d0cef4fd14fe2f534a486dea8906603f08f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6687005181, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4b982bee-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': '23bd631291eb0534891211b787500a798d0001e656af7fac1e92d9a7a810f266'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49403415, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4b983b48-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': '4e696a5da4bc0612b24e579e04b5154e0c0c03506978098f53d00a7acb9b3f64'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6616621668, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4b9b9158-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '2acf7363e42cd40354d1d5c15ec2f7c5de855955a43a571fe5616f43e6444885'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52597478, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: -1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4b9ba7ce-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': 'a03d2f42ae0de22f4343776d22fabb445927cc6cdf2d1dfb9abb83cd77e71b92'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4b9ff676-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '40be692fa4712e3960b8e121de09c3797cf480c29b08fdaf5f5585e8ccb9b584'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.017203', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4ba00c1a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': 'ad8b9eafeb3f6ec523a9851030cef743c216f380e394a369f9e824e4386ab7fa'}]}, 'timestamp': '2025-10-08 15:34:36.121357', '_unique_id': '96d777cfe2464af9b8436946a4fa84ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.138 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.139 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.151 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.152 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.165 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.166 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.176 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.177 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a1f59d9-4a8e-4751-a83c-ed4bd64c1014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4ba2c3ce-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': '82437c7e9b999a0c3d99f0e21497b6c4010a6acf4e8e33f87a1d562427132dff'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4ba2d364-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': '7501bf613eba5dca0c664aefbd67488aca74e5b99d49b0cd4b3af3340b78f507'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4ba4b972-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': '5543d1942377e5655e77e4d6f8972bdc3ad1756aff9904ba5f4431963515c1b4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4ba4c9ee-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': 'ffaf84b3a9fd21627c99eb400b418089b9cef62242abf95303082ef2c7e1b429'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 10737418240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4ba6e4f4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': 'f2192b1ecf9dc688e9dc69b3b060075eb6885cba56fa84132ff885ad51b64e33'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_st
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4ba6f1e2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': 'e740353da5a99b22de4bf825aedd5423f3c969ec2c4ffeedcc1d1e5344671f1f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4ba88bce-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': 'fe6a7b7b725cf831da65b482cc7bdc22b0bc50480c31730839ca4c3dbc9c14f0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': 
'8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.125009', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4ba89556-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': 'ea73708129d6a595ea2a2ee3b84a456fb7fe957e251d9631ff8ee1fce610c341'}]}, 'timestamp': '2025-10-08 15:34:36.177245', '_unique_id': '5e7f5b3fd6884b7bbe101e88bd9cc3b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.182 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes volume: 32162818 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.182 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes volume: 90242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.184 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d73d8a2e-011b-4f41-9734-d2bb2b068986 / tapbc705fd7-4e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.184 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.incoming.bytes volume: 2432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.186 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 / tapcfb829d2-b0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.186 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.incoming.bytes volume: 2092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.189 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8a310a2e-17af-42b8-a212-cf0a278e20c7 / tap832212ef-77 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.189 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9839350-411c-4ce8-8807-05a204272439', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32162818, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.179037', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4ba96ce2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '2cddb7cd60edf2f3e790c87c20d8785b4b5dd3e86fe320f8be582df4f507a074'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 90242, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.179037', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4ba9762e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'ef8036e3b9ea3f03fc72a19d9b0778e5eb5b7c2e11fe913e4d00d567a2e8b5e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2432, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.179037', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4ba9ca16-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': 'f762177b92e8948d748d7b7c05c1b95995d6ab709f875c6f2545f0f978cec6ec'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2092, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.179037', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4baa1b4c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': '9e0b29ff2efb7b9dcfb8a32b4bce288dd6eecfffca5d47b84bb7ca0cb44d8991'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.179037', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4baa7cf4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': 'eb9cd86a3dc64df96488ebbbff46c503b35a3e803177c289aaf8bcb8be3aa836'}]}, 'timestamp': '2025-10-08 15:34:36.189741', '_unique_id': '574fe467e7124747bcd9592f1b08fbe4'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.191 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 169873408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.191 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.191 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.allocation volume: 154148864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.allocation volume: 85987328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ea556b-919d-4ba0-96c1-bc47e91df3de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169873408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4baac902-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': 'b5223faf6258b17c404cb274aa362023e48aca189f24fe6104fb7086addfce72'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4baad122-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': 'f8d3ea80ebafdc1969d4f9d2a79b6ae25ae78a6c1867e7c6302878c1e5e26e7b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 154148864, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4baadb18-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': 'cd9b2b4ce33bd823a5ebb7efb245e43129a24b33c5061d695826e35159c6e762'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4baae630-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': '64811b1190eeaae249a348efb89458375298d29a60bcdcda3a7efdb0c053843b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 
'B', 'counter_volume': 85987328, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4baaede2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': 'c4c8f3f85807a151e6dbd17208e956397e95e3db0443c5782eaf0ba1ab713c55'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'ta
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4baaf54e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': 'cfa0764d6ee7950593fc4e5ee2c87a522722438b6bb6b39d6123f79efe450020'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4baafca6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': 'cf72ed70639275b54ca46d532fbac8ec578ee1f74e39c4ed646d51ad779742cd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': 
'8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.191438', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bab03e0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': '5236db53c9230b6996772fa6a655579273f6a48b997a892f262e7934b2349dee'}]}, 'timestamp': '2025-10-08 15:34:36.193183', '_unique_id': '9b094d93e0b44a168f236218992b1ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.194 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.207 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/memory.usage volume: 244.42578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.221 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/memory.usage volume: 225.0625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.234 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/memory.usage volume: 267.53125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.247 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.247 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 8a310a2e-17af-42b8-a212-cf0a278e20c7: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '948681a5-bc25-4ef7-a9a1-c08af68b9ccd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 244.42578125, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'timestamp': '2025-10-08T15:34:36.194442', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '4bad4b3c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.930518266, 'message_signature': 'e24ba885cd37512857b411e4eef81bb0c8eee584b00cc0f56360b1db6ca6a4de'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 225.0625, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 
'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'timestamp': '2025-10-08T15:34:36.194442', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '4baf69d0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.944512586, 'message_signature': 'f99a7d404859d19b0b77170437309a51e1a76926071158e621a76863f7bee9d6'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 267.53125, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'timestamp': '2025-10-08T15:34:36.194442', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '4bb15830-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.9570573, 'message_signature': 'a9a6c8f700c7b3b1d9466452eb4feace2183fbce7bba7a87328b6f54f3a4c6b4'}]}, 'timestamp': '2025-10-08 15:34:36.248050', '_unique_id': 'fbfdf935d49b462b9f445d524b5277d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.249 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.250 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.251 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.251 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.251 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.requests volume: 722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.252 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.252 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.requests volume: 354 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.252 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.253 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.253 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cecab932-f5af-4112-a381-b99244ffcc71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 926, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb3ecc6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '435304ea3c09588435c75d8506c84d1805d4d1057a59be2dd1721022006a3a10'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb3fa68-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '51887b339f830db61ec37d44877035ff69e79769f4b094217dd882dde53a8354'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 722, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb406a2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'cb45fc0a9b54d23db3175a95de84ebe08f1e04f1b514790a06c35b9abcac99b3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb41192-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'b573cc0090da5764734816b419c272682fe963d717bffee0b1613ac79612221e'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 354, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb41c00-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '1e8471a70ed3146ac191ef478d9d328e33602086dcf8444be5e236deee767cd3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: -1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb42664-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '128cdf99f917e22e80d54fd8c3385fe4b0c2cefe1da5f26c85520ce5b10df48e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb431cc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '7cd1d8fac6a7f82da7209a9cb107f60e1c02ea2723204c36f60db215dfdf7a01'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.251272', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb43c1c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '399477b7a014879438f666cdac91097fb6eb4384e40bf53381b7433eb6ed685f'}]}, 'timestamp': '2025-10-08 15:34:36.253632', '_unique_id': 'f2a6c2580dd04236b33d7abb8a89a1af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.255 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.256 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.256 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.256 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.257 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e49f5cba-edbc-4e2a-87a0-4b739ada99c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.255837', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb49ebe-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'e5fbf614b0d06640ef86528fc604e56e7f325fc1d270491821db154fd35d9b64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.255837', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb4ac74-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '54b9bf159240f130c7d8014b45b165a01b2080185ea8f5991fb66df19e0a0527'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.255837', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 
'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb4b78c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '9dd52faf7939f91ff7eebf03df98c7ba064dc9988da13c5ab57cdea27155bb0b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.255837', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb4c2e0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': 'bf023f75fd79c79d6caacded98377516784630bb72b949474c1c9593e8f02c69'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.255837', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb4cf24-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': 'e88f1041b867ae11e07e8a4096fff25d2e3f2765a0d961966ba9bedb9f37b69e'}]}, 'timestamp': 
'2025-10-08 15:34:36.257413', '_unique_id': 'da373e514f0543b0b49f606a50ddbac3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.258 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.259 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes.delta volume: 47802708 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.259 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes.delta volume: 131562 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.259 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.260 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.260 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cb12cae-a59b-4cd8-9df2-1fa257a5378d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 47802708, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.259378', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb528ac-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '8af2fb8cdddcf0702e8b298f1bec3a7c6233c9145206cf546cf92825ade52c6d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 131562, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.259378', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb53450-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '79b9271cb7af2a5db164599393c01439a3f84824828191fa84ce988d75ef9e86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.259378', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb54030-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': 'd17426ce260dac40e0e6293e52f84f587e6879eddf926579b79026e20ba2b6a4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.259378', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb54b5c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': 'eb0755cc4b8d5926a9a6835739c31da747926146b0e9c2f4ff2228f0f0e72a34'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.259378', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb5562e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '9cd354e11fd3be978e5dc2818181c438a65068d5e4c85649de41b1363af19f8e'}]}, 'timestamp': '2025-10-08 15:34:36.260864', '_unique_id': '34ff12f22a3e49b2b0b751da9eea807c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.261 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.178 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.262 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.263 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.263 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.263 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0e88df4-1cf1-45de-8bb4-cdd2a5762c7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.262885', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb5b16e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '66274dba84bfda87836e6752db4c13cad0768798c5df6306c821cb78fbbfb815'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.262885', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb5be02-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '47f44c16ab0ac51afd36cac48c5ec85f1cca2504d6d1d6ed879918958c4a42aa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.262885', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 
'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb5c906-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '4626c3419af80201720e690c9e852fb409fe1c3193f89eebc2af398157ba67c3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.262885', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb5f962-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': 'c769222d37f1a280587e71f5b8e19df67912433889cad808a3264bdf81a8f1d0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.262885', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb60484-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '11fead56db5acb1486ac1e57e24297f951134623f5f48dd1b1ad34aaddaccbd7'}]}, 'timestamp': 
'2025-10-08 15:34:36.265289', '_unique_id': '645a1e530e30451e933ff70bcdfa895b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.265 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.266 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 154524672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.267 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.267 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.bytes volume: 135708672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.267 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.267 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.bytes volume: 72586240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.268 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.268 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.268 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '595f2a80-e243-4e2b-95aa-bf6214bce00b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 154524672, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb64b60-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': 'aadf8db6294d4e978fb0dcb160c297849bc2600083496be8c4ef9a9424057fb4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb65696-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '1d9105861af0da9905abd9db31f66b517a8c1af591a4e72d99c7c1b8b7cb9b3f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135708672, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb6603c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'dd25949802b9b46c41b8a3aeb21cf69db15d2ecf7ca5ab2a9900fe08c4e145dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb6696a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'f15908a74dac214aa4606e6c19c5b335c0b3f33dda1d74398f7995a8a88a8ee4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72586240, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb6711c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '29eea4e6282aaa3188ece5cda63581229232bd121336f9838d7ccb4e6abe7112'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'sta
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: e_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb67b58-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': 'ff0dec06ac3c491708b820c7c119e3d0f0f38227628ef342797c5af706e30be4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb684ae-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '68a304ac3be0d7aedde1bd876a11944ff3f4cdf52ab42ebfda60430b8e1ba594'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 
'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.266816', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb68e72-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': 'd78b301fa2fd07d1d1270cb6fa67532986ae4bb0e8007a0c9f94efad664a9d9d'}]}, 'timestamp': '2025-10-08 15:34:36.268839', '_unique_id': '895633deb74145afb6ed4eb6eee985b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.270 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.270 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.271 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.271 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.271 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aae93eb3-71c1-448c-981c-6a550ef1897e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.270546', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb6dc1a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'e307463e28b95aa4f79ab339f0faef539496424398883281a6b06a4c3d938a62'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.270546', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb6e75a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'c71403319be12f4cec7901cfeff62703cd9bdc25db1176e8a320ad60babd754e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.270546', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 
'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb6f178-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '0a22a054a89d9c864c924a11c97e99814e560ffe5595d23941503ad871c2881e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.270546', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb6faf6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': 'd093dc15747624789638a304b835363ca3a935c1ca572c9c1508a8a5b5032016'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.270546', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb70532-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '3d146fcd4b695f061063fa6cfc22a11109025112aa60a16a824d33ecd515248e'}]}, 'timestamp': 
'2025-10-08 15:34:36.271859', '_unique_id': 'b3ed976f80ef481fa9b64dd914922c1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.272 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.273 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 329442816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.273 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.273 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.273 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.274 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.bytes volume: 312660992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.274 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.274 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.274 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e55a9e6-8aec-41b1-a842-87d8b683abcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329442816, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb744fc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': 'c7f4636aa34afc36e5fca6ac3464db3f1096e56083c32cedb63b8de79727e25c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb74d08-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '714c7e269d5a634c05c9f8d6ac94104e238c2847dfdfafce8c5b956e4da879e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb754ec-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'f651d23422a71c8a0030290f830ced0d2aae44fc5f97892803634e1298b8fcea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb75c3a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': '37bb18ec5306112ddd38badac8bc6eea12dfdfdf8392abb5730fe541d13f1c05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 312660992, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb765ae-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': 'f6b93a8835986aa1510c9e7ea17ce62222be2212993e44fb0c75e2e9e788a487'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 
'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'act
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 1'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb76cf2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': 'da66d4d48a53aa7b74a1d1951b7d1cc6025acaedfef6c76ce528bcf37e8051f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb77530-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': 'fbac218449ef626851635561e52d379c27a4ad5a312d4cd29866df37a1b32eee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': 
None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.273260', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb77e18-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': 'd99f74affae457feb9123d2d2467ed9ddcdb349eec38796ebd8743726f7d9fd5'}]}, 'timestamp': '2025-10-08 15:34:36.274937', '_unique_id': '78d0c2a77a5845e8aa3c1c46c2733667'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.276 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets volume: 14639 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.276 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets volume: 504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.276 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.outgoing.packets volume: 37 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.277 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.277 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c552b048-5b9e-4711-9022-4af4bab0382b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14639, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.276287', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb7bce8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '46ca3981cc0e8abf5773a00ad3b198e2ce3ced8e3f161a176465a64d7aa8da49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 504, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.276287', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb7c7f6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '029393d96f8498c9158fa75077c353c4603da5dc199a968fdb043ff16fce954f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 37, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.276287', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 
'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb7d0ca-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '5e4220ccb96697af7ddbf3960122631b7c4ce7dc1809f0bddfb0e494e6c20785'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.276287', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb7dea8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': '76af85ea1cdac3cc13edefd95cfcc9ac2d939dcbbc82e945d2e33564f86b0cca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.276287', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb7e77c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '9961b2e68a8489a3a08be6491f38b8b213c079881c4d8481c2ecac8aff4170d0'}]}, 'timestamp': '2025-10-08 15:34:36.277678', '_unique_id': 
'91095ab22b4d45e4b7bf7a674ce75de6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.254 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.278 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes.delta volume: 32156631 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.279 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.bytes.delta volume: 88770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.279 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.279 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.279 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '085dfb08-dc43-4c39-983f-223918f295c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 32156631, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.278959', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb8255c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '08b8b81f6bcf6f55e676348124d151c92fe13014e9a991dffbe52cae0da43360'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 88770, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.278959', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb82d90-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'ea7207d6849a3f2f373e524b575d94755854301b39d78790648765bd0b5c1552'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.278959', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb83560-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '85ac5d332505c368b5ad4c5837a76cc152f7463fd56b5009d281f927969b9570'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.278959', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb83d08-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': '68566f8f034516007fe4823a4baa5132abeaf07c9bc135fc1a16ebb5492af3ad'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.278959', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb844a6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '2e1db6583476a0cec4a0df69e6199b3b072ff1bb0109115f3f7f84be0d096e9b'}]}, 'timestamp': '2025-10-08 15:34:36.280026', '_unique_id': '715af53a643b433a94d65e0e45ffcc26'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.280 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.281 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes volume: 47812993 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.281 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.bytes volume: 132706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.281 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.outgoing.bytes volume: 3826 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.282 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.outgoing.bytes volume: 2504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.282 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaf092ca-c464-4c80-b13a-376716b6305e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 47812993, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.281272', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb87f52-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'a54940324370f2529d1863a77238be725bf6b42f1f66c67dd4adae805b381b52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 132706, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.281272', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb88772-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '1a80468388c6f5efe2575a8cf7afba61b4398120bcade6b706e6d5e0a4f9bb05'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3826, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.281272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 
'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb8912c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '3d20484a54427e63eac20fe0df51103da24565552d3aad8a0b1237f3628bf88b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2504, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.281272', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb89cf8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': 'faf967a566969597b6bd75c700a7ee845cd0c23ba5ab2289f52c4a0cb1d90346'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.281272', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb8a7d4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '6b8a9ac20aa9dfe4ed3692fdac05d782d811aff01c1fa55bb7ab72165c8e6cad'}]}, 'timestamp': '2025-10-08 15:34:36.282638', '_unique_id': '1e0ea4e9e50f450dac6d5f24715c9ef3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets volume: 14402 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.incoming.packets volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.284 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.incoming.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.285 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.285 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c6dd880-f600-402d-80ca-3edb4f939c12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14402, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.284329', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bb8f55e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '44ea27f90bfe14b9103f18900ad28fef8b5c3d9dabb60dddbc6df32e8dea8f76'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 512, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.284329', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bb8fdba-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': 'cc5c68217b11e373634be05973d4543521f63636efaf68fcdbd1dc605c0861b4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.284329', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 
'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bb906a2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': 'ff3468d7960b1bd373aed7f0cf9b91d94fe8c2cfb445862a57b8a899d24254f6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.284329', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bb911e2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': '359e3af0c3ecd23e36c414ba007cbfdf2b5ed2ae9330075ca769a4ff60e9ddf0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.284329', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bb91c14-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': '439c2a22f22311be35f61b9fcf56aa3d06c04e33db490a3849e72d679f4dcdb3'}]}, 'timestamp': '2025-10-08 15:34:36.285587', '_unique_id': 
'a3205ccaeeb24e10b4894b5a4cf5ce54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.286 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_multicast_north_south-1395375184>, <NovaLikeServer: tempest-test_multicast_north_south-39687703>, <NovaLikeServer: vm2>]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.287 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.287 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 11679 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.287 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.287 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.requests volume: 11520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.287 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.288 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.requests volume: 11348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.288 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.288 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.288 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81983142-f2b6-4a9f-b2ef-24928403fe68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11679, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb96552-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '4167ef52ad747940810ee57330869949405c8c2d1271b4b4b108ebe1d324edcc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb96e1c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': '905e2e939a72cae0ae8a20c736d3dfd61af230a98ff17d9b2aff8c3320d5c7b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11520, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb9763c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'eaeb499befca32c0efb3caa1afad1e88603c8a0b935ebf88b276bd171619170d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb97da8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': '7596c3bcb4945bd32409938a6aaeebe7469a5b94e93cb87326e8dbb3979bf2df'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11348, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb98708-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '0eedef05db8fb66298e5694117f1aeaacb66d94d0498697541a3ef12cc322a20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemera
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb99180-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': 'c8c4bfe40cd259fb1a0930d03b15852207432e798fb2457c7329249ef223dd14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bb99c20-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '3e89bf622300929b1ee19cdee9a509a70854c86f3b3334c03c17c28e4593ea1b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 
'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.287205', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bb9a5f8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '0cfe634614fb68bd050c5df937e3a6f6e00fef22509fb60d4ccc684da8eca242'}]}, 'timestamp': '2025-10-08 15:34:36.289289', '_unique_id': '5c93c75834034f5ab8aecfd1fd8ef0cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.269 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.290 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.291 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/cpu volume: 54210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.291 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/cpu volume: 39150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.291 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/cpu volume: 32420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.291 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94ca05d1-3ad9-4519-92e2-1a05231dd8f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54210000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'timestamp': '2025-10-08T15:34:36.290996', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4bb9fc4c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.930518266, 'message_signature': '2d60f72166318030eff146623128f339e29df67004c1eacc0eadec481f0cdc6a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39150000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 
'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'timestamp': '2025-10-08T15:34:36.290996', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4bba0796-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.944512586, 'message_signature': '1c3ddb2855ddf2efd258a416a5a18c4fdbec4cdf2696fc1484c9d7a49f734112'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32420000000, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'timestamp': '2025-10-08T15:34:36.290996', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4bba120e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.9570573, 'message_signature': '0648e1b0374916178a9e738468a4d7eec17d0fcd3a1b0d4d299084480d4513a3'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'timestamp': '2025-10-08T15:34:36.290996', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4bba1cae-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.970399768, 'message_signature': '6f85a9bf6febc3df8f0151361f7831a585eb364684de5839a973c7ab47f24e9f'}]}, 'timestamp': '2025-10-08 15:34:36.292165', '_unique_id': 'e0ab52c35bd24daa94ba33a070235321'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.292 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.293 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.293 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 10566496752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.293 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.293 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.latency volume: 3709422225 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.294 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.294 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.latency volume: 3909292887 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.294 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.294 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.294 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f867b75d-8fbb-4c62-bb44-30fd948190ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10566496752, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bba578c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': 'f1896c33aa681994a954945eeb44a1488924d0a96cb5b3ac28b6060fa49070c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bba5f98-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.74018627, 'message_signature': 'e717a9fae3fc564cbe0a6ce4e3db21adc5ddfa8cf9c43a45bafe9bbdac765591'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3709422225, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bba67f4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': 'cdb43f0e6a162ad13e9721c296a8fa627a1ab3f9ee880e03fcb6679654135a4b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bba70fa-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.767261269, 'message_signature': '03a7003d51bfd67b861e4d8749ece30e309e39b1dffa34ac1169e46c95bb917c'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3909292887, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bba7870-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '6f6f4caa6dd2356fa94703521365c7325467567cada4ffc385868d385c40264c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 
'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, '
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 1-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bba7fbe-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.793168682, 'message_signature': '2b8bf34fbe22b3b3c8d4de5dcd1cdfab7c20af3efc3f1c542826370f05182f24'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bba89d2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '1324e4452180374f74c3a672bba2ce131d3aee7a76ab8022e3b85cbd24671f39'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.293398', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bba9346-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.815603373, 'message_signature': '8dd01c8d40b054894811acd2d5742e3339cc0b493c46b16ae2bad45fbb1746d9'}]}, 'timestamp': '2025-10-08 15:34:36.295177', '_unique_id': '26b73794a69f43e08a0a227198dde97e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.275 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.296 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 169607168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.296 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.296 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.usage volume: 153223168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.297 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.297 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.usage volume: 85196800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.297 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.297 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.usage volume: 196768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.297 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0675f907-2f4f-4b49-91c1-2b9b85a6b646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169607168, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-vda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bbacf8c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': 'cfec1cbec9e12e4f9a6cf2009e64ad5dcc6d3dbb5431154f707d9536e3c7b521'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '066ef28b-88ac-4f5c-acae-3458c3e19762-sda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'instance-0000002d', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bbad702-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.848054697, 'message_signature': '363873c284213b78e94f1556eb2515d8baa569619e471c53e0f1a045effe2fb5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153223168, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-vda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bbade5a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': '1045e1bcc45a7b7aeb855f37d0a6ed66da852ab8f27bf9306641ab094fde195d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986-sda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'instance-00000035', 'instance_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bbae6ac-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.862515851, 'message_signature': 'addf5fe27ee9c3c5b315e5e327d22f8c9145e46f8c0c7387a83e4f9d459c4bd4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 85196800, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-vda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bbaef76-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': '2626dc19ea221512ee54126005aee0b9a0a1ef65230921dd48b0b1b10b03bb6e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-sda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'instance-00000037', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 
'22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id':
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bbaf782-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.875353794, 'message_signature': '522078712697865ac8cbb2bb345592a60b5ca844ba096e3dcba4e1ce253a80ef'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196768, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4bbb00b0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': '065760a05b5327783f3cf4f4dc482d998fb5b1073d2b1784e266a1498a520a2b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 
'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:34:36.296471', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4bbb0812-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.889478988, 'message_signature': 'b6f8dceffecd61fade634234e1abf43ca21eef2bb93abc8f3f74e34d55be9d44'}]}, 'timestamp': '2025-10-08 15:34:36.298259', '_unique_id': '681013faf67648f0b164db6e3d5ec3a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.299 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.299 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.299 12 DEBUG ceilometer.compute.pollsters [-] 066ef28b-88ac-4f5c-acae-3458c3e19762/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.300 12 DEBUG ceilometer.compute.pollsters [-] d73d8a2e-011b-4f41-9734-d2bb2b068986/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.300 12 DEBUG ceilometer.compute.pollsters [-] 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.300 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '052f074f-1129-4c8e-8cba-4b26d65e54d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap8f7d5998-03', 'timestamp': '2025-10-08T15:34:36.299561', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap8f7d5998-03', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:85:7d:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f7d5998-03'}, 'message_id': '4bbb4868-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '948af60edf8e9b6509e9a5a42e7dffae6d9255e94e7685b67b41e04a8f2001af'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000002d-066ef28b-88ac-4f5c-acae-3458c3e19762-tap50d486c7-b0', 'timestamp': '2025-10-08T15:34:36.299561', 'resource_metadata': {'display_name': 'tempest-test_bw_limit_tenant_network-1685300098', 'name': 'tap50d486c7-b0', 'instance_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c9:e2:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50d486c7-b0'}, 'message_id': '4bbb50e2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.902030761, 'message_signature': '65efc08179067bda44625da66e36c72b8420f636e826dd23f85a31d076c9779d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000035-d73d8a2e-011b-4f41-9734-d2bb2b068986-tapbc705fd7-4e', 'timestamp': '2025-10-08T15:34:36.299561', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-1395375184', 'name': 'tapbc705fd7-4e', 'instance_id': 
'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:00:d6:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbc705fd7-4e'}, 'message_id': '4bbb5a10-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.905970078, 'message_signature': '0ad4ed3cae262ca40614b8d30a61fddc5fbcbc194028df336c2c7c5eeea51730'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c0c7c5c2dab54695b1cc0a34bdc4ee47', 'user_name': None, 'project_id': '496a37645ecf47b496dcf02c696ca64a', 'project_name': None, 'resource_id': 'instance-00000037-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-tapcfb829d2-b0', 'timestamp': '2025-10-08T15:34:36.299561', 'resource_metadata': {'display_name': 'tempest-test_multicast_north_south-39687703', 'name': 'tapcfb829d2-b0', 'instance_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'instance_type': 'custom_neutron_guest', 'host': 'fd46ab92dc87ca00129f7a6211ea2e87fe38da15995758baa1dc9cbd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b3:15:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcfb829d2-b0'}, 'message_id': '4bbb6212-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.908101866, 'message_signature': '558198d5be94d7ac5807fa3423475354bd1ea93873e5cee1daaae7dcb408bd84'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:34:36.299561', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '4bbb6a6e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4579.910167442, 'message_signature': 'ffcf5c06fa0f74ad3457ed30d4a01cbf678ae512445cc18fa855ce03ed522f06'}]}, 'timestamp': 
'2025-10-08 15:34:36.300657', '_unique_id': '511ee7bed791446fa1d2b2756d44455a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:34:36.301 12 ERROR oslo_messaging.notify.messaging 
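[Annotation] The traceback above ends in `kombu.exceptions.OperationalError: [Errno 111] Connection refused`: kombu's `retry_over_time` loop exhausted its retries against RabbitMQ and re-raised the socket-level refusal as a library error. A minimal stdlib sketch of that retry-then-reraise pattern (the function name `connect_with_retry` and the port number are illustrative, not kombu's API; errno 111 is ECONNREFUSED on Linux):

```python
import errno
import socket
import time

def connect_with_retry(host, port, max_retries=3, interval=0.1):
    """Retry a TCP connect, loosely mirroring kombu's retry_over_time loop.

    Re-raises the last ConnectionRefusedError once retries are exhausted,
    analogous to kombu wrapping it as OperationalError: [Errno 111].
    """
    last_exc = None
    for _attempt in range(max_retries):
        try:
            # create_connection raises ConnectionRefusedError (errno 111)
            # when nothing is listening on the target port.
            return socket.create_connection((host, port), timeout=1)
        except ConnectionRefusedError as exc:
            last_exc = exc
            time.sleep(interval)
    raise last_exc

try:
    # Port 1 on loopback is almost never listening, so this refuses fast.
    connect_with_retry("127.0.0.1", 1, max_retries=2, interval=0.01)
except ConnectionRefusedError as exc:
    assert exc.errno == errno.ECONNREFUSED
```

In the log, the refusal means the AMQP endpoint ceilometer is configured to notify was unreachable at 15:34:36; the agent's pool recreated the connection and hit the refusal during `ensure_connection`.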
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.290 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.295 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Oct  8 11:34:36 np0005476733 rsyslogd[1005]: message too long (8192) with configured size 8096, begin of message is: 2025-10-08 15:34:36.298 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
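[Annotation] The three rsyslogd warnings mean the oslo.messaging tracebacks arrived as ~8192-byte messages while rsyslog's configured maximum is 8096 bytes, so the tail of each "Could not s[end notification]" error was truncated (see rsyslog error code 2445). If the full tracebacks are wanted, the limit can be raised in /etc/rsyslog.conf; a minimal fragment, with an illustrative 16k value (the directive must appear before any input modules are loaded):

```
# /etc/rsyslog.conf -- raise the per-message size limit (default is small
# enough that long Python tracebacks get truncated). Place before module loads.
global(maxMessageSize="16k")
```

The legacy equivalent is `$MaxMessageSize 16k`.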
Oct  8 11:34:36 np0005476733 nova_compute[192580]: 2025-10-08 15:34:36.775 2 DEBUG nova.compute.manager [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-changed-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:36 np0005476733 nova_compute[192580]: 2025-10-08 15:34:36.776 2 DEBUG nova.compute.manager [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Refreshing instance network info cache due to event network-changed-cfb829d2-b09f-4c87-8adf-76c33a6a438b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:34:36 np0005476733 nova_compute[192580]: 2025-10-08 15:34:36.776 2 DEBUG oslo_concurrency.lockutils [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:34:36 np0005476733 nova_compute[192580]: 2025-10-08 15:34:36.776 2 DEBUG oslo_concurrency.lockutils [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:36 np0005476733 nova_compute[192580]: 2025-10-08 15:34:36.777 2 DEBUG nova.network.neutron [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Refreshing network info cache for port cfb829d2-b09f-4c87-8adf-76c33a6a438b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.890 2 DEBUG nova.compute.manager [req-4e749ecd-f77a-45c9-8e6a-38b3c0b8c6d0 req-843662aa-73a3-4110-b8d9-b52c6b0db444 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.891 2 DEBUG oslo_concurrency.lockutils [req-4e749ecd-f77a-45c9-8e6a-38b3c0b8c6d0 req-843662aa-73a3-4110-b8d9-b52c6b0db444 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.891 2 DEBUG oslo_concurrency.lockutils [req-4e749ecd-f77a-45c9-8e6a-38b3c0b8c6d0 req-843662aa-73a3-4110-b8d9-b52c6b0db444 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.891 2 DEBUG oslo_concurrency.lockutils [req-4e749ecd-f77a-45c9-8e6a-38b3c0b8c6d0 req-843662aa-73a3-4110-b8d9-b52c6b0db444 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.892 2 DEBUG nova.compute.manager [req-4e749ecd-f77a-45c9-8e6a-38b3c0b8c6d0 req-843662aa-73a3-4110-b8d9-b52c6b0db444 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Processing event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.893 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.898 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937678.8984833, 8a310a2e-17af-42b8-a212-cf0a278e20c7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.899 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.902 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.906 2 INFO nova.virt.libvirt.driver [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Instance spawned successfully.#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.906 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.944 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:38 np0005476733 nova_compute[192580]: 2025-10-08 15:34:38.950 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.090 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.092 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.092 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.093 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.094 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.094 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.095 2 DEBUG nova.virt.libvirt.driver [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.210 2 INFO nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Took 12.79 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.211 2 DEBUG nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.307 2 INFO nova.compute.manager [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Took 13.48 seconds to build instance.#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.337 2 DEBUG nova.network.neutron [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updated VIF entry in instance network info cache for port cfb829d2-b09f-4c87-8adf-76c33a6a438b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.338 2 DEBUG nova.network.neutron [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updating instance_info_cache with network_info: [{"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.509 2 DEBUG oslo_concurrency.lockutils [None req-44065e25-5acd-4c97-84e6-d83f2e71411a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.588 2 DEBUG oslo_concurrency.lockutils [req-9802f9a8-7dc2-43a4-acfd-f72e21b7535a req-6bfc58d5-f96f-4229-bc74-eade3333ed32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:39 np0005476733 nova_compute[192580]: 2025-10-08 15:34:39.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:40 np0005476733 nova_compute[192580]: 2025-10-08 15:34:40.347 2 INFO nova.compute.manager [None req-a12b10aa-7908-4649-9737-6a48e174abd5 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:34:40 np0005476733 nova_compute[192580]: 2025-10-08 15:34:40.353 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:40 np0005476733 nova_compute[192580]: 2025-10-08 15:34:40.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.021 2 DEBUG nova.compute.manager [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.022 2 DEBUG oslo_concurrency.lockutils [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.023 2 DEBUG oslo_concurrency.lockutils [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.024 2 DEBUG oslo_concurrency.lockutils [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.024 2 DEBUG nova.compute.manager [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] No waiting events found dispatching network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:34:41 np0005476733 nova_compute[192580]: 2025-10-08 15:34:41.025 2 WARNING nova.compute.manager [req-aaa26cbd-ad7e-4e7f-9791-348b82ba0c6d req-b5f6734c-ad34-4809-b850-782af7390ae8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received unexpected event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:34:42 np0005476733 podman[234602]: 2025-10-08 15:34:42.2476722 +0000 UTC m=+0.072901804 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:34:43 np0005476733 nova_compute[192580]: 2025-10-08 15:34:43.138 2 DEBUG nova.compute.manager [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:43 np0005476733 nova_compute[192580]: 2025-10-08 15:34:43.139 2 DEBUG nova.compute.manager [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:34:43 np0005476733 nova_compute[192580]: 2025-10-08 15:34:43.139 2 DEBUG oslo_concurrency.lockutils [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:34:43 np0005476733 nova_compute[192580]: 2025-10-08 15:34:43.140 2 DEBUG oslo_concurrency.lockutils [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:43 np0005476733 nova_compute[192580]: 2025-10-08 15:34:43.140 2 DEBUG nova.network.neutron [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:34:44 np0005476733 nova_compute[192580]: 2025-10-08 15:34:44.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.524 2 INFO nova.compute.manager [None req-355f3883-5792-4200-8c97-fd99cdd0032a 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.529 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.844 2 DEBUG nova.network.neutron [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.844 2 DEBUG nova.network.neutron [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:45 np0005476733 nova_compute[192580]: 2025-10-08 15:34:45.870 2 DEBUG oslo_concurrency.lockutils [req-123f5ab0-78d0-4f63-a3a4-b3fbab8ad056 req-3503b78e-98e2-4e6d-a95b-1ac0127d830e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:46 np0005476733 podman[234622]: 2025-10-08 15:34:46.243997351 +0000 UTC m=+0.057793558 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:34:46 np0005476733 podman[234621]: 2025-10-08 15:34:46.263067444 +0000 UTC m=+0.083819895 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 11:34:46 np0005476733 nova_compute[192580]: 2025-10-08 15:34:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.623 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.735 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.836 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.838 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.899 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.906 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.965 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:48 np0005476733 nova_compute[192580]: 2025-10-08 15:34:48.966 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.038 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.044 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.111 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.113 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.176 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.183 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.242 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.243 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.304 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.532 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.533 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11027MB free_disk=110.90083694458008GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.534 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.534 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.703 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 066ef28b-88ac-4f5c-acae-3458c3e19762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.704 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d73d8a2e-011b-4f41-9734-d2bb2b068986 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.704 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.704 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8a310a2e-17af-42b8-a212-cf0a278e20c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.705 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.705 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=4608MB phys_disk=119GB used_disk=40GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.899 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.917 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.943 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:34:49 np0005476733 nova_compute[192580]: 2025-10-08 15:34:49.944 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:50 np0005476733 nova_compute[192580]: 2025-10-08 15:34:50.661 2 INFO nova.compute.manager [None req-dafd5ebe-6f2c-4ff4-a24a-b0939150215c 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:34:50 np0005476733 nova_compute[192580]: 2025-10-08 15:34:50.669 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:50 np0005476733 nova_compute[192580]: 2025-10-08 15:34:50.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:50 np0005476733 nova_compute[192580]: 2025-10-08 15:34:50.945 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:50 np0005476733 nova_compute[192580]: 2025-10-08 15:34:50.946 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:51 np0005476733 nova_compute[192580]: 2025-10-08 15:34:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:51 np0005476733 nova_compute[192580]: 2025-10-08 15:34:51.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:34:52 np0005476733 nova_compute[192580]: 2025-10-08 15:34:52.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:53 np0005476733 podman[234701]: 2025-10-08 15:34:53.259183181 +0000 UTC m=+0.071852340 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:34:53 np0005476733 podman[234702]: 2025-10-08 15:34:53.266931831 +0000 UTC m=+0.070073353 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:34:53 np0005476733 podman[234700]: 2025-10-08 15:34:53.276009262 +0000 UTC m=+0.089598581 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:34:54 np0005476733 nova_compute[192580]: 2025-10-08 15:34:54.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:55 np0005476733 nova_compute[192580]: 2025-10-08 15:34:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:55 np0005476733 nova_compute[192580]: 2025-10-08 15:34:55.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:55 np0005476733 nova_compute[192580]: 2025-10-08 15:34:55.841 2 INFO nova.compute.manager [None req-abd1dddc-4a06-4cb5-aba8-b34caa260f60 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:34:55 np0005476733 nova_compute[192580]: 2025-10-08 15:34:55.846 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.534 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.535 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.535 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.536 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.536 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.538 2 INFO nova.compute.manager [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Terminating instance#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.539 2 DEBUG nova.compute.manager [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:34:56 np0005476733 kernel: tapcfb829d2-b0 (unregistering): left promiscuous mode
Oct  8 11:34:56 np0005476733 NetworkManager[51699]: <info>  [1759937696.5828] device (tapcfb829d2-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:56Z|00455|binding|INFO|Releasing lport cfb829d2-b09f-4c87-8adf-76c33a6a438b from this chassis (sb_readonly=0)
Oct  8 11:34:56 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:56Z|00456|binding|INFO|Setting lport cfb829d2-b09f-4c87-8adf-76c33a6a438b down in Southbound
Oct  8 11:34:56 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:56Z|00457|binding|INFO|Removing iface tapcfb829d2-b0 ovn-installed in OVS
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.602 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:15:9d 10.100.0.7'], port_security=['fa:16:3e:b3:15:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '6700973e-9d22-4d4a-8d39-ae92bc3bd6e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=cfb829d2-b09f-4c87-8adf-76c33a6a438b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.604 103739 INFO neutron.agent.ovn.metadata.agent [-] Port cfb829d2-b09f-4c87-8adf-76c33a6a438b in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.609 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.632 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2a885d-9d49-4de6-b315-b825c39046ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.663 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[903967fc-0596-48ea-851f-fca7e221edc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.666 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[42a3e88b-41b4-41a3-90be-590dbe0c731d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000037.scope: Deactivated successfully.
Oct  8 11:34:56 np0005476733 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000037.scope: Consumed 44.440s CPU time.
Oct  8 11:34:56 np0005476733 systemd-machined[152624]: Machine qemu-32-instance-00000037 terminated.
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.699 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[47c00a6e-f27d-4dd3-a81c-4bda57736ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.718 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b90d974a-4678-4334-93cd-313ee32b5170]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30cdfb1e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:3e:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449303, 'reachable_time': 23840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234781, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.734 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1306df10-9906-4e51-8c5c-25315d622f4f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449316, 'tstamp': 449316}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234782, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap30cdfb1e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449319, 'tstamp': 449319}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234782, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.736 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.743 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30cdfb1e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.743 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.743 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30cdfb1e-70, col_values=(('external_ids', {'iface-id': '76302563-91ae-48df-adce-3edec8d5a578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:56.744 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.826 2 INFO nova.virt.libvirt.driver [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Instance destroyed successfully.#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.827 2 DEBUG nova.objects.instance [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.839 2 DEBUG nova.virt.libvirt.vif [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:33:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_multicast_north_south-39687703',display_name='tempest-test_multicast_north_south-39687703',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-39687703',id=55,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:34:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-08rfza5j',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:34:01Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=6700973e-9d22-4d4a-8d39-ae92bc3bd6e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.840 2 DEBUG nova.network.os_vif_util [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "address": "fa:16:3e:b3:15:9d", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfb829d2-b0", "ovs_interfaceid": "cfb829d2-b09f-4c87-8adf-76c33a6a438b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.840 2 DEBUG nova.network.os_vif_util [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.841 2 DEBUG os_vif [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfb829d2-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.850 2 INFO os_vif [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:15:9d,bridge_name='br-int',has_traffic_filtering=True,id=cfb829d2-b09f-4c87-8adf-76c33a6a438b,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfb829d2-b0')#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.851 2 INFO nova.virt.libvirt.driver [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Deleting instance files /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2_del#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.852 2 INFO nova.virt.libvirt.driver [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Deletion of /var/lib/nova/instances/6700973e-9d22-4d4a-8d39-ae92bc3bd6e2_del complete#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.899 2 INFO nova.compute.manager [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.899 2 DEBUG oslo.service.loopingcall [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.900 2 DEBUG nova.compute.manager [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:34:56 np0005476733 nova_compute[192580]: 2025-10-08 15:34:56.900 2 DEBUG nova.network.neutron [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:34:57 np0005476733 nova_compute[192580]: 2025-10-08 15:34:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:57 np0005476733 nova_compute[192580]: 2025-10-08 15:34:57.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.511 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.512 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.513 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.760 2 DEBUG nova.compute.manager [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-unplugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.760 2 DEBUG oslo_concurrency.lockutils [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.761 2 DEBUG oslo_concurrency.lockutils [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.761 2 DEBUG oslo_concurrency.lockutils [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.761 2 DEBUG nova.compute.manager [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] No waiting events found dispatching network-vif-unplugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:34:58 np0005476733 nova_compute[192580]: 2025-10-08 15:34:58.761 2 DEBUG nova.compute.manager [req-d1887494-e221-4bd7-87e1-ffc30c282908 req-ebded04c-7150-4849-bfcc-e8b381e183eb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-unplugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.153 2 DEBUG nova.network.neutron [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.177 2 INFO nova.compute.manager [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Took 2.28 seconds to deallocate network for instance.#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.221 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.222 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.355 2 DEBUG nova.compute.provider_tree [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.377 2 DEBUG nova.scheduler.client.report [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.405 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.430 2 INFO nova.scheduler.client.report [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.512 2 DEBUG oslo_concurrency.lockutils [None req-1f6d5205-4f1c-4e36-9c3c-e79c0eac0e23 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.849 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.849 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.850 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.850 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.851 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.852 2 INFO nova.compute.manager [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Terminating instance#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.853 2 DEBUG nova.compute.manager [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:34:59 np0005476733 kernel: tapbc705fd7-4e (unregistering): left promiscuous mode
Oct  8 11:34:59 np0005476733 NetworkManager[51699]: <info>  [1759937699.8803] device (tapbc705fd7-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:59Z|00458|binding|INFO|Releasing lport bc705fd7-4e51-4032-817d-a3554b18a7d9 from this chassis (sb_readonly=0)
Oct  8 11:34:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:59Z|00459|binding|INFO|Setting lport bc705fd7-4e51-4032-817d-a3554b18a7d9 down in Southbound
Oct  8 11:34:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:34:59Z|00460|binding|INFO|Removing iface tapbc705fd7-4e ovn-installed in OVS
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:59.901 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:d6:7c 10.100.0.10'], port_security=['fa:16:3e:00:d6:7c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd73d8a2e-011b-4f41-9734-d2bb2b068986', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '496a37645ecf47b496dcf02c696ca64a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '023a0cd3-fdca-4dff-ba80-8ef557b384c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b3d4cc6-3768-451b-b35e-6b2333c921fd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=bc705fd7-4e51-4032-817d-a3554b18a7d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:34:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:59.902 103739 INFO neutron.agent.ovn.metadata.agent [-] Port bc705fd7-4e51-4032-817d-a3554b18a7d9 in datapath 30cdfb1e-750a-4d0e-9e9c-321b06b371b9 unbound from our chassis#033[00m
Oct  8 11:34:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:59.904 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30cdfb1e-750a-4d0e-9e9c-321b06b371b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:34:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:59.906 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8179e3d0-fdb0-48df-8185-9c25df5a7e49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:34:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:34:59.906 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 namespace which is not needed anymore#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.918 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updating instance_info_cache with network_info: [{"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.936 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-d73d8a2e-011b-4f41-9734-d2bb2b068986" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.937 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.937 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:34:59 np0005476733 nova_compute[192580]: 2025-10-08 15:34:59.937 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:34:59 np0005476733 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000035.scope: Deactivated successfully.
Oct  8 11:34:59 np0005476733 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000035.scope: Consumed 44.712s CPU time.
Oct  8 11:34:59 np0005476733 systemd-machined[152624]: Machine qemu-31-instance-00000035 terminated.
Oct  8 11:35:00 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [NOTICE]   (233715) : haproxy version is 2.8.14-c23fe91
Oct  8 11:35:00 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [NOTICE]   (233715) : path to executable is /usr/sbin/haproxy
Oct  8 11:35:00 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [WARNING]  (233715) : Exiting Master process...
Oct  8 11:35:00 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [ALERT]    (233715) : Current worker (233717) exited with code 143 (Terminated)
Oct  8 11:35:00 np0005476733 neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9[233711]: [WARNING]  (233715) : All workers exited. Exiting... (0)
Oct  8 11:35:00 np0005476733 systemd[1]: libpod-bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1.scope: Deactivated successfully.
Oct  8 11:35:00 np0005476733 podman[234824]: 2025-10-08 15:35:00.063549276 +0000 UTC m=+0.048613203 container died bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:35:00 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1-userdata-shm.mount: Deactivated successfully.
Oct  8 11:35:00 np0005476733 systemd[1]: var-lib-containers-storage-overlay-65124f7a483a3b3183b8c70e44a7c847783d99276680556f3f641efadd99ce8f-merged.mount: Deactivated successfully.
Oct  8 11:35:00 np0005476733 podman[234824]: 2025-10-08 15:35:00.110680111 +0000 UTC m=+0.095744028 container cleanup bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:35:00 np0005476733 systemd[1]: libpod-conmon-bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1.scope: Deactivated successfully.
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.126 2 INFO nova.virt.libvirt.driver [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Instance destroyed successfully.#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.126 2 DEBUG nova.objects.instance [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lazy-loading 'resources' on Instance uuid d73d8a2e-011b-4f41-9734-d2bb2b068986 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.150 2 DEBUG nova.virt.libvirt.vif [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:33:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_multicast_north_south-1395375184',display_name='tempest-test_multicast_north_south-1395375184',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-multicast-north-south-1395375184',id=53,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaTUyIW7HAi8eLb2uxsb3hQ01QNiqMtiwd2QQElMyFusiyPekoP+eGZG5apcvUeJj+ezHykEE9e9GalqeB/Pt0gdiMZz/nmUCtHv59KRRGG4S5F2fPmbxlRdJaDztvzVg==',key_name='tempest-keypair-test-1272869518',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:33:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='496a37645ecf47b496dcf02c696ca64a',ramdisk_id='',reservation_id='r-a0huy3g9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Ovn-1993668591',owner_user_name='tempest-MulticastTestIPv4Ovn-1993668591-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:33:10Z,user_data=None,user_id='c0c7c5c2dab54695b1cc0a34bdc4ee47',uuid=d73d8a2e-011b-4f41-9734-d2bb2b068986,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.151 2 DEBUG nova.network.os_vif_util [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converting VIF {"id": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "address": "fa:16:3e:00:d6:7c", "network": {"id": "30cdfb1e-750a-4d0e-9e9c-321b06b371b9", "bridge": "br-int", "label": "tempest-test-network--366823925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "496a37645ecf47b496dcf02c696ca64a", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc705fd7-4e", "ovs_interfaceid": "bc705fd7-4e51-4032-817d-a3554b18a7d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.152 2 DEBUG nova.network.os_vif_util [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.152 2 DEBUG os_vif [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.154 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc705fd7-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.161 2 INFO os_vif [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:d6:7c,bridge_name='br-int',has_traffic_filtering=True,id=bc705fd7-4e51-4032-817d-a3554b18a7d9,network=Network(30cdfb1e-750a-4d0e-9e9c-321b06b371b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc705fd7-4e')#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.162 2 INFO nova.virt.libvirt.driver [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Deleting instance files /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986_del#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.162 2 INFO nova.virt.libvirt.driver [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Deletion of /var/lib/nova/instances/d73d8a2e-011b-4f41-9734-d2bb2b068986_del complete#033[00m
Oct  8 11:35:00 np0005476733 podman[234870]: 2025-10-08 15:35:00.187072996 +0000 UTC m=+0.048438438 container remove bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.194 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec71d68-fa58-4e0a-8b79-6d036846e5c7]: (4, ('Wed Oct  8 03:35:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1)\nbba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1\nWed Oct  8 03:35:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 (bba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1)\nbba8d4de3bfa484e141c5d1032593cae4f1bc11710796874a2504c8c8251faf1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.197 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a04d93-0e44-4c46-9c27-fea01151c7bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.198 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30cdfb1e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:00 np0005476733 kernel: tap30cdfb1e-70: left promiscuous mode
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.205 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e907a2-baeb-44d6-b7a2-f2973216a204]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.206 2 DEBUG nova.compute.manager [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-unplugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.207 2 DEBUG oslo_concurrency.lockutils [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.207 2 DEBUG oslo_concurrency.lockutils [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.207 2 DEBUG oslo_concurrency.lockutils [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.208 2 DEBUG nova.compute.manager [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] No waiting events found dispatching network-vif-unplugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.208 2 DEBUG nova.compute.manager [req-d0d5630c-0487-4ae8-aa74-8cd21fa4caad req-01f39fd6-7004-4c69-9303-1fd182b3924b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-unplugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.231 2 INFO nova.compute.manager [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.233 2 DEBUG oslo.service.loopingcall [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.234 2 DEBUG nova.compute.manager [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.234 2 DEBUG nova.network.neutron [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.244 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[57f9ad4c-6acd-4423-9307-ba8668768757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.246 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9055d1ca-56bd-42e8-a162-54c6eae9de5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.262 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c69f50b3-7bae-48ec-ac7d-be3a82814da7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449291, 'reachable_time': 35024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234885, 'error': None, 'target': 'ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.265 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-30cdfb1e-750a-4d0e-9e9c-321b06b371b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.266 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[399b4296-eca1-407c-8639-408abfa7bdc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:00 np0005476733 systemd[1]: run-netns-ovnmeta\x2d30cdfb1e\x2d750a\x2d4d0e\x2d9e9c\x2d321b06b371b9.mount: Deactivated successfully.
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.316 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:35:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:00.317 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.599 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.872 2 DEBUG nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-deleted-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.872 2 DEBUG nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.873 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.873 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.874 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "6700973e-9d22-4d4a-8d39-ae92bc3bd6e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.874 2 DEBUG nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] No waiting events found dispatching network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.874 2 WARNING nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Received unexpected event network-vif-plugged-cfb829d2-b09f-4c87-8adf-76c33a6a438b for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.874 2 DEBUG nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.875 2 DEBUG nova.compute.manager [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.875 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.875 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.875 2 DEBUG nova.network.neutron [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.962 2 INFO nova.compute.manager [None req-b58be43e-3b90-44c0-a8c0-5b46db5ad143 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:35:00 np0005476733 nova_compute[192580]: 2025-10-08 15:35:00.969 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.146 2 DEBUG nova.network.neutron [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.165 2 INFO nova.compute.manager [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Took 1.93 seconds to deallocate network for instance.#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.302 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.303 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.306 2 DEBUG nova.compute.manager [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.306 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.307 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.307 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.307 2 DEBUG nova.compute.manager [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] No waiting events found dispatching network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.307 2 WARNING nova.compute.manager [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received unexpected event network-vif-plugged-bc705fd7-4e51-4032-817d-a3554b18a7d9 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.308 2 DEBUG nova.compute.manager [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.308 2 DEBUG nova.compute.manager [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.308 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:35:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:02.319 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.395 2 DEBUG nova.compute.provider_tree [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.411 2 DEBUG nova.scheduler.client.report [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.430 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.455 2 INFO nova.scheduler.client.report [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Deleted allocations for instance d73d8a2e-011b-4f41-9734-d2bb2b068986#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.517 2 DEBUG oslo_concurrency.lockutils [None req-d53b4d9e-3cf9-4b8d-a0da-18e95284f968 c0c7c5c2dab54695b1cc0a34bdc4ee47 496a37645ecf47b496dcf02c696ca64a - - default default] Lock "d73d8a2e-011b-4f41-9734-d2bb2b068986" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:02 np0005476733 nova_compute[192580]: 2025-10-08 15:35:02.947 2 DEBUG nova.compute.manager [req-5826acea-62a2-4978-a21d-aee04711d83a req-f2d92583-ff64-49ef-81a6-30849aeff467 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Received event network-vif-deleted-bc705fd7-4e51-4032-817d-a3554b18a7d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:03 np0005476733 podman[234887]: 2025-10-08 15:35:03.27205915 +0000 UTC m=+0.080751166 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:35:03 np0005476733 podman[234886]: 2025-10-08 15:35:03.280537013 +0000 UTC m=+0.091113790 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:35:03 np0005476733 nova_compute[192580]: 2025-10-08 15:35:03.469 2 DEBUG nova.network.neutron [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:35:03 np0005476733 nova_compute[192580]: 2025-10-08 15:35:03.470 2 DEBUG nova.network.neutron [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:35:03 np0005476733 nova_compute[192580]: 2025-10-08 15:35:03.489 2 DEBUG oslo_concurrency.lockutils [req-906e406c-f278-4fd2-a0e0-4b0b0d680f2d req-1dcb76ff-17ea-41fe-8377-351f229a72aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:35:03 np0005476733 nova_compute[192580]: 2025-10-08 15:35:03.491 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:35:03 np0005476733 nova_compute[192580]: 2025-10-08 15:35:03.491 2 DEBUG nova.network.neutron [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:35:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:03Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:8e:11 192.168.100.73
Oct  8 11:35:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:03Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:8e:11 192.168.100.73
Oct  8 11:35:04 np0005476733 nova_compute[192580]: 2025-10-08 15:35:04.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.053 2 DEBUG nova.compute.manager [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.054 2 DEBUG nova.compute.manager [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing instance network info cache due to event network-changed-8f7d5998-037f-4a70-98a0-8482a8043a7e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.054 2 DEBUG oslo_concurrency.lockutils [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.787 2 DEBUG nova.network.neutron [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.789 2 DEBUG nova.network.neutron [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.810 2 DEBUG oslo_concurrency.lockutils [req-844a1692-0d96-42cb-9ab2-4229bc6689a9 req-09aa7c95-5245-4d0b-9b7f-74e624d7b457 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.811 2 DEBUG oslo_concurrency.lockutils [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:35:05 np0005476733 nova_compute[192580]: 2025-10-08 15:35:05.812 2 DEBUG nova.network.neutron [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Refreshing network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:35:06 np0005476733 nova_compute[192580]: 2025-10-08 15:35:06.156 2 INFO nova.compute.manager [None req-82150c69-0a34-4fd2-97e6-3dd958848bd3 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:35:06 np0005476733 nova_compute[192580]: 2025-10-08 15:35:06.167 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:35:08 np0005476733 nova_compute[192580]: 2025-10-08 15:35:08.185 2 DEBUG nova.network.neutron [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updated VIF entry in instance network info cache for port 8f7d5998-037f-4a70-98a0-8482a8043a7e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:35:08 np0005476733 nova_compute[192580]: 2025-10-08 15:35:08.187 2 DEBUG nova.network.neutron [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [{"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:35:08 np0005476733 nova_compute[192580]: 2025-10-08 15:35:08.206 2 DEBUG oslo_concurrency.lockutils [req-e88c19cc-04cd-4aa6-a7ea-db223262b7fb req-454b1ce8-3953-40e2-802b-ea4f67fb645d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-066ef28b-88ac-4f5c-acae-3458c3e19762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:35:09 np0005476733 nova_compute[192580]: 2025-10-08 15:35:09.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:10 np0005476733 nova_compute[192580]: 2025-10-08 15:35:10.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.315 2 INFO nova.compute.manager [None req-6acddf5d-f413-45ed-bce7-be944a549f53 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.322 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.328 2 INFO nova.virt.libvirt.driver [None req-6acddf5d-f413-45ed-bce7-be944a549f53 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Truncated console log returned, 2662 bytes ignored#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.614 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.615 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.615 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.616 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.617 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.619 2 INFO nova.compute.manager [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Terminating instance#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.620 2 DEBUG nova.compute.manager [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:35:11 np0005476733 kernel: tap8f7d5998-03 (unregistering): left promiscuous mode
Oct  8 11:35:11 np0005476733 NetworkManager[51699]: <info>  [1759937711.6630] device (tap8f7d5998-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00461|binding|INFO|Releasing lport 8f7d5998-037f-4a70-98a0-8482a8043a7e from this chassis (sb_readonly=0)
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00462|binding|INFO|Setting lport 8f7d5998-037f-4a70-98a0-8482a8043a7e down in Southbound
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00463|binding|INFO|Removing iface tap8f7d5998-03 ovn-installed in OVS
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.687 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:7d:15 192.168.3.176'], port_security=['fa:16:3e:85:7d:15 192.168.3.176'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.3.176/24', 'neutron:device_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b01ceb45-280a-4b94-9dbb-432344b9bd77, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8f7d5998-037f-4a70-98a0-8482a8043a7e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.690 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8f7d5998-037f-4a70-98a0-8482a8043a7e in datapath f81b33e3-d2f7-4437-b8c9-c9a54931fb61 unbound from our chassis#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.695 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f81b33e3-d2f7-4437-b8c9-c9a54931fb61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.702 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[02541259-4867-4e04-a7f0-dc29cf30dd65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.702 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61 namespace which is not needed anymore#033[00m
Oct  8 11:35:11 np0005476733 kernel: tap50d486c7-b0 (unregistering): left promiscuous mode
Oct  8 11:35:11 np0005476733 NetworkManager[51699]: <info>  [1759937711.7118] device (tap50d486c7-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00464|binding|INFO|Releasing lport 50d486c7-b030-4d82-8b22-2f71cd277074 from this chassis (sb_readonly=0)
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00465|binding|INFO|Setting lport 50d486c7-b030-4d82-8b22-2f71cd277074 down in Southbound
Oct  8 11:35:11 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:11Z|00466|binding|INFO|Removing iface tap50d486c7-b0 ovn-installed in OVS
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:11.733 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:e2:37 10.100.0.12'], port_security=['fa:16:3e:c9:e2:37 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '066ef28b-88ac-4f5c-acae-3458c3e19762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=50d486c7-b030-4d82-8b22-2f71cd277074) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Oct  8 11:35:11 np0005476733 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000002d.scope: Consumed 1min 1.934s CPU time.
Oct  8 11:35:11 np0005476733 systemd-machined[152624]: Machine qemu-27-instance-0000002d terminated.
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.825 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937696.824433, 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.826 2 INFO nova.compute.manager [-] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.847 2 DEBUG nova.compute.manager [None req-94694b6e-1f65-433a-987b-03f29d20c9d6 - - - - - -] [instance: 6700973e-9d22-4d4a-8d39-ae92bc3bd6e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:35:11 np0005476733 NetworkManager[51699]: <info>  [1759937711.8588] manager: (tap50d486c7-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [NOTICE]   (231700) : haproxy version is 2.8.14-c23fe91
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [NOTICE]   (231700) : path to executable is /usr/sbin/haproxy
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [WARNING]  (231700) : Exiting Master process...
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [WARNING]  (231700) : Exiting Master process...
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [ALERT]    (231700) : Current worker (231702) exited with code 143 (Terminated)
Oct  8 11:35:11 np0005476733 neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61[231696]: [WARNING]  (231700) : All workers exited. Exiting... (0)
Oct  8 11:35:11 np0005476733 systemd[1]: libpod-80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22.scope: Deactivated successfully.
Oct  8 11:35:11 np0005476733 podman[234956]: 2025-10-08 15:35:11.888361077 +0000 UTC m=+0.071629783 container died 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.917 2 INFO nova.virt.libvirt.driver [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Instance destroyed successfully.#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.918 2 DEBUG nova.objects.instance [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid 066ef28b-88ac-4f5c-acae-3458c3e19762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.932 2 DEBUG nova.virt.libvirt.vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:35Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": 
[{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.933 2 DEBUG nova.network.os_vif_util [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "address": "fa:16:3e:85:7d:15", "network": {"id": "f81b33e3-d2f7-4437-b8c9-c9a54931fb61", "bridge": "br-int", "label": "tempest-test-network--416037603", "subnets": [{"cidr": "192.168.3.0/24", "dns": [], "gateway": {"address": "192.168.3.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.3.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f7d5998-03", "ovs_interfaceid": "8f7d5998-037f-4a70-98a0-8482a8043a7e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.934 2 DEBUG nova.network.os_vif_util [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.935 2 DEBUG os_vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.936 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f7d5998-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22-userdata-shm.mount: Deactivated successfully.
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:35:11 np0005476733 systemd[1]: var-lib-containers-storage-overlay-0f0208bda96c5ca82475315efd68ab0d43c824e09e12c2b392457156a237e197-merged.mount: Deactivated successfully.
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.945 2 INFO os_vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:7d:15,bridge_name='br-int',has_traffic_filtering=True,id=8f7d5998-037f-4a70-98a0-8482a8043a7e,network=Network(f81b33e3-d2f7-4437-b8c9-c9a54931fb61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f7d5998-03')#033[00m
Oct  8 11:35:11 np0005476733 podman[234956]: 2025-10-08 15:35:11.946252717 +0000 UTC m=+0.129521393 container cleanup 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.946 2 DEBUG nova.virt.libvirt.vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_bw_limit_tenant_network-1685300098',display_name='tempest-test_bw_limit_tenant_network-1685300098',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-bw-limit-tenant-network-1685300098',id=45,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:30:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1p84nw3a',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:30:35Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=066ef28b-88ac-4f5c-acae-3458c3e19762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.947 2 DEBUG nova.network.os_vif_util [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "50d486c7-b030-4d82-8b22-2f71cd277074", "address": "fa:16:3e:c9:e2:37", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50d486c7-b0", "ovs_interfaceid": "50d486c7-b030-4d82-8b22-2f71cd277074", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.947 2 DEBUG nova.network.os_vif_util [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.948 2 DEBUG os_vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.950 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50d486c7-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:35:11 np0005476733 systemd[1]: libpod-conmon-80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22.scope: Deactivated successfully.
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.956 2 INFO os_vif [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:e2:37,bridge_name='br-int',has_traffic_filtering=True,id=50d486c7-b030-4d82-8b22-2f71cd277074,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap50d486c7-b0')#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.957 2 INFO nova.virt.libvirt.driver [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Deleting instance files /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762_del#033[00m
Oct  8 11:35:11 np0005476733 nova_compute[192580]: 2025-10-08 15:35:11.958 2 INFO nova.virt.libvirt.driver [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Deletion of /var/lib/nova/instances/066ef28b-88ac-4f5c-acae-3458c3e19762_del complete#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.002 2 INFO nova.compute.manager [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.003 2 DEBUG oslo.service.loopingcall [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.004 2 DEBUG nova.compute.manager [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.005 2 DEBUG nova.network.neutron [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:35:12 np0005476733 podman[235013]: 2025-10-08 15:35:12.013986534 +0000 UTC m=+0.042309031 container remove 80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.019 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbb56be-11d9-4616-a53e-812372f773af]: (4, ('Wed Oct  8 03:35:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61 (80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22)\n80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22\nWed Oct  8 03:35:11 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61 (80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22)\n80a664ad4137692204f83f1b22e3ca8882cee731c746eb9e941bebdb7de2ee22\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.021 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91c0733c-7492-45f1-a46d-2b366a0867ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.022 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf81b33e3-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:12 np0005476733 kernel: tapf81b33e3-d0: left promiscuous mode
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.085 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e9ea2d-606d-423a-b56f-30a20df1319c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.119 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cd67a7-27fb-484d-a48d-dd69b2204de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.122 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2ab300ee-7a64-476e-a1e4-5e07babb460a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4afc04-15d4-4fcb-9aae-832868a80ae7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 433466, 'reachable_time': 42186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235028, 'error': None, 'target': 'ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.148 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f81b33e3-d2f7-4437-b8c9-c9a54931fb61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.148 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[23a72ae5-d99b-4b54-9a37-e8de0c095885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.149 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 50d486c7-b030-4d82-8b22-2f71cd277074 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:35:12 np0005476733 systemd[1]: run-netns-ovnmeta\x2df81b33e3\x2dd2f7\x2d4437\x2db8c9\x2dc9a54931fb61.mount: Deactivated successfully.
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.151 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.152 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[082220d1-4797-4de7-a26e-660d5f3c3363]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.152 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:35:12 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [NOTICE]   (233370) : haproxy version is 2.8.14-c23fe91
Oct  8 11:35:12 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [NOTICE]   (233370) : path to executable is /usr/sbin/haproxy
Oct  8 11:35:12 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [WARNING]  (233370) : Exiting Master process...
Oct  8 11:35:12 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [ALERT]    (233370) : Current worker (233372) exited with code 143 (Terminated)
Oct  8 11:35:12 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[233366]: [WARNING]  (233370) : All workers exited. Exiting... (0)
Oct  8 11:35:12 np0005476733 systemd[1]: libpod-22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206.scope: Deactivated successfully.
Oct  8 11:35:12 np0005476733 podman[235046]: 2025-10-08 15:35:12.318442508 +0000 UTC m=+0.056694642 container died 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:35:12 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206-userdata-shm.mount: Deactivated successfully.
Oct  8 11:35:12 np0005476733 systemd[1]: var-lib-containers-storage-overlay-1d02dda42a5c32845ea92e289f2a16305f47b8302a7cbf41f14af9b770be60d2-merged.mount: Deactivated successfully.
Oct  8 11:35:12 np0005476733 podman[235046]: 2025-10-08 15:35:12.363075453 +0000 UTC m=+0.101327607 container cleanup 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:35:12 np0005476733 systemd[1]: libpod-conmon-22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206.scope: Deactivated successfully.
Oct  8 11:35:12 np0005476733 podman[235059]: 2025-10-08 15:35:12.423852226 +0000 UTC m=+0.086300084 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:35:12 np0005476733 podman[235085]: 2025-10-08 15:35:12.447918699 +0000 UTC m=+0.054675107 container remove 22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.454 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7545cf-38e6-4cbb-a3fc-2ce19269076c]: (4, ('Wed Oct  8 03:35:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206)\n22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206\nWed Oct  8 03:35:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206)\n22dd98b562e2bfbed38b87906bed3cc062810e00cfd6894e85eea3790d097206\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.457 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3abdfdd3-9260-4224-90c7-a70fc8273737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.458 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:12 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:35:12 np0005476733 nova_compute[192580]: 2025-10-08 15:35:12.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.493 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aee27375-e9d5-4ff5-b675-70083f9b780f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.519 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a0a81c-3fda-48da-a422-4c682bb00a6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.520 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3d5bf5-3cab-463b-91e6-6d07f33d31dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.536 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5c45b850-c0c7-4d77-a16d-bf39308ca384]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445791, 'reachable_time': 43702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235106, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.539 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:35:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:12.539 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[d637a62c-9f6e-44e4-8796-7357ea656c3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:35:12 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.517 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-unplugged-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.518 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.518 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.519 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.519 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-unplugged-8f7d5998-037f-4a70-98a0-8482a8043a7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.519 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-unplugged-8f7d5998-037f-4a70-98a0-8482a8043a7e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.519 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.520 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.520 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.520 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.520 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.521 2 WARNING nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received unexpected event network-vif-plugged-8f7d5998-037f-4a70-98a0-8482a8043a7e for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.521 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-unplugged-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.521 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.521 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.522 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.522 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-unplugged-50d486c7-b030-4d82-8b22-2f71cd277074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.522 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-unplugged-50d486c7-b030-4d82-8b22-2f71cd277074 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.523 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.523 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.523 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.523 2 DEBUG oslo_concurrency.lockutils [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.524 2 DEBUG nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] No waiting events found dispatching network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:35:13 np0005476733 nova_compute[192580]: 2025-10-08 15:35:13.524 2 WARNING nova.compute.manager [req-5e61cf66-0ef6-4293-af5c-37a501adfec7 req-6d6ee8e0-8040-4413-be24-f4e5c2e32804 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received unexpected event network-vif-plugged-50d486c7-b030-4d82-8b22-2f71cd277074 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.139 2 DEBUG nova.network.neutron [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.167 2 INFO nova.compute.manager [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Took 2.16 seconds to deallocate network for instance.#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.223 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.224 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.266 2 DEBUG nova.compute.manager [req-337635ed-743e-44b9-a612-53034f225a9e req-1b9aedba-95a6-4d84-90eb-8d1e5fede64e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Received event network-vif-deleted-8f7d5998-037f-4a70-98a0-8482a8043a7e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.320 2 DEBUG nova.compute.provider_tree [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.341 2 DEBUG nova.scheduler.client.report [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.369 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.398 2 INFO nova.scheduler.client.report [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance 066ef28b-88ac-4f5c-acae-3458c3e19762#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.473 2 DEBUG oslo_concurrency.lockutils [None req-39b48bb2-7776-4993-bfe7-a17ba31389f8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "066ef28b-88ac-4f5c-acae-3458c3e19762" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:14 np0005476733 nova_compute[192580]: 2025-10-08 15:35:14.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:15 np0005476733 nova_compute[192580]: 2025-10-08 15:35:15.125 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937700.1235466, d73d8a2e-011b-4f41-9734-d2bb2b068986 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:35:15 np0005476733 nova_compute[192580]: 2025-10-08 15:35:15.126 2 INFO nova.compute.manager [-] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:35:15 np0005476733 nova_compute[192580]: 2025-10-08 15:35:15.146 2 DEBUG nova.compute.manager [None req-e2adcf13-986c-4a11-a767-6b6fc2ce021c - - - - - -] [instance: d73d8a2e-011b-4f41-9734-d2bb2b068986] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:35:16 np0005476733 nova_compute[192580]: 2025-10-08 15:35:16.462 2 INFO nova.compute.manager [None req-8b5a50e6-e134-4bbf-9e02-4dafad60b9cc 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Get console output#033[00m
Oct  8 11:35:16 np0005476733 nova_compute[192580]: 2025-10-08 15:35:16.467 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:35:16 np0005476733 nova_compute[192580]: 2025-10-08 15:35:16.471 2 INFO nova.virt.libvirt.driver [None req-8b5a50e6-e134-4bbf-9e02-4dafad60b9cc 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Truncated console log returned, 2828 bytes ignored#033[00m
Oct  8 11:35:16 np0005476733 nova_compute[192580]: 2025-10-08 15:35:16.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:17 np0005476733 podman[235109]: 2025-10-08 15:35:17.263996286 +0000 UTC m=+0.082849473 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 11:35:17 np0005476733 podman[235108]: 2025-10-08 15:35:17.32042121 +0000 UTC m=+0.136099966 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:35:19 np0005476733 nova_compute[192580]: 2025-10-08 15:35:19.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:20Z|00467|pinctrl|WARN|Dropped 4063 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:35:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:20Z|00468|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:35:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:35:21Z|00469|binding|INFO|Releasing lport edc27bfb-7622-457f-b7d3-480bac0b8693 from this chassis (sb_readonly=0)
Oct  8 11:35:21 np0005476733 nova_compute[192580]: 2025-10-08 15:35:21.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:21 np0005476733 nova_compute[192580]: 2025-10-08 15:35:21.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:24 np0005476733 podman[235155]: 2025-10-08 15:35:24.242392234 +0000 UTC m=+0.073419019 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:35:24 np0005476733 podman[235156]: 2025-10-08 15:35:24.246320401 +0000 UTC m=+0.065677582 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:35:24 np0005476733 podman[235157]: 2025-10-08 15:35:24.268148562 +0000 UTC m=+0.087624767 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41)
Oct  8 11:35:24 np0005476733 nova_compute[192580]: 2025-10-08 15:35:24.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:26.319 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:26.320 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:26.321 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:26 np0005476733 nova_compute[192580]: 2025-10-08 15:35:26.916 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937711.9151661, 066ef28b-88ac-4f5c-acae-3458c3e19762 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:35:26 np0005476733 nova_compute[192580]: 2025-10-08 15:35:26.916 2 INFO nova.compute.manager [-] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:35:26 np0005476733 nova_compute[192580]: 2025-10-08 15:35:26.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:26 np0005476733 nova_compute[192580]: 2025-10-08 15:35:26.965 2 DEBUG nova.compute.manager [None req-c5615569-d77a-4746-882c-1e957d6972a7 - - - - - -] [instance: 066ef28b-88ac-4f5c-acae-3458c3e19762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:35:28 np0005476733 nova_compute[192580]: 2025-10-08 15:35:28.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:29 np0005476733 nova_compute[192580]: 2025-10-08 15:35:29.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:31 np0005476733 nova_compute[192580]: 2025-10-08 15:35:31.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:34 np0005476733 podman[235228]: 2025-10-08 15:35:34.236782368 +0000 UTC m=+0.059352348 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:35:34 np0005476733 podman[235227]: 2025-10-08 15:35:34.236895292 +0000 UTC m=+0.060022770 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 11:35:34 np0005476733 nova_compute[192580]: 2025-10-08 15:35:34.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:36 np0005476733 nova_compute[192580]: 2025-10-08 15:35:36.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:39 np0005476733 nova_compute[192580]: 2025-10-08 15:35:39.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:41 np0005476733 nova_compute[192580]: 2025-10-08 15:35:41.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:42.778 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:35:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:42.779 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:35:42 np0005476733 nova_compute[192580]: 2025-10-08 15:35:42.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:43 np0005476733 podman[235270]: 2025-10-08 15:35:43.265257509 +0000 UTC m=+0.092714271 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:35:44 np0005476733 nova_compute[192580]: 2025-10-08 15:35:44.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:46 np0005476733 nova_compute[192580]: 2025-10-08 15:35:46.634 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:46 np0005476733 nova_compute[192580]: 2025-10-08 15:35:46.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:47 np0005476733 systemd-logind[827]: New session 40 of user zuul.
Oct  8 11:35:47 np0005476733 systemd[1]: Started Session 40 of User zuul.
Oct  8 11:35:47 np0005476733 podman[235294]: 2025-10-08 15:35:47.958987533 +0000 UTC m=+0.067669663 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 11:35:48 np0005476733 podman[235292]: 2025-10-08 15:35:47.999578732 +0000 UTC m=+0.109940606 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:35:48 np0005476733 systemd[1]: session-40.scope: Deactivated successfully.
Oct  8 11:35:48 np0005476733 systemd-logind[827]: Session 40 logged out. Waiting for processes to exit.
Oct  8 11:35:48 np0005476733 systemd-logind[827]: Removed session 40.
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.729 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.802 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.803 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:48 np0005476733 nova_compute[192580]: 2025-10-08 15:35:48.873 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.033 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.034 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12958MB free_disk=111.19791793823242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.035 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.036 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.134 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8a310a2e-17af-42b8-a212-cf0a278e20c7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1024, 'DISK_GB': 10}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.134 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.135 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.163 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.191 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.191 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.218 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.240 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.305 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.325 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.353 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.354 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:49 np0005476733 nova_compute[192580]: 2025-10-08 15:35:49.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:50 np0005476733 nova_compute[192580]: 2025-10-08 15:35:50.355 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:50 np0005476733 nova_compute[192580]: 2025-10-08 15:35:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:51 np0005476733 nova_compute[192580]: 2025-10-08 15:35:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:51 np0005476733 nova_compute[192580]: 2025-10-08 15:35:51.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:35:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:35:51.781 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:35:51 np0005476733 nova_compute[192580]: 2025-10-08 15:35:51.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:54 np0005476733 nova_compute[192580]: 2025-10-08 15:35:54.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:55 np0005476733 nova_compute[192580]: 2025-10-08 15:35:55.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:55 np0005476733 podman[235372]: 2025-10-08 15:35:55.237866542 +0000 UTC m=+0.053185106 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:35:55 np0005476733 podman[235373]: 2025-10-08 15:35:55.263187199 +0000 UTC m=+0.078336257 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:35:55 np0005476733 podman[235374]: 2025-10-08 15:35:55.269846133 +0000 UTC m=+0.079420301 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 11:35:56 np0005476733 nova_compute[192580]: 2025-10-08 15:35:56.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:56 np0005476733 nova_compute[192580]: 2025-10-08 15:35:56.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.457 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.457 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.481 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.902 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.903 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.912 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:35:57 np0005476733 nova_compute[192580]: 2025-10-08 15:35:57.914 2 INFO nova.compute.claims [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.094 2 DEBUG nova.compute.provider_tree [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.114 2 DEBUG nova.scheduler.client.report [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.140 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.141 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.215 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.216 2 DEBUG nova.network.neutron [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.255 2 INFO nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.285 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.385 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.387 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.387 2 INFO nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Creating image(s)#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.388 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.388 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.389 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.403 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.468 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.470 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.470 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.482 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.536 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.538 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.573 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk 10737418240" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.575 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.576 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.636 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.638 2 DEBUG nova.objects.instance [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'migration_context' on Instance uuid d279aff7-6615-416d-81f2-f431378ecc56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.695 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.696 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Ensure instance console log exists: /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.697 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.697 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.698 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:35:58 np0005476733 nova_compute[192580]: 2025-10-08 15:35:58.847 2 DEBUG nova.policy [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.614 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.943 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.944 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.944 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:35:59 np0005476733 nova_compute[192580]: 2025-10-08 15:35:59.945 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8a310a2e-17af-42b8-a212-cf0a278e20c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:00 np0005476733 nova_compute[192580]: 2025-10-08 15:36:00.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.312 2 DEBUG nova.network.neutron [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Successfully updated port: 03a00809-da14-4bcc-8eee-92146b0d6536 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.345 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.345 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquired lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.346 2 DEBUG nova.network.neutron [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.547 2 DEBUG nova.compute.manager [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-changed-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.548 2 DEBUG nova.compute.manager [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Refreshing instance network info cache due to event network-changed-03a00809-da14-4bcc-8eee-92146b0d6536. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.548 2 DEBUG oslo_concurrency.lockutils [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.657 2 DEBUG nova.network.neutron [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:36:01 np0005476733 nova_compute[192580]: 2025-10-08 15:36:01.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.270 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updating instance_info_cache with network_info: [{"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.350 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-8a310a2e-17af-42b8-a212-cf0a278e20c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.351 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.681 2 DEBUG nova.network.neutron [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updating instance_info_cache with network_info: [{"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.724 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Releasing lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.725 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Instance network_info: |[{"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.725 2 DEBUG oslo_concurrency.lockutils [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.725 2 DEBUG nova.network.neutron [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Refreshing network info cache for port 03a00809-da14-4bcc-8eee-92146b0d6536 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.729 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Start _get_guest_xml network_info=[{"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.733 2 WARNING nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.737 2 DEBUG nova.virt.libvirt.host [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.738 2 DEBUG nova.virt.libvirt.host [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.741 2 DEBUG nova.virt.libvirt.host [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.742 2 DEBUG nova.virt.libvirt.host [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.743 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.743 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.743 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.744 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.744 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.744 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.744 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.744 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.745 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.745 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.745 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.745 2 DEBUG nova.virt.hardware [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.748 2 DEBUG nova.virt.libvirt.vif [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm_proxy',display_name='vm_proxy',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm-proxy',id=60,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-we1by55q',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:35:58Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=d279aff7-6615-416d-81f2-f431378ecc56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.749 2 DEBUG nova.network.os_vif_util [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.750 2 DEBUG nova.network.os_vif_util [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.751 2 DEBUG nova.objects.instance [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'pci_devices' on Instance uuid d279aff7-6615-416d-81f2-f431378ecc56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.767 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <uuid>d279aff7-6615-416d-81f2-f431378ecc56</uuid>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <name>instance-0000003c</name>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:name>vm_proxy</nova:name>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:36:02</nova:creationTime>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:user uuid="7dd1826c89b24382854eb7979b65ba87">tempest-VrrpTest-336353520-project-member</nova:user>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:project uuid="8c96d22c99734f059343a5340cc6f287">tempest-VrrpTest-336353520</nova:project>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        <nova:port uuid="03a00809-da14-4bcc-8eee-92146b0d6536">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.100.161" ipVersion="4"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="serial">d279aff7-6615-416d-81f2-f431378ecc56</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="uuid">d279aff7-6615-416d-81f2-f431378ecc56</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.config"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:59:95:d2"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <target dev="tap03a00809-da"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/console.log" append="off"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:36:02 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:36:02 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:36:02 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:36:02 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.768 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Preparing to wait for external event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.769 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.769 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.770 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.771 2 DEBUG nova.virt.libvirt.vif [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='vm_proxy',display_name='vm_proxy',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm-proxy',id=60,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-we1by55q',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',imag
e_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:35:58Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=d279aff7-6615-416d-81f2-f431378ecc56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.771 2 DEBUG nova.network.os_vif_util [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.772 2 DEBUG nova.network.os_vif_util [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.772 2 DEBUG os_vif [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.773 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.774 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.777 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03a00809-da, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03a00809-da, col_values=(('external_ids', {'iface-id': '03a00809-da14-4bcc-8eee-92146b0d6536', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:95:d2', 'vm-uuid': 'd279aff7-6615-416d-81f2-f431378ecc56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:02 np0005476733 NetworkManager[51699]: <info>  [1759937762.7813] manager: (tap03a00809-da): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.790 2 INFO os_vif [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da')#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.851 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.851 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.851 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] No VIF found with MAC fa:16:3e:59:95:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:36:02 np0005476733 nova_compute[192580]: 2025-10-08 15:36:02.852 2 INFO nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Using config drive#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.413 2 INFO nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Creating config drive at /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.config#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.420 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfngpcl62 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.552 2 DEBUG oslo_concurrency.processutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfngpcl62" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:03 np0005476733 kernel: tap03a00809-da: entered promiscuous mode
Oct  8 11:36:03 np0005476733 NetworkManager[51699]: <info>  [1759937763.6215] manager: (tap03a00809-da): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Oct  8 11:36:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:03Z|00470|binding|INFO|Claiming lport 03a00809-da14-4bcc-8eee-92146b0d6536 for this chassis.
Oct  8 11:36:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:03Z|00471|binding|INFO|03a00809-da14-4bcc-8eee-92146b0d6536: Claiming fa:16:3e:59:95:d2 192.168.100.161
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:03Z|00472|binding|INFO|Setting lport 03a00809-da14-4bcc-8eee-92146b0d6536 ovn-installed in OVS
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:03 np0005476733 systemd-udevd[235465]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:36:03 np0005476733 NetworkManager[51699]: <info>  [1759937763.6668] device (tap03a00809-da): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:36:03 np0005476733 NetworkManager[51699]: <info>  [1759937763.6677] device (tap03a00809-da): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:36:03 np0005476733 systemd-machined[152624]: New machine qemu-34-instance-0000003c.
Oct  8 11:36:03 np0005476733 systemd[1]: Started Virtual Machine qemu-34-instance-0000003c.
Oct  8 11:36:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:03Z|00473|binding|INFO|Setting lport 03a00809-da14-4bcc-8eee-92146b0d6536 up in Southbound
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.752 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:95:d2 192.168.100.161'], port_security=['fa:16:3e:59:95:d2 192.168.100.161'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.100.161/24', 'neutron:device_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c96d22c99734f059343a5340cc6f287', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2d0efdcb-fc9f-4ff6-ac01-106f25450adb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d1675938-de61-482c-b526-990e293aed89, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=03a00809-da14-4bcc-8eee-92146b0d6536) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.753 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 03a00809-da14-4bcc-8eee-92146b0d6536 in datapath 3556a570-1234-4dd3-a7d3-e2cf3097a776 bound to our chassis#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.755 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3556a570-1234-4dd3-a7d3-e2cf3097a776#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.773 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[39b33700-182a-40ad-8101-36f409aaedd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.810 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2593cef0-4e9a-4895-b570-50eaf1b664c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.816 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a51b933d-d243-44c5-9653-498f1ea2f6f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.848 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfcd395-46fa-486e-9479-a9e7a9068b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.868 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db4b5f01-469d-4c6e-8af1-53fef86b5f54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3556a570-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:26:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457571, 'reachable_time': 27678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235481, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.891 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb2901d-9e02-4cf4-a944-80a10062608f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3556a570-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457581, 'tstamp': 457581}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235483, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.100.2'], ['IFA_LOCAL', '192.168.100.2'], ['IFA_BROADCAST', '192.168.100.255'], ['IFA_LABEL', 'tap3556a570-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457584, 'tstamp': 457584}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235483, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.893 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3556a570-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:03 np0005476733 nova_compute[192580]: 2025-10-08 15:36:03.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.897 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3556a570-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.897 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.898 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3556a570-10, col_values=(('external_ids', {'iface-id': 'edc27bfb-7622-457f-b7d3-480bac0b8693'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:03.898 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.381 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937764.3803768, d279aff7-6615-416d-81f2-f431378ecc56 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.382 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] VM Started (Lifecycle Event)#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.532 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.536 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937764.3808055, d279aff7-6615-416d-81f2-f431378ecc56 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.536 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.679 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:36:04 np0005476733 nova_compute[192580]: 2025-10-08 15:36:04.683 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.056 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:05 np0005476733 podman[235491]: 2025-10-08 15:36:05.252239662 +0000 UTC m=+0.067615972 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:36:05 np0005476733 podman[235490]: 2025-10-08 15:36:05.279677756 +0000 UTC m=+0.093083063 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.768 2 DEBUG nova.compute.manager [req-272f5661-55c5-4f52-9467-7c89d2fb7072 req-fb7848a3-d5c6-4699-9772-f2eaa5b9a963 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.769 2 DEBUG oslo_concurrency.lockutils [req-272f5661-55c5-4f52-9467-7c89d2fb7072 req-fb7848a3-d5c6-4699-9772-f2eaa5b9a963 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.769 2 DEBUG oslo_concurrency.lockutils [req-272f5661-55c5-4f52-9467-7c89d2fb7072 req-fb7848a3-d5c6-4699-9772-f2eaa5b9a963 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.770 2 DEBUG oslo_concurrency.lockutils [req-272f5661-55c5-4f52-9467-7c89d2fb7072 req-fb7848a3-d5c6-4699-9772-f2eaa5b9a963 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.770 2 DEBUG nova.compute.manager [req-272f5661-55c5-4f52-9467-7c89d2fb7072 req-fb7848a3-d5c6-4699-9772-f2eaa5b9a963 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Processing event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.771 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.777 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937765.7768893, d279aff7-6615-416d-81f2-f431378ecc56 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.777 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.780 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.784 2 INFO nova.virt.libvirt.driver [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Instance spawned successfully.#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.785 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.842 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.849 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.853 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.854 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.855 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.855 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.856 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:05 np0005476733 nova_compute[192580]: 2025-10-08 15:36:05.857 2 DEBUG nova.virt.libvirt.driver [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.079 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.339 2 INFO nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Took 7.95 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.339 2 DEBUG nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.535 2 INFO nova.compute.manager [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Took 8.67 seconds to build instance.#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.659 2 DEBUG nova.network.neutron [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updated VIF entry in instance network info cache for port 03a00809-da14-4bcc-8eee-92146b0d6536. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.660 2 DEBUG nova.network.neutron [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updating instance_info_cache with network_info: [{"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.776 2 DEBUG oslo_concurrency.lockutils [None req-0e3458d5-cc70-4efc-bea3-0cdc7af49eeb 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:06 np0005476733 nova_compute[192580]: 2025-10-08 15:36:06.866 2 DEBUG oslo_concurrency.lockutils [req-26316999-4ec5-4660-8ac4-885c92176268 req-76752248-bfb1-497a-8512-b142e1fff654 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:36:07 np0005476733 nova_compute[192580]: 2025-10-08 15:36:07.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.805 2 DEBUG nova.compute.manager [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.806 2 DEBUG oslo_concurrency.lockutils [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.806 2 DEBUG oslo_concurrency.lockutils [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.806 2 DEBUG oslo_concurrency.lockutils [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.807 2 DEBUG nova.compute.manager [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] No waiting events found dispatching network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:36:08 np0005476733 nova_compute[192580]: 2025-10-08 15:36:08.807 2 WARNING nova.compute.manager [req-33bc093d-6092-4ede-9297-e4deaa9f4c82 req-bd7ab0b7-8e7d-4867-aa00-f3ca74032288 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received unexpected event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:36:10 np0005476733 nova_compute[192580]: 2025-10-08 15:36:10.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:10 np0005476733 nova_compute[192580]: 2025-10-08 15:36:10.825 2 INFO nova.compute.manager [None req-3ebf38c3-d88b-4967-ab22-45ed0fb5bda5 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:10 np0005476733 nova_compute[192580]: 2025-10-08 15:36:10.831 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:36:12 np0005476733 nova_compute[192580]: 2025-10-08 15:36:12.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:14 np0005476733 podman[235535]: 2025-10-08 15:36:14.236500038 +0000 UTC m=+0.054132107 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:36:15 np0005476733 nova_compute[192580]: 2025-10-08 15:36:15.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:15 np0005476733 nova_compute[192580]: 2025-10-08 15:36:15.987 2 INFO nova.compute.manager [None req-4cfc94d0-d87e-4572-9178-fa1ca674c8ec 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:17 np0005476733 nova_compute[192580]: 2025-10-08 15:36:17.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:18 np0005476733 nova_compute[192580]: 2025-10-08 15:36:18.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:18 np0005476733 podman[235556]: 2025-10-08 15:36:18.271899591 +0000 UTC m=+0.080137915 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 11:36:18 np0005476733 podman[235555]: 2025-10-08 15:36:18.284841838 +0000 UTC m=+0.100592285 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:36:20 np0005476733 nova_compute[192580]: 2025-10-08 15:36:20.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:20Z|00474|pinctrl|WARN|Dropped 2301 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:36:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:20Z|00475|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:36:20 np0005476733 nova_compute[192580]: 2025-10-08 15:36:20.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:21 np0005476733 nova_compute[192580]: 2025-10-08 15:36:21.171 2 INFO nova.compute.manager [None req-a6b81634-bbf9-4541-841e-3761aa3080b7 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:21 np0005476733 nova_compute[192580]: 2025-10-08 15:36:21.179 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:36:22 np0005476733 nova_compute[192580]: 2025-10-08 15:36:22.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:25 np0005476733 nova_compute[192580]: 2025-10-08 15:36:25.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:26 np0005476733 podman[235606]: 2025-10-08 15:36:26.240202253 +0000 UTC m=+0.059267862 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:36:26 np0005476733 podman[235605]: 2025-10-08 15:36:26.244289054 +0000 UTC m=+0.069828643 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Oct  8 11:36:26 np0005476733 podman[235607]: 2025-10-08 15:36:26.265354584 +0000 UTC m=+0.080809327 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct  8 11:36:26 np0005476733 nova_compute[192580]: 2025-10-08 15:36:26.315 2 INFO nova.compute.manager [None req-b6b6fd26-c19f-440b-9575-e17f3d10e840 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:26 np0005476733 nova_compute[192580]: 2025-10-08 15:36:26.322 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:26.322 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:26.325 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:26.326 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:27 np0005476733 nova_compute[192580]: 2025-10-08 15:36:27.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:27Z|00476|binding|INFO|Releasing lport edc27bfb-7622-457f-b7d3-480bac0b8693 from this chassis (sb_readonly=0)
Oct  8 11:36:27 np0005476733 nova_compute[192580]: 2025-10-08 15:36:27.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:30 np0005476733 nova_compute[192580]: 2025-10-08 15:36:30.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:30Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:59:95:d2 192.168.100.161
Oct  8 11:36:30 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:30Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:59:95:d2 192.168.100.161
Oct  8 11:36:31 np0005476733 nova_compute[192580]: 2025-10-08 15:36:31.467 2 INFO nova.compute.manager [None req-cac86e66-deae-41fa-86aa-3ab40f8503d8 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:31 np0005476733 nova_compute[192580]: 2025-10-08 15:36:31.472 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:36:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:31.940 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:36:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:31.940 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:36:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:31.942 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:31 np0005476733 nova_compute[192580]: 2025-10-08 15:36:31.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:32 np0005476733 nova_compute[192580]: 2025-10-08 15:36:32.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:34Z|00477|binding|INFO|Releasing lport edc27bfb-7622-457f-b7d3-480bac0b8693 from this chassis (sb_readonly=0)
Oct  8 11:36:34 np0005476733 nova_compute[192580]: 2025-10-08 15:36:34.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:35 np0005476733 nova_compute[192580]: 2025-10-08 15:36:35.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.010 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'name': 'vm_proxy', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8c96d22c99734f059343a5340cc6f287', 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'hostId': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.013 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'name': 'vm2', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000039', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8c96d22c99734f059343a5340cc6f287', 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'hostId': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.018 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d279aff7-6615-416d-81f2-f431378ecc56 / tap03a00809-da inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.018 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.incoming.packets volume: 46 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.023 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets volume: 220 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce7ca3fa-afd6-493f-9f09-cc3ee26e418f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 46, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.014653', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '9317010c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '4ebc5dce99049e3bc79048aa5e7737091e0d27b0cd815e07332964d801b428f8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 220, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.014653', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '9317beee-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '2629ca1068817f800f1b4c5ab61e78c9aba373e8a114bb7458ffb25a03ebbe92'}]}, 'timestamp': '2025-10-08 15:36:36.024184', '_unique_id': 'dd88eef5540041a784baa6232ee6055e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.026 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.054 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.requests volume: 10401 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.055 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.079 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.requests volume: 11823 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.079 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '725eb5b3-56bb-40b8-aa49-738a0842289c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 10401, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.027918', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '931c786c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '35caf4d1571acc74a4d621d8e86a25c7cf7cd2af7fd5a4a864dfe80c1c556240'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 
'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.027918', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '931c8852-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'c58add250f273baf797ff9507b8f1e2bcb801a4668ac57b83c4b21ebe6937e0f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11823, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.027918', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '93203d26-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '65331e6aaecd582d606c76f1a0aaefb0209cba64c3968322e36a0c6839b82528'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.027918', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '93204898-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'a9f8dac25c3685b2a12d1b3fa8b56654e032631f86f82f8910198d204443ed23'}]}, 'timestamp': '2025-10-08 15:36:36.079983', '_unique_id': '6ac35eb4a4844d7ea00f9cd02011fb9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.081 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.100 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/memory.usage volume: 173.64453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.120 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/memory.usage volume: 243.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c2cd168-e1b1-454d-b0b0-db06d1032206', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 173.64453125, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'timestamp': '2025-10-08T15:36:36.081890', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '9323888c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.823616269, 'message_signature': '4768a20fab2006a4609cd58b3d6091df36937dd49848708145a1ab9bec07e563'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 243.43359375, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 
'8a310a2e-17af-42b8-a212-cf0a278e20c7', 'timestamp': '2025-10-08T15:36:36.081890', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '932699fa-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.843597753, 'message_signature': '6e1f63d8e87a02cae0619b96378767998b41aa53d4e3b8cc7aebacb24f597d67'}]}, 'timestamp': '2025-10-08 15:36:36.121467', '_unique_id': '35644d8183ea46c9a3fd6c9b4542364a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.135 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.allocation volume: 19079168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.136 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.154 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.154 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af428892-94c1-4439-ac4c-c9e97533a02f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 19079168, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.123763', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9328e9e4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': '6a310839310030dc22d95f01b42d3ea9037b8f1a4727eff55f5bcf984f131d20'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 
'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.123763', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9328f858-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': '13dee9b43719654b128b0e7c43afc9f7d9c9d266c5228b59f94bfbd46ad76643'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.123763', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932ba972-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': '418e3db9f21493799cda87061a0bc1316ca06db227efb899aa1dc4ca47e65bce'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.123763', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932bb3e0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': '33edd67d6a57e7141e85a0f58c1ede885b54c77645e11ab4512e053f2479a76a'}]}, 'timestamp': '2025-10-08 15:36:36.154781', '_unique_id': '6dfaf672a24d49a99fa4f44cf9d47d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.156 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.156 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.bytes.delta volume: 36361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14753de9-e935-43de-b969-e8f1be97294e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.156494', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932c0084-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': 'f81bde3e577b64f195d6f7ce37567567f8274db312dbd899154bb0241be854af'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 36361, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.156494', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932c0930-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '28c56fc8ed44ab47b2d693c83dbe18cfdbcd0daa1132793c3fb64157bb990e56'}]}, 'timestamp': '2025-10-08 15:36:36.157009', '_unique_id': 'd87e50a720374007b24886fbb32da82f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.158 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.incoming.bytes volume: 3548 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.158 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.bytes volume: 30217 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94ef15a0-606b-4724-b5a2-de0d2ba09619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3548, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.158288', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932c463e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': 'b327ed6a455a830614c18ed07363a3130273bfd20dbe294bc5988252978191eb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30217, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.158288', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932c4f80-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '73ee205d83035a383ac952009b948eb2510d7d0d79d9f7112b1b4c1c9d17d8d2'}]}, 'timestamp': '2025-10-08 15:36:36.158759', '_unique_id': 'ac2f9c5e3b694b65a325a72eb8c660b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.160 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.outgoing.bytes volume: 1848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.160 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.bytes volume: 36361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '118925f7-0986-4651-9f1b-882a93a4cc35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1848, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.160110', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932c8e1e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '6ad9d3d1f5518a8c0aa395fc96274affd5fc5f528c9260a9ff91df23020a0664'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 36361, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.160110', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932c98aa-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': 'f77d9c91c0e291e1a8d65214c0d5e320515d3aac0a51d06f0bfac308e887677b'}]}, 'timestamp': '2025-10-08 15:36:36.160656', '_unique_id': '0161798e01934c59bbb3990799eff034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: vm_proxy>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: vm_proxy>]
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.requests volume: 119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.162 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.163 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.requests volume: 833 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.163 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8507123b-bded-465d-abbd-72388ac65adf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 119, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.162637', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932cf11a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'ad2455fb4ea62b78fc8cf778dc8def41331136d8c5a7148a43d846d9418e76c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 
'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.162637', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932cfb06-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '232d00a3782f241fb903ab3fd3685c522cca1fc21a3674702e3bd255be1e7879'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 833, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.162637', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932d0b5a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '466cab993ef4b328e7c46a2ad5cfe80e25ede999bcc5fe8e1c98f65d1356de61'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.162637', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932d146a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'e772fa1271a53a9b6875e81b310d03905474ad93100d6c52afef05ac5a917e34'}]}, 'timestamp': '2025-10-08 15:36:36.163816', '_unique_id': '9c399146b04e49f3ad51d85726989ba5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.165 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.outgoing.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.165 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets volume: 198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ddba283-6025-4546-937a-ec32cd1d700a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.165385', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932d5b3c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '82059d47ab65bdedfa518e41a9e74dbb8e0f998e81917bbc22a137d5e348ffac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 198, 'user_id': 
'7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.165385', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932d6604-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': 'c784efa29e71501b776aca1b95fb359a249c964e8c4df9fa4b045ad8de646458'}]}, 'timestamp': '2025-10-08 15:36:36.165905', '_unique_id': 'ba88fa7eb04a4a539c0b7d7d5b62ad0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.167 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/cpu volume: 28880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.167 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/cpu volume: 43040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3aa0819-4064-4f6b-a0bb-c458622b1f97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28880000000, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'timestamp': '2025-10-08T15:36:36.167179', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '932da164-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.823616269, 'message_signature': 'a06c07073eb98b27903797db8c3dae107d88b224eecda4389c0c1680b8281b11'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43040000000, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 
'8a310a2e-17af-42b8-a212-cf0a278e20c7', 'timestamp': '2025-10-08T15:36:36.167179', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '932da98e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.843597753, 'message_signature': '905b9bbd35a89ffeb06129e6852527df7b296021c9ea2d82cb7811141c40a17c'}]}, 'timestamp': '2025-10-08 15:36:36.167609', '_unique_id': '4b5681e229e846a2af613a012e4bc8c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.168 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bf6c9fe-b4d2-4298-ac82-b70c224ae14a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.168779', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932ddfb2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': 'f8f22bc0697f08b08738b75af8f844f74a80bc4ad9a8c208b10ee3af64fd48e9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.168779', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932deafc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '76ddc955998baf850f5e05ea33424529a5455bf7382b24e96fa6ec95fde3df09'}]}, 'timestamp': '2025-10-08 15:36:36.169322', '_unique_id': 'c910544506da4d1ba4afbf1850a36a87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.170 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.170 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f33a4477-5047-497f-a835-b5a927291c46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.170579', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932e265c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': 'c3f7a42c03b7d0728f35b5b6d2988da731d10ed001f4e6f9e9bfc88bb3842c2c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.170579', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932e302a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '486f246a182a5daf7f321f4ac7f99137ae376e0a78a46b5817e01278240a89c5'}]}, 'timestamp': '2025-10-08 15:36:36.171122', '_unique_id': '80784bde10e74aff97e17973b3b0b8cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.172 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.bytes volume: 17127936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.172 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.172 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.bytes volume: 137858048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.172 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439b8f16-f76b-448a-b7b7-d4e369f529e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 17127936, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.172303', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932e6a7c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'f5a85edd0ddca0b296dfe4a3ded701d39100c2716527c201b5693cf0176bee2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 
'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.172303', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932e72b0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'a7595fff7d89e1cf4a2a22eea7f14b9656528d8c4d06c90b2de69eda898bfb74'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 137858048, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.172303', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 
'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932e7a30-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '000816ab2b411d0c84534a7caedce23f3d08b5678dbcb9e2cbd3672a46bfe087'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.172303', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932e82aa-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '331bc86850562dae886f96f36c6c658a2a0f6a987a7c65ff07f35446324942ce'}]}, 'timestamp': '2025-10-08 15:36:36.173180', '_unique_id': 'd0ca214903aa4ccbbe70871d0f64f53d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.174 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.174 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b838b4a0-13e1-463d-b525-1a5f742abe35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.174404', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932ebb6c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '8e918c0764c3d14fb9728ce626967f77e75b5a20377fb79f04421e164f375fb2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.174404', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932ec49a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '84b81b5b3c4af5e59adac2db0fd462b803a78d939fcfe2f732e5d99cc25dfc2a'}]}, 'timestamp': '2025-10-08 15:36:36.174865', '_unique_id': '34422cf80cf34ecd86f3434b291e2f69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.176 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.176 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: vm_proxy>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: vm_proxy>]
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.176 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.176 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.incoming.bytes.delta volume: 30107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8692f228-9799-4c98-92ce-315ee99bea3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.176333', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932f06e4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '4165a70f2122305881ea4fd4f58990e9657501efe3a1b4210f4ccb61e3aaff32'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 30107, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.176333', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932f0f4a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': '81515a30fbec92dce2e8c822a1c8c6090e0d9fbc95fa298143259308ec18fbab'}]}, 'timestamp': '2025-10-08 15:36:36.176774', '_unique_id': 'fed3e83044234ef3970d5ecc6ee820b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.177 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.178 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.178 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.178 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9035032-4590-4f8b-a9ef-177d665c3f0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.177896', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932f437a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': '27e30e5323073d935ee5db3d79e2683c6bdd6f9c37bd40492458e5a4fb5a5c92'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.177896', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932f4be0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': '3079a8b7c0dd8f57f7e9ce22ccfe946f9103da75d06eedd1320afb784a84f5d8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.177896', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932f5446-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': '919fa1510aa7eb60ed129a67001705633848a4bdeaff20b79776156a0fdcf755'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.177896', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932f5c34-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': '0c2e79925173c4287e382893871a8b8bc3ae7fba36df89a97b5946305e0c8876'}]}, 'timestamp': '2025-10-08 15:36:36.178764', '_unique_id': 'e829248c916741ec956bf33042fcfeda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.180 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.180 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3248b6e7-a865-42d9-8ff3-f219aea59b93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-0000003c-d279aff7-6615-416d-81f2-f431378ecc56-tap03a00809-da', 'timestamp': '2025-10-08T15:36:36.180124', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'tap03a00809-da', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:59:95:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03a00809-da'}, 'message_id': '932f9b9a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.737644525, 'message_signature': '9b149b89b9ea36ec9b6d199cf350ecd20f8684b34bb441a1c24220fd74829114'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'instance-00000039-8a310a2e-17af-42b8-a212-cf0a278e20c7-tap832212ef-77', 'timestamp': '2025-10-08T15:36:36.180124', 'resource_metadata': {'display_name': 'vm2', 'name': 'tap832212ef-77', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:26:8e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap832212ef-77'}, 'message_id': '932fa478-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.742308706, 'message_signature': 'e116fce5f6fc50a839494ed26c71882ab7656188f544937a795c75485a035365'}]}, 'timestamp': '2025-10-08 15:36:36.180591', '_unique_id': '7fc3e5194c224a4baafd756ff2eedf10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.bytes volume: 245195776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.182 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.bytes volume: 331429376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.182 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6af1618c-77e0-48d4-a8f9-96d7501e73ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 245195776, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.181736', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932fda06-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'ed4108ec574a20aab4c958840ac7311311e2ee09407dc2cf57985e2ce027240e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.181736', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932fe258-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '27cbcc893351af7ffe9aca74a597c502506d3dd64217ff2b7bd1d4598d6d198b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331429376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.181736', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '932fec12-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '44e5d31d2be39caaf4939b1b48128b3ce11b717991f4b7b26b9faadc41e7d701'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.181736', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '932ff374-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'cfd723f5c4e2ea70967985279cd93e0a08934446a398d5b3aea4ef876857f7be'}]}, 'timestamp': '2025-10-08 15:36:36.182603', '_unique_id': '334579afbf724d1283b10d85e7daf7a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.latency volume: 604670979 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.183 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.184 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.latency volume: 3784201114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.184 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd09ab175-7b3e-466a-be2c-300ff056d104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 604670979, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.183721', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '93302718-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '5c9f95638ef908ee1f60be054a651df0745e91fdf3747c14b6fea505247cd2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 
'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.183721', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '93302eb6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '111bc65aed4bd39c7c5cb33dc9470150521035105bdf2ea4446b8c77d9dceeb9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3784201114, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.183721', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '933036fe-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'e282f2d0d5835724ccf93f1ae5031ed997283d7251ef4cd717562a6a00127a14'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.183721', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '93303f5a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': '5d25dbed9006f38d9a5b6054b9460386c7ffa30b06b7f44a77f305ad83ef8daf'}]}, 'timestamp': '2025-10-08 15:36:36.184572', '_unique_id': '9a67ce775b794be8a4e7e45e4408ceb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.latency volume: 6039663956 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.185 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.read.latency volume: 44161959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.186 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.latency volume: 7791519588 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.186 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.read.latency volume: 50314141 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce3b50d1-a5f5-46fd-9cd4-1dcdb287423c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6039663956, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.185768', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9330775e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': 'ea5f93ee72eb5c5f2e41d155c93cd82ab1e4ffb380050717450829c787f111fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44161959, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 
'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.185768', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9330800a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.750913644, 'message_signature': '5cea7dac1442741402cb9ed5e02819cacf97accf2b272f698e3d7462691103cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7791519588, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.185768', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '933087b2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'd8741cc756eed842aac9c9ddc1e1d57de4d9b4eaa282e8883ccfb095df8ed9fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50314141, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.185768', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '93308f0a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.778364269, 'message_signature': 'aafb9af9ea4ab3a5e189c8c46e3cda3594ecedc1020ea1b7769089bbc2fb649c'}]}, 'timestamp': '2025-10-08 15:36:36.186594', '_unique_id': 'a38e5f7ea36f4756ab2c677463916a66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.usage volume: 18219008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.187 12 DEBUG ceilometer.compute.pollsters [-] d279aff7-6615-416d-81f2-f431378ecc56/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.188 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.usage volume: 152829952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.188 12 DEBUG ceilometer.compute.pollsters [-] 8a310a2e-17af-42b8-a212-cf0a278e20c7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b88c1a7d-898b-4adb-8d00-bce2b8264c46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 18219008, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 'd279aff7-6615-416d-81f2-f431378ecc56-vda', 'timestamp': '2025-10-08T15:36:36.187713', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9330c31c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': '253a45c2d5cb1a2899a8727077f75d3a57f3d9b99a8f2559e973e0dba6181eee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': 
'd279aff7-6615-416d-81f2-f431378ecc56-sda', 'timestamp': '2025-10-08T15:36:36.187713', 'resource_metadata': {'display_name': 'vm_proxy', 'name': 'instance-0000003c', 'instance_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9330cb50-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.846762124, 'message_signature': 'f198fb418773d96752ac801984f205cb22497517f202234e1aee7a83839931ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152829952, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-vda', 'timestamp': '2025-10-08T15:36:36.187713', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': 
'11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '9330d3b6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': 'fe7e3133e4093c38393bdb220182c1beac3933d8fca51123b6598f51e88bea2e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7dd1826c89b24382854eb7979b65ba87', 'user_name': None, 'project_id': '8c96d22c99734f059343a5340cc6f287', 'project_name': None, 'resource_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7-sda', 'timestamp': '2025-10-08T15:36:36.187713', 'resource_metadata': {'display_name': 'vm2', 'name': 'instance-00000039', 'instance_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'instance_type': 'custom_neutron_guest', 'host': 'f14bad827a4254da354d95a7fac0618757ef19a9eced122924bb51e0', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '9330dbb8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4699.859915769, 'message_signature': 'ba1fe4e2e42ac72f72c04c8a5d321e737c867d2334e396b2a793836fd62b4332'}]}, 'timestamp': '2025-10-08 15:36:36.188571', '_unique_id': 'f6679484888246059ef9dad7e1b42869'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.189 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: vm_proxy>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: vm_proxy>]
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.190 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:36:36.190 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: vm_proxy>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: vm_proxy>]
Oct  8 11:36:36 np0005476733 podman[235664]: 2025-10-08 15:36:36.263816209 +0000 UTC m=+0.073502711 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid)
Oct  8 11:36:36 np0005476733 podman[235665]: 2025-10-08 15:36:36.284807806 +0000 UTC m=+0.092487394 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:36:36 np0005476733 nova_compute[192580]: 2025-10-08 15:36:36.717 2 INFO nova.compute.manager [None req-adf40c1e-247f-4af9-9cf2-e8b9037f3ded 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output
Oct  8 11:36:36 np0005476733 nova_compute[192580]: 2025-10-08 15:36:36.729 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:36:36 np0005476733 nova_compute[192580]: 2025-10-08 15:36:36.733 2 INFO nova.virt.libvirt.driver [None req-adf40c1e-247f-4af9-9cf2-e8b9037f3ded 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Truncated console log returned, 2709 bytes ignored
Oct  8 11:36:37 np0005476733 nova_compute[192580]: 2025-10-08 15:36:37.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:40 np0005476733 nova_compute[192580]: 2025-10-08 15:36:40.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:42 np0005476733 nova_compute[192580]: 2025-10-08 15:36:42.022 2 INFO nova.compute.manager [None req-460f07f6-1182-4098-93fe-2052ba993690 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Get console output#033[00m
Oct  8 11:36:42 np0005476733 nova_compute[192580]: 2025-10-08 15:36:42.031 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:36:42 np0005476733 nova_compute[192580]: 2025-10-08 15:36:42.035 2 INFO nova.virt.libvirt.driver [None req-460f07f6-1182-4098-93fe-2052ba993690 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Truncated console log returned, 2880 bytes ignored#033[00m
Oct  8 11:36:42 np0005476733 nova_compute[192580]: 2025-10-08 15:36:42.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:42 np0005476733 nova_compute[192580]: 2025-10-08 15:36:42.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:44 np0005476733 nova_compute[192580]: 2025-10-08 15:36:44.138 2 DEBUG nova.compute.manager [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-changed-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:44 np0005476733 nova_compute[192580]: 2025-10-08 15:36:44.139 2 DEBUG nova.compute.manager [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Refreshing instance network info cache due to event network-changed-03a00809-da14-4bcc-8eee-92146b0d6536. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:36:44 np0005476733 nova_compute[192580]: 2025-10-08 15:36:44.140 2 DEBUG oslo_concurrency.lockutils [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:36:44 np0005476733 nova_compute[192580]: 2025-10-08 15:36:44.140 2 DEBUG oslo_concurrency.lockutils [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:36:44 np0005476733 nova_compute[192580]: 2025-10-08 15:36:44.141 2 DEBUG nova.network.neutron [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Refreshing network info cache for port 03a00809-da14-4bcc-8eee-92146b0d6536 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:36:45 np0005476733 nova_compute[192580]: 2025-10-08 15:36:45.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:45 np0005476733 podman[235735]: 2025-10-08 15:36:45.252596142 +0000 UTC m=+0.075786995 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:36:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:45Z|00478|pinctrl|INFO|Claiming virtual lport 0fd11583-449f-4f64-8d12-db8aa187dfb4 for this chassis with the virtual parent 832212ef-772b-4b36-b486-7b4131fc3ab5
Oct  8 11:36:45 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:45Z|00479|binding|INFO|Setting lport 0fd11583-449f-4f64-8d12-db8aa187dfb4 up in Southbound
Oct  8 11:36:46 np0005476733 nova_compute[192580]: 2025-10-08 15:36:46.262 2 DEBUG nova.network.neutron [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updated VIF entry in instance network info cache for port 03a00809-da14-4bcc-8eee-92146b0d6536. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:36:46 np0005476733 nova_compute[192580]: 2025-10-08 15:36:46.263 2 DEBUG nova.network.neutron [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updating instance_info_cache with network_info: [{"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:46 np0005476733 nova_compute[192580]: 2025-10-08 15:36:46.284 2 DEBUG oslo_concurrency.lockutils [req-f9023ba3-1782-4f79-980f-04f39404cf1a req-f2c96fbc-f364-49da-bae6-5396ab6c94a6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d279aff7-6615-416d-81f2-f431378ecc56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:36:47 np0005476733 nova_compute[192580]: 2025-10-08 15:36:47.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:47 np0005476733 nova_compute[192580]: 2025-10-08 15:36:47.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.190 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.191 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.212 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.333 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.334 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.346 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.347 2 INFO nova.compute.claims [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.539 2 DEBUG nova.compute.provider_tree [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.569 2 DEBUG nova.scheduler.client.report [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.604 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.606 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.657 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.658 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.683 2 INFO nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.711 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.826 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.828 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.828 2 INFO nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Creating image(s)#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.829 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.830 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.830 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.844 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.913 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.914 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.915 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.928 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.991 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:48 np0005476733 nova_compute[192580]: 2025-10-08 15:36:48.992 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.033 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk 10737418240" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.034 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.035 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.098 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.100 2 DEBUG nova.objects.instance [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid af5ca3d2-d7df-40e0-88f8-b90191a73698 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.128 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.129 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Ensure instance console log exists: /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.129 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.130 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.130 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:49 np0005476733 podman[235765]: 2025-10-08 15:36:49.236871036 +0000 UTC m=+0.062829567 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:36:49 np0005476733 podman[235764]: 2025-10-08 15:36:49.270749708 +0000 UTC m=+0.099199839 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:36:49 np0005476733 nova_compute[192580]: 2025-10-08 15:36:49.667 2 DEBUG nova.policy [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.626 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.724 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.809 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.810 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.906 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.913 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.982 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:50 np0005476733 nova_compute[192580]: 2025-10-08 15:36:50.983 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.067 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.256 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.258 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12116MB free_disk=111.07086944580078GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.258 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.259 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.333 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8a310a2e-17af-42b8-a212-cf0a278e20c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d279aff7-6615-416d-81f2-f431378ecc56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af5ca3d2-d7df-40e0-88f8-b90191a73698 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=3584MB phys_disk=119GB used_disk=30GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.409 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.424 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.459 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:36:51 np0005476733 nova_compute[192580]: 2025-10-08 15:36:51.459 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:52 np0005476733 nova_compute[192580]: 2025-10-08 15:36:52.459 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:52 np0005476733 nova_compute[192580]: 2025-10-08 15:36:52.460 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:52 np0005476733 nova_compute[192580]: 2025-10-08 15:36:52.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.644 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.645 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.646 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.646 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.647 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.649 2 INFO nova.compute.manager [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Terminating instance#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.650 2 DEBUG nova.compute.manager [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:36:53 np0005476733 kernel: tap03a00809-da (unregistering): left promiscuous mode
Oct  8 11:36:53 np0005476733 NetworkManager[51699]: <info>  [1759937813.6812] device (tap03a00809-da): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:53Z|00480|binding|INFO|Releasing lport 03a00809-da14-4bcc-8eee-92146b0d6536 from this chassis (sb_readonly=0)
Oct  8 11:36:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:53Z|00481|binding|INFO|Setting lport 03a00809-da14-4bcc-8eee-92146b0d6536 down in Southbound
Oct  8 11:36:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:53Z|00482|binding|INFO|Removing iface tap03a00809-da ovn-installed in OVS
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.706 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:95:d2 192.168.100.161'], port_security=['fa:16:3e:59:95:d2 192.168.100.161'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.100.161/24', 'neutron:device_id': 'd279aff7-6615-416d-81f2-f431378ecc56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c96d22c99734f059343a5340cc6f287', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d0efdcb-fc9f-4ff6-ac01-106f25450adb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d1675938-de61-482c-b526-990e293aed89, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=03a00809-da14-4bcc-8eee-92146b0d6536) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.707 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 03a00809-da14-4bcc-8eee-92146b0d6536 in datapath 3556a570-1234-4dd3-a7d3-e2cf3097a776 unbound from our chassis#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.710 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3556a570-1234-4dd3-a7d3-e2cf3097a776#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.732 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4b095b4d-55bb-4b58-858a-fc5713f2f83f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.765 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6f2389-8ca9-49ad-8af2-f7fd62acbeea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Oct  8 11:36:53 np0005476733 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000003c.scope: Consumed 43.686s CPU time.
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.769 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6416f7a9-74ae-47ec-af0f-d97cb27b992b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 systemd-machined[152624]: Machine qemu-34-instance-0000003c terminated.
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.805 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3610a423-674a-497b-9531-6ce84db5c261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.829 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Successfully created port: 38c374da-b5bd-4c7d-9352-3fe6186df2b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.828 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8ff81b-0f0f-4bb5-88ee-670e7478065c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3556a570-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:26:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 33, 'tx_packets': 7, 'rx_bytes': 1978, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 33, 'tx_packets': 7, 'rx_bytes': 1978, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457571, 'reachable_time': 27678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235840, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.847 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2988d1d8-af3e-488a-a5e5-ecd78c46c94c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3556a570-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457581, 'tstamp': 457581}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235841, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.100.2'], ['IFA_LOCAL', '192.168.100.2'], ['IFA_BROADCAST', '192.168.100.255'], ['IFA_LABEL', 'tap3556a570-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457584, 'tstamp': 457584}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235841, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.849 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3556a570-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.906 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3556a570-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.907 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.907 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3556a570-10, col_values=(('external_ids', {'iface-id': 'edc27bfb-7622-457f-b7d3-480bac0b8693'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:53.907 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.982 2 INFO nova.virt.libvirt.driver [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Instance destroyed successfully.#033[00m
Oct  8 11:36:53 np0005476733 nova_compute[192580]: 2025-10-08 15:36:53.982 2 DEBUG nova.objects.instance [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'resources' on Instance uuid d279aff7-6615-416d-81f2-f431378ecc56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.011 2 DEBUG nova.virt.libvirt.vif [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:35:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='vm_proxy',display_name='vm_proxy',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm-proxy',id=60,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:36:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-we1by55q',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:36:06Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=d279aff7-6615-416d-81f2-f431378ecc56,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.012 2 DEBUG nova.network.os_vif_util [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "03a00809-da14-4bcc-8eee-92146b0d6536", "address": "fa:16:3e:59:95:d2", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.161", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03a00809-da", "ovs_interfaceid": "03a00809-da14-4bcc-8eee-92146b0d6536", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.012 2 DEBUG nova.network.os_vif_util [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.013 2 DEBUG os_vif [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.015 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03a00809-da, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.027 2 INFO os_vif [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:95:d2,bridge_name='br-int',has_traffic_filtering=True,id=03a00809-da14-4bcc-8eee-92146b0d6536,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap03a00809-da')#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.028 2 INFO nova.virt.libvirt.driver [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Deleting instance files /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56_del#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.029 2 INFO nova.virt.libvirt.driver [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Deletion of /var/lib/nova/instances/d279aff7-6615-416d-81f2-f431378ecc56_del complete#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.092 2 DEBUG nova.compute.manager [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-vif-unplugged-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.093 2 DEBUG oslo_concurrency.lockutils [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.093 2 DEBUG oslo_concurrency.lockutils [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.093 2 DEBUG oslo_concurrency.lockutils [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.094 2 DEBUG nova.compute.manager [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] No waiting events found dispatching network-vif-unplugged-03a00809-da14-4bcc-8eee-92146b0d6536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.094 2 DEBUG nova.compute.manager [req-af1ac11a-a8de-4a8f-9d6e-9b344adca500 req-21736686-7026-4ef3-9bdf-2dbfebf61b11 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-vif-unplugged-03a00809-da14-4bcc-8eee-92146b0d6536 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.112 2 INFO nova.compute.manager [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.113 2 DEBUG oslo.service.loopingcall [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.113 2 DEBUG nova.compute.manager [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:36:54 np0005476733 nova_compute[192580]: 2025-10-08 15:36:54.114 2 DEBUG nova.network.neutron [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:36:55 np0005476733 nova_compute[192580]: 2025-10-08 15:36:55.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:55 np0005476733 nova_compute[192580]: 2025-10-08 15:36:55.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.199 2 DEBUG nova.compute.manager [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.200 2 DEBUG oslo_concurrency.lockutils [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d279aff7-6615-416d-81f2-f431378ecc56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.200 2 DEBUG oslo_concurrency.lockutils [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.200 2 DEBUG oslo_concurrency.lockutils [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.200 2 DEBUG nova.compute.manager [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] No waiting events found dispatching network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.201 2 WARNING nova.compute.manager [req-7b56fcc7-9620-4716-92e2-d15737fa85c8 req-eaf76d16-89f5-42cd-84e8-14559f29d771 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Received unexpected event network-vif-plugged-03a00809-da14-4bcc-8eee-92146b0d6536 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.783 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Successfully updated port: 38c374da-b5bd-4c7d-9352-3fe6186df2b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.802 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.802 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:36:56 np0005476733 nova_compute[192580]: 2025-10-08 15:36:56.802 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.058 2 DEBUG nova.compute.manager [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-changed-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.058 2 DEBUG nova.compute.manager [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Refreshing instance network info cache due to event network-changed-38c374da-b5bd-4c7d-9352-3fe6186df2b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.058 2 DEBUG oslo_concurrency.lockutils [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.137 2 DEBUG nova.network.neutron [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.157 2 INFO nova.compute.manager [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Took 3.04 seconds to deallocate network for instance.#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.178 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.202 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.203 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:57 np0005476733 podman[235861]: 2025-10-08 15:36:57.253396571 +0000 UTC m=+0.070769214 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:36:57 np0005476733 podman[235862]: 2025-10-08 15:36:57.266028727 +0000 UTC m=+0.079515724 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 11:36:57 np0005476733 podman[235860]: 2025-10-08 15:36:57.284412781 +0000 UTC m=+0.105364909 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.339 2 DEBUG nova.compute.provider_tree [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.357 2 DEBUG nova.scheduler.client.report [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.404 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.457 2 INFO nova.scheduler.client.report [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Deleted allocations for instance d279aff7-6615-416d-81f2-f431378ecc56#033[00m
Oct  8 11:36:57 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:57Z|00483|binding|INFO|Releasing lport edc27bfb-7622-457f-b7d3-480bac0b8693 from this chassis (sb_readonly=0)
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:36:57 np0005476733 nova_compute[192580]: 2025-10-08 15:36:57.606 2 DEBUG oslo_concurrency.lockutils [None req-d1bb4f19-7dd7-4cc1-8a5b-2edc057dc563 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "d279aff7-6615-416d-81f2-f431378ecc56" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.961s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:58 np0005476733 nova_compute[192580]: 2025-10-08 15:36:58.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.000 2 DEBUG nova.network.neutron [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.025 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.025 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Instance network_info: |[{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.026 2 DEBUG oslo_concurrency.lockutils [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.026 2 DEBUG nova.network.neutron [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Refreshing network info cache for port 38c374da-b5bd-4c7d-9352-3fe6186df2b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.029 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Start _get_guest_xml network_info=[{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.037 2 WARNING nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.044 2 DEBUG nova.virt.libvirt.host [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.045 2 DEBUG nova.virt.libvirt.host [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.054 2 DEBUG nova.virt.libvirt.host [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.055 2 DEBUG nova.virt.libvirt.host [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.056 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.057 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.057 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.057 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.057 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.057 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.058 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.058 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.058 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.058 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.058 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.059 2 DEBUG nova.virt.hardware [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.062 2 DEBUG nova.virt.libvirt.vif [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:36:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_east_west-1191916229',display_name='tempest-test_dscp_marking_east_west-1191916229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-east-west-1191916229',id=61,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqeru994',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:36:48Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=af5ca3d2-d7df-40e0-88f8-b90191a73698,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.063 2 DEBUG nova.network.os_vif_util [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.064 2 DEBUG nova.network.os_vif_util [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.065 2 DEBUG nova.objects.instance [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid af5ca3d2-d7df-40e0-88f8-b90191a73698 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.084 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <uuid>af5ca3d2-d7df-40e0-88f8-b90191a73698</uuid>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <name>instance-0000003d</name>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dscp_marking_east_west-1191916229</nova:name>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:36:59</nova:creationTime>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        <nova:port uuid="38c374da-b5bd-4c7d-9352-3fe6186df2b8">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.5.132" ipVersion="4"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="serial">af5ca3d2-d7df-40e0-88f8-b90191a73698</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="uuid">af5ca3d2-d7df-40e0-88f8-b90191a73698</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.config"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:c4:ae:37"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <target dev="tap38c374da-b5"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/console.log" append="off"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:36:59 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:36:59 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:36:59 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:36:59 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.085 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Preparing to wait for external event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.085 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.085 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.085 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.086 2 DEBUG nova.virt.libvirt.vif [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:36:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_east_west-1191916229',display_name='tempest-test_dscp_marking_east_west-1191916229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-east-west-1191916229',id=61,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqeru994',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:36:48Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=af5ca3d2-d7df-40e0-88f8-b90191a73698,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.086 2 DEBUG nova.network.os_vif_util [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.087 2 DEBUG nova.network.os_vif_util [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.087 2 DEBUG os_vif [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.088 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.089 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38c374da-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap38c374da-b5, col_values=(('external_ids', {'iface-id': '38c374da-b5bd-4c7d-9352-3fe6186df2b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:ae:37', 'vm-uuid': 'af5ca3d2-d7df-40e0-88f8-b90191a73698'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.0950] manager: (tap38c374da-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.101 2 INFO os_vif [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5')#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.160 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.162 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.162 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:c4:ae:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.162 2 INFO nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Using config drive#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.614 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.614 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.614 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.615 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.615 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.616 2 INFO nova.compute.manager [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Terminating instance#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.617 2 DEBUG nova.compute.manager [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.631 2 INFO nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Creating config drive at /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.config#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.637 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdykf5gif execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:36:59 np0005476733 kernel: tap832212ef-77 (unregistering): left promiscuous mode
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.6439] device (tap832212ef-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00484|binding|INFO|Releasing lport 832212ef-772b-4b36-b486-7b4131fc3ab5 from this chassis (sb_readonly=0)
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00485|binding|INFO|Setting lport 832212ef-772b-4b36-b486-7b4131fc3ab5 down in Southbound
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00486|binding|INFO|Releasing lport 0fd11583-449f-4f64-8d12-db8aa187dfb4 from this chassis (sb_readonly=0)
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00487|binding|INFO|Setting lport 0fd11583-449f-4f64-8d12-db8aa187dfb4 down in Southbound
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00488|binding|INFO|Removing iface tap832212ef-77 ovn-installed in OVS
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.679 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:8e:11 192.168.100.73'], port_security=['fa:16:3e:26:8e:11 192.168.100.73 192.168.100.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.100.73/24', 'neutron:device_id': '8a310a2e-17af-42b8-a212-cf0a278e20c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c96d22c99734f059343a5340cc6f287', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d0efdcb-fc9f-4ff6-ac01-106f25450adb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.243', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d1675938-de61-482c-b526-990e293aed89, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=832212ef-772b-4b36-b486-7b4131fc3ab5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.680 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 832212ef-772b-4b36-b486-7b4131fc3ab5 in datapath 3556a570-1234-4dd3-a7d3-e2cf3097a776 unbound from our chassis#033[00m
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.685 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3556a570-1234-4dd3-a7d3-e2cf3097a776, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.688 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[26fcd2c6-21f0-4c97-8d6c-2a50f5dc99ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.689 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776 namespace which is not needed anymore#033[00m
Oct  8 11:36:59 np0005476733 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000039.scope: Deactivated successfully.
Oct  8 11:36:59 np0005476733 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000039.scope: Consumed 47.475s CPU time.
Oct  8 11:36:59 np0005476733 systemd-machined[152624]: Machine qemu-33-instance-00000039 terminated.
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.772 2 DEBUG oslo_concurrency.processutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdykf5gif" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:36:59 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [NOTICE]   (234574) : haproxy version is 2.8.14-c23fe91
Oct  8 11:36:59 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [NOTICE]   (234574) : path to executable is /usr/sbin/haproxy
Oct  8 11:36:59 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [WARNING]  (234574) : Exiting Master process...
Oct  8 11:36:59 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [ALERT]    (234574) : Current worker (234576) exited with code 143 (Terminated)
Oct  8 11:36:59 np0005476733 neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776[234550]: [WARNING]  (234574) : All workers exited. Exiting... (0)
Oct  8 11:36:59 np0005476733 systemd[1]: libpod-b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec.scope: Deactivated successfully.
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.8464] manager: (tap832212ef-77): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Oct  8 11:36:59 np0005476733 podman[235955]: 2025-10-08 15:36:59.846049154 +0000 UTC m=+0.053253288 container died b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:36:59 np0005476733 systemd-udevd[235932]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:36:59 np0005476733 kernel: tap38c374da-b5: entered promiscuous mode
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.8879] manager: (tap38c374da-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00489|binding|INFO|Claiming lport 38c374da-b5bd-4c7d-9352-3fe6186df2b8 for this chassis.
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00490|binding|INFO|38c374da-b5bd-4c7d-9352-3fe6186df2b8: Claiming fa:16:3e:c4:ae:37 192.168.5.132
Oct  8 11:36:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec-userdata-shm.mount: Deactivated successfully.
Oct  8 11:36:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:36:59.899 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:ae:37 192.168.5.132'], port_security=['fa:16:3e:c4:ae:37 192.168.5.132'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.5.132/24', 'neutron:device_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e1c7f6d-e9b9-40a2-8243-4254ff88872e, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38c374da-b5bd-4c7d-9352-3fe6186df2b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:36:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay-fcd379ff73a8d13ae336ef0397bd9e98a5a70a58b3db034011a1f1aa8c8d8b12-merged.mount: Deactivated successfully.
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.9053] device (tap38c374da-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:36:59 np0005476733 NetworkManager[51699]: <info>  [1759937819.9061] device (tap38c374da-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00491|binding|INFO|Setting lport 38c374da-b5bd-4c7d-9352-3fe6186df2b8 ovn-installed in OVS
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:36:59Z|00492|binding|INFO|Setting lport 38c374da-b5bd-4c7d-9352-3fe6186df2b8 up in Southbound
Oct  8 11:36:59 np0005476733 podman[235955]: 2025-10-08 15:36:59.919917427 +0000 UTC m=+0.127121561 container cleanup b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:36:59 np0005476733 systemd[1]: libpod-conmon-b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec.scope: Deactivated successfully.
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.930 2 INFO nova.virt.libvirt.driver [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Instance destroyed successfully.#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.931 2 DEBUG nova.objects.instance [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lazy-loading 'resources' on Instance uuid 8a310a2e-17af-42b8-a212-cf0a278e20c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.948 2 DEBUG nova.virt.libvirt.vif [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:34:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='vm2',display_name='vm2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='vm2',id=57,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOvSvkf6Ez84pLvA70nMe7oECsKsEg614H2CjeZigbOROrUCgiu8YQ0cYGErpHWEAbVsaccsZMl1XjLVhCbSAWLcNqRXB+mFUuPERzl3xca7lAlc6pqTmGJGSY+TB7aO5w==',key_name='tempest-keypair-test-1659993707',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:34:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c96d22c99734f059343a5340cc6f287',ramdisk_id='',reservation_id='r-sbfgkeja',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VrrpTest-336353520',owner_user_name='tempest-VrrpTest-336353520-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:34:39Z,user_data=None,user_id='7dd1826c89b24382854eb7979b65ba87',uuid=8a310a2e-17af-42b8-a212-cf0a278e20c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.949 2 DEBUG nova.network.os_vif_util [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converting VIF {"id": "832212ef-772b-4b36-b486-7b4131fc3ab5", "address": "fa:16:3e:26:8e:11", "network": {"id": "3556a570-1234-4dd3-a7d3-e2cf3097a776", "bridge": "br-int", "label": "tempest-test-network--565113220", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c96d22c99734f059343a5340cc6f287", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap832212ef-77", "ovs_interfaceid": "832212ef-772b-4b36-b486-7b4131fc3ab5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.950 2 DEBUG nova.network.os_vif_util [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.950 2 DEBUG os_vif [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.954 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap832212ef-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.961 2 INFO os_vif [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:8e:11,bridge_name='br-int',has_traffic_filtering=True,id=832212ef-772b-4b36-b486-7b4131fc3ab5,network=Network(3556a570-1234-4dd3-a7d3-e2cf3097a776),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap832212ef-77')#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.961 2 INFO nova.virt.libvirt.driver [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Deleting instance files /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7_del#033[00m
Oct  8 11:36:59 np0005476733 nova_compute[192580]: 2025-10-08 15:36:59.962 2 INFO nova.virt.libvirt.driver [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Deletion of /var/lib/nova/instances/8a310a2e-17af-42b8-a212-cf0a278e20c7_del complete#033[00m
Oct  8 11:36:59 np0005476733 systemd-machined[152624]: New machine qemu-35-instance-0000003d.
Oct  8 11:36:59 np0005476733 systemd[1]: Started Virtual Machine qemu-35-instance-0000003d.
Oct  8 11:37:00 np0005476733 podman[236008]: 2025-10-08 15:37:00.023584449 +0000 UTC m=+0.073856132 container remove b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.030 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[48688ac1-1eff-4dae-a48b-ba6bc903099e]: (4, ('Wed Oct  8 03:36:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776 (b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec)\nb4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec\nWed Oct  8 03:36:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776 (b4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec)\nb4a7b083bb8836c6b1218364e2aeedd0beed005e7b0aa5eef03694a563396eec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.032 2 INFO nova.compute.manager [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.033 2 DEBUG oslo.service.loopingcall [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.033 2 DEBUG nova.compute.manager [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.033 2 DEBUG nova.network.neutron [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.036 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[05a66c14-cdcc-48ff-9b9b-b2059902636a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.037 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3556a570-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:37:00 np0005476733 kernel: tap3556a570-10: left promiscuous mode
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.056 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d31fa9ba-5e16-474b-b6f9-ee5a3931b9f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.081 2 DEBUG nova.compute.manager [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-vif-unplugged-832212ef-772b-4b36-b486-7b4131fc3ab5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.083 2 DEBUG oslo_concurrency.lockutils [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.083 2 DEBUG oslo_concurrency.lockutils [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.083 2 DEBUG oslo_concurrency.lockutils [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.083 2 DEBUG nova.compute.manager [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] No waiting events found dispatching network-vif-unplugged-832212ef-772b-4b36-b486-7b4131fc3ab5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.083 2 DEBUG nova.compute.manager [req-5ef045ed-e947-4ccf-b2f9-9981f1ad576a req-8219907c-307c-4a2a-a5cb-e4c963bfca51 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-vif-unplugged-832212ef-772b-4b36-b486-7b4131fc3ab5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.085 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c3cd3929-83f4-4390-865e-46291ef376bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.087 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[773644a5-78ec-4a2b-a9af-7d6841856808]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.107 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5d4f0a-c422-4258-bd5f-1cb4a3727537]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457562, 'reachable_time': 32399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236035, 'error': None, 'target': 'ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 systemd[1]: run-netns-ovnmeta\x2d3556a570\x2d1234\x2d4dd3\x2da7d3\x2de2cf3097a776.mount: Deactivated successfully.
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.114 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3556a570-1234-4dd3-a7d3-e2cf3097a776 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.115 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e2f42f-351c-4543-a598-1b0807cae83f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.116 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38c374da-b5bd-4c7d-9352-3fe6186df2b8 in datapath 42f8c4b4-9578-47d9-8732-7b9267b6fb6d unbound from our chassis#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.118 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42f8c4b4-9578-47d9-8732-7b9267b6fb6d#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.132 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[909517b7-6b8e-4169-8442-0def57883349]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.133 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42f8c4b4-91 in ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.135 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42f8c4b4-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.135 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f637382e-f9ff-4b72-a6d0-bd35a7cd3e29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.136 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[18dd7789-d684-440e-9322-09b1a3921fe4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.158 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[36b84b9d-b6c3-408a-9d28-de85bef3dab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.182 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[01000ee6-f7b7-4fbb-bf42-eb1084c364e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.218 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[78427d5c-dd4b-44cd-9568-a11ad1cf9b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 NetworkManager[51699]: <info>  [1759937820.2266] manager: (tap42f8c4b4-90): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.226 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5b9dd2-a764-4b69-8064-f0ad88139896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.263 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ee21ca3b-98d8-4ec2-bdec-e3899e1c6a00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.267 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7c07df73-fccb-476a-84fb-adb3b80bc611]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 NetworkManager[51699]: <info>  [1759937820.2951] device (tap42f8c4b4-90): carrier: link connected
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.303 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3c843a63-6227-4e75-837d-ad13233867cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.327 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac80fbb-78a5-47fc-881e-f8840c2bc614]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42f8c4b4-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:ac:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472395, 'reachable_time': 15756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236069, 'error': None, 'target': 'ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.347 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a097a71c-ecef-4fc5-ab45-fae799dab2a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7a:ac54'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472395, 'tstamp': 472395}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236070, 'error': None, 'target': 'ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.373 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b28ac173-1858-492d-9ab1-13a833f316b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42f8c4b4-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:ac:54'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472395, 'reachable_time': 15756, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236071, 'error': None, 'target': 'ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.421 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d2597b35-e6b6-4079-93a7-e8558476e3cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.484 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6d425ba1-a22c-4c7c-a238-e6e28831f45d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.486 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42f8c4b4-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.486 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.487 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42f8c4b4-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 NetworkManager[51699]: <info>  [1759937820.4902] manager: (tap42f8c4b4-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Oct  8 11:37:00 np0005476733 kernel: tap42f8c4b4-90: entered promiscuous mode
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.494 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42f8c4b4-90, col_values=(('external_ids', {'iface-id': 'bf77093f-6643-4a58-86b0-b93d373577d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.499 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42f8c4b4-9578-47d9-8732-7b9267b6fb6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42f8c4b4-9578-47d9-8732-7b9267b6fb6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.500 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[520bf757-0ae6-4f22-b03f-c20b92cba338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.501 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-42f8c4b4-9578-47d9-8732-7b9267b6fb6d
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/42f8c4b4-9578-47d9-8732-7b9267b6fb6d.pid.haproxy
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 42f8c4b4-9578-47d9-8732-7b9267b6fb6d
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:37:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:00.502 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'env', 'PROCESS_TAG=haproxy-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42f8c4b4-9578-47d9-8732-7b9267b6fb6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:37:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:00Z|00493|binding|INFO|Releasing lport bf77093f-6643-4a58-86b0-b93d373577d6 from this chassis (sb_readonly=0)
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.764 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937820.7639654, af5ca3d2-d7df-40e0-88f8-b90191a73698 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.765 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] VM Started (Lifecycle Event)#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.785 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.790 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937820.764139, af5ca3d2-d7df-40e0-88f8-b90191a73698 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.791 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.813 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.818 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.842 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:37:00 np0005476733 podman[236103]: 2025-10-08 15:37:00.918619495 +0000 UTC m=+0.057521026 container create 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.949 2 DEBUG nova.network.neutron [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updated VIF entry in instance network info cache for port 38c374da-b5bd-4c7d-9352-3fe6186df2b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.950 2 DEBUG nova.network.neutron [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:37:00 np0005476733 systemd[1]: Started libpod-conmon-6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063.scope.
Oct  8 11:37:00 np0005476733 nova_compute[192580]: 2025-10-08 15:37:00.972 2 DEBUG oslo_concurrency.lockutils [req-be85e27e-4ec8-4fb1-85d0-10155c9c1b40 req-e1a0694c-52f2-4ef9-bfdd-6fe5e1cf2340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:37:00 np0005476733 podman[236103]: 2025-10-08 15:37:00.88962565 +0000 UTC m=+0.028527211 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:37:00 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:37:00 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20f36882416f6454216ffa7d98abc1b1d6cfc29e363ffda0eea4dd1e0ad9932c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:37:01 np0005476733 podman[236103]: 2025-10-08 15:37:01.009412893 +0000 UTC m=+0.148314504 container init 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:37:01 np0005476733 podman[236103]: 2025-10-08 15:37:01.015737517 +0000 UTC m=+0.154639078 container start 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:37:01 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [NOTICE]   (236123) : New worker (236125) forked
Oct  8 11:37:01 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [NOTICE]   (236123) : Loading success.
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.169 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.170 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.171 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.171 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.171 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] No waiting events found dispatching network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.171 2 WARNING nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Received unexpected event network-vif-plugged-832212ef-772b-4b36-b486-7b4131fc3ab5 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.172 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.172 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.172 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.172 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.173 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Processing event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.173 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.173 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.173 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.173 2 DEBUG oslo_concurrency.lockutils [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.174 2 DEBUG nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] No waiting events found dispatching network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.174 2 WARNING nova.compute.manager [req-bff09686-ef21-4692-bd95-cb797b7fc46f req-c92b38b5-330a-45fb-a61a-ee233d806aec 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received unexpected event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.175 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.176 2 DEBUG nova.network.neutron [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.179 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937822.1794062, af5ca3d2-d7df-40e0-88f8-b90191a73698 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.180 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.182 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.185 2 INFO nova.virt.libvirt.driver [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Instance spawned successfully.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.186 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.210 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.211 2 INFO nova.compute.manager [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Took 2.18 seconds to deallocate network for instance.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.220 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.226 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.226 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.226 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.227 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.227 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.228 2 DEBUG nova.virt.libvirt.driver [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.278 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.294 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.294 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.318 2 INFO nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Took 13.49 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.319 2 DEBUG nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.368 2 DEBUG nova.compute.provider_tree [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.382 2 DEBUG nova.scheduler.client.report [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.387 2 INFO nova.compute.manager [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Took 14.09 seconds to build instance.#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.422 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.425 2 DEBUG oslo_concurrency.lockutils [None req-de1a466f-c3b4-431c-9544-c11ff639171c d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.447 2 INFO nova.scheduler.client.report [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Deleted allocations for instance 8a310a2e-17af-42b8-a212-cf0a278e20c7#033[00m
Oct  8 11:37:02 np0005476733 nova_compute[192580]: 2025-10-08 15:37:02.524 2 DEBUG oslo_concurrency.lockutils [None req-d4e084b5-959e-47d5-9d07-7c131d15038f 7dd1826c89b24382854eb7979b65ba87 8c96d22c99734f059343a5340cc6f287 - - default default] Lock "8a310a2e-17af-42b8-a212-cf0a278e20c7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:03 np0005476733 nova_compute[192580]: 2025-10-08 15:37:03.843 2 INFO nova.compute.manager [None req-b4b24c21-2fe1-4b16-b9a4-4b7d7064eef3 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:03 np0005476733 nova_compute[192580]: 2025-10-08 15:37:03.848 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:04 np0005476733 nova_compute[192580]: 2025-10-08 15:37:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:04 np0005476733 nova_compute[192580]: 2025-10-08 15:37:04.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:05 np0005476733 nova_compute[192580]: 2025-10-08 15:37:05.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:07 np0005476733 podman[236134]: 2025-10-08 15:37:07.260686929 +0000 UTC m=+0.078898655 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  8 11:37:07 np0005476733 podman[236135]: 2025-10-08 15:37:07.293762427 +0000 UTC m=+0.112039335 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:37:08 np0005476733 nova_compute[192580]: 2025-10-08 15:37:08.978 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937813.9776618, d279aff7-6615-416d-81f2-f431378ecc56 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:37:08 np0005476733 nova_compute[192580]: 2025-10-08 15:37:08.979 2 INFO nova.compute.manager [-] [instance: d279aff7-6615-416d-81f2-f431378ecc56] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:37:09 np0005476733 nova_compute[192580]: 2025-10-08 15:37:09.005 2 DEBUG nova.compute.manager [None req-6fc25978-04c2-4b94-8928-56aff9177dbd - - - - - -] [instance: d279aff7-6615-416d-81f2-f431378ecc56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:09 np0005476733 nova_compute[192580]: 2025-10-08 15:37:09.044 2 INFO nova.compute.manager [None req-e6c32c74-27bc-4a49-a960-f754f0465847 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:09 np0005476733 nova_compute[192580]: 2025-10-08 15:37:09.050 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:09 np0005476733 nova_compute[192580]: 2025-10-08 15:37:09.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:10 np0005476733 nova_compute[192580]: 2025-10-08 15:37:10.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.211 2 INFO nova.compute.manager [None req-04d6029e-a145-4673-88a6-d2c0ca3a6919 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.217 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.919 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937819.9170637, 8a310a2e-17af-42b8-a212-cf0a278e20c7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.920 2 INFO nova.compute.manager [-] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.945 2 DEBUG nova.compute.manager [None req-c5e8098d-2c87-4d91-a324-5422a14abd48 - - - - - -] [instance: 8a310a2e-17af-42b8-a212-cf0a278e20c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:37:14 np0005476733 nova_compute[192580]: 2025-10-08 15:37:14.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:15 np0005476733 nova_compute[192580]: 2025-10-08 15:37:15.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:16 np0005476733 podman[236181]: 2025-10-08 15:37:16.247031304 +0000 UTC m=+0.061484175 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:37:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:17Z|00494|binding|INFO|Releasing lport bf77093f-6643-4a58-86b0-b93d373577d6 from this chassis (sb_readonly=0)
Oct  8 11:37:17 np0005476733 nova_compute[192580]: 2025-10-08 15:37:17.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:19 np0005476733 nova_compute[192580]: 2025-10-08 15:37:19.363 2 INFO nova.compute.manager [None req-7d69a8c5-221b-4541-88cc-044c38abd1f6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:19 np0005476733 nova_compute[192580]: 2025-10-08 15:37:19.370 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:19 np0005476733 nova_compute[192580]: 2025-10-08 15:37:19.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:20 np0005476733 nova_compute[192580]: 2025-10-08 15:37:20.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:20 np0005476733 podman[236205]: 2025-10-08 15:37:20.255651253 +0000 UTC m=+0.073350606 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 11:37:20 np0005476733 podman[236204]: 2025-10-08 15:37:20.292737419 +0000 UTC m=+0.116694354 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:37:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:20Z|00495|pinctrl|WARN|Dropped 5695 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:37:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:20Z|00496|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:37:24 np0005476733 nova_compute[192580]: 2025-10-08 15:37:24.687 2 INFO nova.compute.manager [None req-8e583518-1b11-415d-9186-ef96374846e9 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:24 np0005476733 nova_compute[192580]: 2025-10-08 15:37:24.693 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:24 np0005476733 nova_compute[192580]: 2025-10-08 15:37:24.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:25 np0005476733 nova_compute[192580]: 2025-10-08 15:37:25.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:26.324 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:26.325 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:26.326 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:27Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c4:ae:37 192.168.5.132
Oct  8 11:37:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:37:27Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c4:ae:37 192.168.5.132
Oct  8 11:37:28 np0005476733 podman[236250]: 2025-10-08 15:37:28.234129743 +0000 UTC m=+0.058000132 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:37:28 np0005476733 podman[236249]: 2025-10-08 15:37:28.239545938 +0000 UTC m=+0.068430609 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:37:28 np0005476733 podman[236251]: 2025-10-08 15:37:28.25106777 +0000 UTC m=+0.072899403 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Oct  8 11:37:29 np0005476733 nova_compute[192580]: 2025-10-08 15:37:29.869 2 INFO nova.compute.manager [None req-45df962a-1a9d-48c1-ad68-ccf1407ea169 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:29 np0005476733 nova_compute[192580]: 2025-10-08 15:37:29.876 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:29 np0005476733 nova_compute[192580]: 2025-10-08 15:37:29.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:30 np0005476733 nova_compute[192580]: 2025-10-08 15:37:30.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:34 np0005476733 nova_compute[192580]: 2025-10-08 15:37:34.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:35 np0005476733 nova_compute[192580]: 2025-10-08 15:37:35.052 2 INFO nova.compute.manager [None req-299b096f-f340-40df-90f7-a19b86ec9d56 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:35 np0005476733 nova_compute[192580]: 2025-10-08 15:37:35.058 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:35 np0005476733 nova_compute[192580]: 2025-10-08 15:37:35.061 2 INFO nova.virt.libvirt.driver [None req-299b096f-f340-40df-90f7-a19b86ec9d56 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Truncated console log returned, 3118 bytes ignored#033[00m
Oct  8 11:37:35 np0005476733 nova_compute[192580]: 2025-10-08 15:37:35.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:38 np0005476733 podman[236318]: 2025-10-08 15:37:38.265268151 +0000 UTC m=+0.079298159 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:37:38 np0005476733 podman[236317]: 2025-10-08 15:37:38.266067746 +0000 UTC m=+0.092468723 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 11:37:39 np0005476733 nova_compute[192580]: 2025-10-08 15:37:39.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:40 np0005476733 nova_compute[192580]: 2025-10-08 15:37:40.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:40 np0005476733 nova_compute[192580]: 2025-10-08 15:37:40.208 2 INFO nova.compute.manager [None req-79e799ee-4208-4cf7-b53a-af4a04cdc3c3 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Get console output#033[00m
Oct  8 11:37:40 np0005476733 nova_compute[192580]: 2025-10-08 15:37:40.214 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:37:40 np0005476733 nova_compute[192580]: 2025-10-08 15:37:40.221 2 INFO nova.virt.libvirt.driver [None req-79e799ee-4208-4cf7-b53a-af4a04cdc3c3 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Truncated console log returned, 3327 bytes ignored#033[00m
Oct  8 11:37:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:44.675 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:37:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:44.676 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:37:44 np0005476733 nova_compute[192580]: 2025-10-08 15:37:44.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:44 np0005476733 nova_compute[192580]: 2025-10-08 15:37:44.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:45 np0005476733 nova_compute[192580]: 2025-10-08 15:37:45.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:46 np0005476733 nova_compute[192580]: 2025-10-08 15:37:46.050 2 DEBUG nova.compute.manager [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-changed-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:37:46 np0005476733 nova_compute[192580]: 2025-10-08 15:37:46.052 2 DEBUG nova.compute.manager [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Refreshing instance network info cache due to event network-changed-38c374da-b5bd-4c7d-9352-3fe6186df2b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:37:46 np0005476733 nova_compute[192580]: 2025-10-08 15:37:46.052 2 DEBUG oslo_concurrency.lockutils [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:37:46 np0005476733 nova_compute[192580]: 2025-10-08 15:37:46.052 2 DEBUG oslo_concurrency.lockutils [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:37:46 np0005476733 nova_compute[192580]: 2025-10-08 15:37:46.052 2 DEBUG nova.network.neutron [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Refreshing network info cache for port 38c374da-b5bd-4c7d-9352-3fe6186df2b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:37:47 np0005476733 podman[236363]: 2025-10-08 15:37:47.215894424 +0000 UTC m=+0.049288832 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 11:37:47 np0005476733 nova_compute[192580]: 2025-10-08 15:37:47.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:47 np0005476733 nova_compute[192580]: 2025-10-08 15:37:47.969 2 DEBUG nova.network.neutron [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updated VIF entry in instance network info cache for port 38c374da-b5bd-4c7d-9352-3fe6186df2b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:37:47 np0005476733 nova_compute[192580]: 2025-10-08 15:37:47.970 2 DEBUG nova.network.neutron [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:37:48 np0005476733 nova_compute[192580]: 2025-10-08 15:37:48.213 2 DEBUG oslo_concurrency.lockutils [req-df7c7d26-06f5-48de-8fdd-24acdf42dc1f req-d85428bc-0c04-458b-b45c-5aa98213f903 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:37:49 np0005476733 nova_compute[192580]: 2025-10-08 15:37:49.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:50 np0005476733 nova_compute[192580]: 2025-10-08 15:37:50.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:51 np0005476733 podman[236384]: 2025-10-08 15:37:51.240430086 +0000 UTC m=+0.060620976 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:37:51 np0005476733 podman[236383]: 2025-10-08 15:37:51.266059033 +0000 UTC m=+0.090438328 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller)
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.611 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:37:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:37:51.680 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.684 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.779 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.780 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.841 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.994 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.996 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12919MB free_disk=111.21284103393555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.996 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:37:51 np0005476733 nova_compute[192580]: 2025-10-08 15:37:51.996 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.058 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af5ca3d2-d7df-40e0-88f8-b90191a73698 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.059 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.059 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.101 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.114 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.130 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:37:52 np0005476733 nova_compute[192580]: 2025-10-08 15:37:52.130 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:37:53 np0005476733 nova_compute[192580]: 2025-10-08 15:37:53.132 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:54 np0005476733 nova_compute[192580]: 2025-10-08 15:37:54.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:55 np0005476733 nova_compute[192580]: 2025-10-08 15:37:55.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:37:55 np0005476733 nova_compute[192580]: 2025-10-08 15:37:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:55 np0005476733 nova_compute[192580]: 2025-10-08 15:37:55.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:37:57 np0005476733 nova_compute[192580]: 2025-10-08 15:37:57.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:58 np0005476733 nova_compute[192580]: 2025-10-08 15:37:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:37:59 np0005476733 podman[236458]: 2025-10-08 15:37:59.240242533 +0000 UTC m=+0.063130587 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:37:59 np0005476733 podman[236459]: 2025-10-08 15:37:59.246517545 +0000 UTC m=+0.065857464 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Oct  8 11:37:59 np0005476733 podman[236457]: 2025-10-08 15:37:59.274356403 +0000 UTC m=+0.100117569 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:37:59 np0005476733 nova_compute[192580]: 2025-10-08 15:37:59.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:00 np0005476733 nova_compute[192580]: 2025-10-08 15:38:00.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:01 np0005476733 nova_compute[192580]: 2025-10-08 15:38:01.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:01 np0005476733 nova_compute[192580]: 2025-10-08 15:38:01.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:01 np0005476733 nova_compute[192580]: 2025-10-08 15:38:01.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:38:01 np0005476733 nova_compute[192580]: 2025-10-08 15:38:01.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:38:02 np0005476733 nova_compute[192580]: 2025-10-08 15:38:02.661 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:38:02 np0005476733 nova_compute[192580]: 2025-10-08 15:38:02.662 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:38:02 np0005476733 nova_compute[192580]: 2025-10-08 15:38:02.662 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:38:02 np0005476733 nova_compute[192580]: 2025-10-08 15:38:02.662 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid af5ca3d2-d7df-40e0-88f8-b90191a73698 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:38:04 np0005476733 nova_compute[192580]: 2025-10-08 15:38:04.938 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:38:04 np0005476733 nova_compute[192580]: 2025-10-08 15:38:04.956 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:38:04 np0005476733 nova_compute[192580]: 2025-10-08 15:38:04.956 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:38:05 np0005476733 nova_compute[192580]: 2025-10-08 15:38:05.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:05 np0005476733 nova_compute[192580]: 2025-10-08 15:38:05.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:05 np0005476733 nova_compute[192580]: 2025-10-08 15:38:05.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:09 np0005476733 podman[236518]: 2025-10-08 15:38:09.234227515 +0000 UTC m=+0.056558085 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:38:09 np0005476733 podman[236517]: 2025-10-08 15:38:09.239773504 +0000 UTC m=+0.066343811 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 11:38:10 np0005476733 nova_compute[192580]: 2025-10-08 15:38:10.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:10 np0005476733 nova_compute[192580]: 2025-10-08 15:38:10.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:15 np0005476733 nova_compute[192580]: 2025-10-08 15:38:15.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:15 np0005476733 nova_compute[192580]: 2025-10-08 15:38:15.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:18 np0005476733 podman[236563]: 2025-10-08 15:38:18.233508435 +0000 UTC m=+0.059507751 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:38:20 np0005476733 nova_compute[192580]: 2025-10-08 15:38:20.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:20 np0005476733 nova_compute[192580]: 2025-10-08 15:38:20.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:22Z|00497|pinctrl|WARN|Dropped 849 log messages in last 61 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 11:38:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:22Z|00498|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:38:22 np0005476733 podman[236583]: 2025-10-08 15:38:22.245455522 +0000 UTC m=+0.065971858 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  8 11:38:22 np0005476733 podman[236582]: 2025-10-08 15:38:22.281403501 +0000 UTC m=+0.103465628 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:38:25 np0005476733 nova_compute[192580]: 2025-10-08 15:38:25.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:25 np0005476733 nova_compute[192580]: 2025-10-08 15:38:25.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.259 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.260 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.297 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:26.324 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:26.325 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:26.326 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.368 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.369 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.387 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.388 2 INFO nova.compute.claims [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.505 2 DEBUG nova.compute.provider_tree [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.529 2 DEBUG nova.scheduler.client.report [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.553 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.554 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.603 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.604 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.632 2 INFO nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.650 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.741 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.742 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.742 2 INFO nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Creating image(s)#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.743 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.743 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.744 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.756 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.825 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.827 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.827 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.839 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.902 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.904 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.947 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk 10737418240" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.948 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:26 np0005476733 nova_compute[192580]: 2025-10-08 15:38:26.949 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.010 2 DEBUG nova.policy [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.015 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.016 2 DEBUG nova.objects.instance [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'migration_context' on Instance uuid cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.029 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.029 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Ensure instance console log exists: /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.030 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.031 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:27 np0005476733 nova_compute[192580]: 2025-10-08 15:38:27.031 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.170 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Successfully updated port: 674868d6-4804-4150-b307-9cd31e0509c3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.323 2 DEBUG nova.compute.manager [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-changed-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.324 2 DEBUG nova.compute.manager [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing instance network info cache due to event network-changed-674868d6-4804-4150-b307-9cd31e0509c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.324 2 DEBUG oslo_concurrency.lockutils [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.325 2 DEBUG oslo_concurrency.lockutils [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.325 2 DEBUG nova.network.neutron [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing network info cache for port 674868d6-4804-4150-b307-9cd31e0509c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.642 2 DEBUG nova.network.neutron [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.893 2 DEBUG nova.network.neutron [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:38:28 np0005476733 nova_compute[192580]: 2025-10-08 15:38:28.909 2 DEBUG oslo_concurrency.lockutils [req-3c1fd597-f6f2-476f-ab58-5ba08c7e99f7 req-17499c5a-928d-4163-aa2c-150c4fa74890 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:38:29 np0005476733 nova_compute[192580]: 2025-10-08 15:38:29.185 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Successfully updated port: fd3c26a3-b30e-4a03-9339-139f5a106536 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:38:29 np0005476733 nova_compute[192580]: 2025-10-08 15:38:29.197 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:38:29 np0005476733 nova_compute[192580]: 2025-10-08 15:38:29.197 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquired lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:38:29 np0005476733 nova_compute[192580]: 2025-10-08 15:38:29.198 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:38:29 np0005476733 nova_compute[192580]: 2025-10-08 15:38:29.350 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:38:30 np0005476733 nova_compute[192580]: 2025-10-08 15:38:30.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:30 np0005476733 nova_compute[192580]: 2025-10-08 15:38:30.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:30 np0005476733 podman[236640]: 2025-10-08 15:38:30.23602694 +0000 UTC m=+0.066288738 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:38:30 np0005476733 podman[236641]: 2025-10-08 15:38:30.237012752 +0000 UTC m=+0.060046078 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:38:30 np0005476733 podman[236642]: 2025-10-08 15:38:30.242720315 +0000 UTC m=+0.064007775 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct  8 11:38:30 np0005476733 nova_compute[192580]: 2025-10-08 15:38:30.422 2 DEBUG nova.compute.manager [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-changed-fd3c26a3-b30e-4a03-9339-139f5a106536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:30 np0005476733 nova_compute[192580]: 2025-10-08 15:38:30.422 2 DEBUG nova.compute.manager [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing instance network info cache due to event network-changed-fd3c26a3-b30e-4a03-9339-139f5a106536. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:38:30 np0005476733 nova_compute[192580]: 2025-10-08 15:38:30.422 2 DEBUG oslo_concurrency.lockutils [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.206 2 DEBUG nova.network.neutron [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updating instance_info_cache with network_info: [{"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.229 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Releasing lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.230 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance network_info: |[{"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.230 2 DEBUG oslo_concurrency.lockutils [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.231 2 DEBUG nova.network.neutron [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing network info cache for port fd3c26a3-b30e-4a03-9339-139f5a106536 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.238 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Start _get_guest_xml network_info=[{"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.246 2 WARNING nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.255 2 DEBUG nova.virt.libvirt.host [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.255 2 DEBUG nova.virt.libvirt.host [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.264 2 DEBUG nova.virt.libvirt.host [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.265 2 DEBUG nova.virt.libvirt.host [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.265 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.266 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.266 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.266 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.267 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.267 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.267 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.268 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.268 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.269 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.269 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.269 2 DEBUG nova.virt.hardware [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.275 2 DEBUG nova.virt.libvirt.vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:38:26Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.276 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.277 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.278 2 DEBUG nova.virt.libvirt.vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:38:26Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": 
{"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.278 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.279 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.280 2 DEBUG nova.objects.instance [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'pci_devices' on Instance uuid cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.297 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <uuid>cfe5bc3f-449a-4dd6-922b-4e750f8a94d1</uuid>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <name>instance-0000003f</name>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:name>server-tempest-MultiPortVlanTransparencyTest-2097740166-1</nova:name>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:38:31</nova:creationTime>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:user uuid="ec8fd4ab84244ebb88e5af7fcd3ce92b">tempest-MultiPortVlanTransparencyTest-198310335-project-member</nova:user>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:project uuid="357683d0efd54df8878ddcfaabe6d388">tempest-MultiPortVlanTransparencyTest-198310335</nova:project>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:port uuid="674868d6-4804-4150-b307-9cd31e0509c3">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        <nova:port uuid="fd3c26a3-b30e-4a03-9339-139f5a106536">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001:db8::35" ipVersion="6"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="serial">cfe5bc3f-449a-4dd6-922b-4e750f8a94d1</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="uuid">cfe5bc3f-449a-4dd6-922b-4e750f8a94d1</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.config"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:32:84:b7"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <target dev="tap674868d6-48"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:e5:76:46"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <target dev="tapfd3c26a3-b3"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/console.log" append="off"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:38:31 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:38:31 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:38:31 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:38:31 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.301 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Preparing to wait for external event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.301 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Preparing to wait for external event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.302 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.303 2 DEBUG nova.virt.libvirt.vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:38:26Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.304 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.305 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.305 2 DEBUG os_vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.306 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.306 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.311 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap674868d6-48, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.312 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap674868d6-48, col_values=(('external_ids', {'iface-id': '674868d6-4804-4150-b307-9cd31e0509c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:84:b7', 'vm-uuid': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 NetworkManager[51699]: <info>  [1759937911.3156] manager: (tap674868d6-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.326 2 INFO os_vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48')#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.327 2 DEBUG nova.virt.libvirt.vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:38:26Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.327 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.328 2 DEBUG nova.network.os_vif_util [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.328 2 DEBUG os_vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.329 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.330 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd3c26a3-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd3c26a3-b3, col_values=(('external_ids', {'iface-id': 'fd3c26a3-b30e-4a03-9339-139f5a106536', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:76:46', 'vm-uuid': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 NetworkManager[51699]: <info>  [1759937911.3354] manager: (tapfd3c26a3-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.344 2 INFO os_vif [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3')#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.398 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.399 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.399 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:32:84:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.400 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:e5:76:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.400 2 INFO nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Using config drive#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.924 2 INFO nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Creating config drive at /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.config#033[00m
Oct  8 11:38:31 np0005476733 nova_compute[192580]: 2025-10-08 15:38:31.930 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp25607iaj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.075 2 DEBUG oslo_concurrency.processutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp25607iaj" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:32 np0005476733 kernel: tap674868d6-48: entered promiscuous mode
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.1568] manager: (tap674868d6-48): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00499|binding|INFO|Claiming lport 674868d6-4804-4150-b307-9cd31e0509c3 for this chassis.
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00500|binding|INFO|674868d6-4804-4150-b307-9cd31e0509c3: Claiming fa:16:3e:32:84:b7 10.100.0.12
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.225 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:84:b7 10.100.0.12'], port_security=['fa:16:3e:32:84:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=674868d6-4804-4150-b307-9cd31e0509c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.226 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 674868d6-4804-4150-b307-9cd31e0509c3 in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac bound to our chassis#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.229 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac#033[00m
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00501|binding|INFO|Setting lport 674868d6-4804-4150-b307-9cd31e0509c3 ovn-installed in OVS
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00502|binding|INFO|Setting lport 674868d6-4804-4150-b307-9cd31e0509c3 up in Southbound
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.2377] manager: (tapfd3c26a3-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Oct  8 11:38:32 np0005476733 kernel: tapfd3c26a3-b3: entered promiscuous mode
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 systemd-udevd[236730]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.244 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[da213a8a-27d5-4d7b-ad14-e075b5762fa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.245 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2bf87bc3-31 in ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00503|binding|INFO|Claiming lport fd3c26a3-b30e-4a03-9339-139f5a106536 for this chassis.
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00504|binding|INFO|fd3c26a3-b30e-4a03-9339-139f5a106536: Claiming fa:16:3e:e5:76:46 2001:db8::35
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.247 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2bf87bc3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.247 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3090aff5-2788-42af-b1e0-84ba737fe8ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 systemd-udevd[236729]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.248 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b78e14e9-7fe0-4371-8586-2ab1fe544c9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.254 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:76:46 2001:db8::35'], port_security=['fa:16:3e:e5:76:46 2001:db8::35'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '2001:db8::35/64', 'neutron:device_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6b45266-4ada-4ad9-aceb-a504546d6cbe', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=988a4c86-cf44-4adf-b0dc-dd1646b531fc, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fd3c26a3-b30e-4a03-9339-139f5a106536) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.261 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[e2828359-4fb5-44ea-8c0b-2eb2faca457e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.2640] device (tapfd3c26a3-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.2649] device (tapfd3c26a3-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.2668] device (tap674868d6-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.2678] device (tap674868d6-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00505|binding|INFO|Setting lport fd3c26a3-b30e-4a03-9339-139f5a106536 ovn-installed in OVS
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00506|binding|INFO|Setting lport fd3c26a3-b30e-4a03-9339-139f5a106536 up in Southbound
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.291 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a614bf55-bcf5-4157-be3a-7c2ff2329b7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 systemd-machined[152624]: New machine qemu-36-instance-0000003f.
Oct  8 11:38:32 np0005476733 systemd[1]: Started Virtual Machine qemu-36-instance-0000003f.
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.323 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ef36a0-a9f3-4540-8b2a-52842704c4dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.329 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a92bbf1-b6e5-4ae6-8b2d-7ef6a8af0a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.3307] manager: (tap2bf87bc3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.369 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[63864312-a381-4a42-82f8-0fe577d31a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.372 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[64457ce9-f418-4602-ad08-091c1ab599a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.4046] device (tap2bf87bc3-30): carrier: link connected
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.414 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff113ce-ed62-46ac-b23f-1de9acf081cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d3841dfb-7e16-433d-b7a8-1fbb00261a02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481606, 'reachable_time': 34331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236765, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.452 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9e356aa4-4f78-49dc-a975-31f92b0f46b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:eeb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 481606, 'tstamp': 481606}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236766, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.468 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dad253ae-d04f-4a65-bceb-895712015d3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481606, 'reachable_time': 34331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236767, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.509 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f145b046-2dea-4bfb-a898-ad05ecb4b407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.532 2 DEBUG nova.compute.manager [req-437adde8-189e-4d8f-91c5-048e62b9ce6b req-17cea37b-726a-4863-8978-f64e590a8459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.533 2 DEBUG oslo_concurrency.lockutils [req-437adde8-189e-4d8f-91c5-048e62b9ce6b req-17cea37b-726a-4863-8978-f64e590a8459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.533 2 DEBUG oslo_concurrency.lockutils [req-437adde8-189e-4d8f-91c5-048e62b9ce6b req-17cea37b-726a-4863-8978-f64e590a8459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.534 2 DEBUG oslo_concurrency.lockutils [req-437adde8-189e-4d8f-91c5-048e62b9ce6b req-17cea37b-726a-4863-8978-f64e590a8459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.534 2 DEBUG nova.compute.manager [req-437adde8-189e-4d8f-91c5-048e62b9ce6b req-17cea37b-726a-4863-8978-f64e590a8459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Processing event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.593 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fed65f64-fcb8-49e0-a459-b612d3e97529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.596 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.597 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.598 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2bf87bc3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 NetworkManager[51699]: <info>  [1759937912.6023] manager: (tap2bf87bc3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Oct  8 11:38:32 np0005476733 kernel: tap2bf87bc3-30: entered promiscuous mode
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.610 2 DEBUG nova.network.neutron [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updated VIF entry in instance network info cache for port fd3c26a3-b30e-4a03-9339-139f5a106536. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.611 2 DEBUG nova.network.neutron [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updating instance_info_cache with network_info: [{"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.613 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2bf87bc3-30, col_values=(('external_ids', {'iface-id': '88d655c9-33da-4a0f-a7f9-84973702cdd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:32Z|00507|binding|INFO|Releasing lport 88d655c9-33da-4a0f-a7f9-84973702cdd7 from this chassis (sb_readonly=0)
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.634 2 DEBUG oslo_concurrency.lockutils [req-de575009-a7c1-4a90-9bad-729dc2ee3157 req-42f2e05e-effa-46b6-8eb7-47eb6af9d44c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:38:32 np0005476733 nova_compute[192580]: 2025-10-08 15:38:32.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.644 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.646 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4455aa-554a-4074-b480-46baabd17305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.647 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:38:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:32.649 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'env', 'PROCESS_TAG=haproxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:38:33 np0005476733 podman[236799]: 2025-10-08 15:38:33.071283959 +0000 UTC m=+0.060170891 container create 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:38:33 np0005476733 systemd[1]: Started libpod-conmon-2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596.scope.
Oct  8 11:38:33 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:38:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e62f6450988b3fdef84f7698f5bf8de742afe1c5697c6dbf9762ad7b6170c325/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:38:33 np0005476733 podman[236799]: 2025-10-08 15:38:33.041120066 +0000 UTC m=+0.030007008 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:38:33 np0005476733 podman[236799]: 2025-10-08 15:38:33.152823619 +0000 UTC m=+0.141710571 container init 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:38:33 np0005476733 podman[236799]: 2025-10-08 15:38:33.159946448 +0000 UTC m=+0.148833380 container start 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:38:33 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [NOTICE]   (236825) : New worker (236828) forked
Oct  8 11:38:33 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [NOTICE]   (236825) : Loading success.
Oct  8 11:38:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:33.234 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fd3c26a3-b30e-4a03-9339-139f5a106536 in datapath a6b45266-4ada-4ad9-aceb-a504546d6cbe unbound from our chassis#033[00m
Oct  8 11:38:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:33.236 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a6b45266-4ada-4ad9-aceb-a504546d6cbe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:38:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:38:33.237 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[004c3fd6-8833-4c19-b54f-4e93b121a7ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.624 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937913.623396, cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.625 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] VM Started (Lifecycle Event)#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.656 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.661 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937913.6250122, cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.662 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.682 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.688 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:38:33 np0005476733 nova_compute[192580]: 2025-10-08 15:38:33.713 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.645 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.647 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.648 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.648 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.649 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No event matching network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 in dict_keys([('network-vif-plugged', 'fd3c26a3-b30e-4a03-9339-139f5a106536')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.649 2 WARNING nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received unexpected event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.650 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.650 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.651 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.651 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.651 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Processing event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.652 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.652 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.652 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.653 2 DEBUG oslo_concurrency.lockutils [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.653 2 DEBUG nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No waiting events found dispatching network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.654 2 WARNING nova.compute.manager [req-c38dc330-8a7d-412d-ad32-f7c9426b2340 req-0430ebf1-6fb0-4a0c-865f-fe18ffa95bfb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received unexpected event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.655 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.660 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937914.6602345, cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.661 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.667 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.674 2 INFO nova.virt.libvirt.driver [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance spawned successfully.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.674 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.678 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.682 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.693 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.693 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.694 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.694 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.694 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.695 2 DEBUG nova.virt.libvirt.driver [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.702 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.762 2 INFO nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Took 8.02 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.763 2 DEBUG nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.835 2 INFO nova.compute.manager [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Took 8.50 seconds to build instance.#033[00m
Oct  8 11:38:34 np0005476733 nova_compute[192580]: 2025-10-08 15:38:34.856 2 DEBUG oslo_concurrency.lockutils [None req-76aecfa7-74a1-4452-8762-a52fb4e2e7f7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:35 np0005476733 nova_compute[192580]: 2025-10-08 15:38:35.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.010 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'name': 'tempest-test_dscp_marking_east_west-1191916229', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.013 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '357683d0efd54df8878ddcfaabe6d388', 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'hostId': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.013 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.030 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/memory.usage volume: 263.03515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.048 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.049 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance cfe5bc3f-449a-4dd6-922b-4e750f8a94d1: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75aa15be-c598-4dec-88fc-6550065450cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 263.03515625, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'timestamp': '2025-10-08T15:38:36.013449', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'da9f78ce-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.753387664, 'message_signature': '48e243f04e2e628fa73030e17ecbda4b5012ffe28fc21d14e66ff63ae6fea47e'}]}, 'timestamp': '2025-10-08 15:38:36.049514', '_unique_id': '59441fcf5a774bdf8e39f48ef35b5ebe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.054 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for af5ca3d2-d7df-40e0-88f8-b90191a73698 / tap38c374da-b5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.054 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.056 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 / tap674868d6-48 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.057 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 / tapfd3c26a3-b3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.057 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.057 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fddf0185-d9a1-4b3f-9bd1-4d07e8a16078', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.051780', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daa2f5b2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '9472d52ce8ea4f00764d763d2ea462a02d82380b354886a7f6a2035b844dbf15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.051780', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daa3695c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'c5345f4e75763c1f56eb4010f8f212cc1f71fd027e66b58fda9a40462184643e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.051780', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daa375dc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '7f4c1d99b84721384f47659dbdc8c3be5a5f3469c68cd6f9af48e9051b533e8a'}]}, 'timestamp': '2025-10-08 15:38:36.057887', '_unique_id': '5821dc0f8a66424e82fac1417eb6ab1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.058 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.059 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.060 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.060 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2156d8cc-5b06-4b4a-a2bf-fa6023ee30e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.059761', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daa3c9ce-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': 'a3bb92088aaa9e5a4c4a391508fb288c793a73da5cd9c898b840712637eb74e5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.059761', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daa3d48c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'baacd8830500f707d85ca255c002a5005d3c1a43e89f87df05e60febcfe68336'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.059761', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daa3dd7e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'e29a59296e043a8920820c324da2263bf53579cf6e65e7a5d1911848ac67ff00'}]}, 'timestamp': '2025-10-08 15:38:36.060526', '_unique_id': '407e435c69464911ac54465d5fae9bc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.061 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/cpu volume: 44030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/cpu volume: 1330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d00dcf4-e6c4-4d00-bbf3-e523ae46e448', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44030000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'timestamp': '2025-10-08T15:38:36.061882', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'daa41c30-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.753387664, 'message_signature': 'a0b1cb6270759d0a119218d0841785566f1bbb5ef8b4e88c1ca905f1227d6daa'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1330000000, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 
'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'timestamp': '2025-10-08T15:38:36.061882', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'daa425ea-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.771795768, 'message_signature': '647fe571a1293082deee65c074ae0809547f25421e93e7c6f6d75746b87410ea'}]}, 'timestamp': '2025-10-08 15:38:36.062353', '_unique_id': '54de5833b4f8412391c72dd0ea1e5b7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.063 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.outgoing.packets volume: 246 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.063 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '935da96c-d3b5-4488-b981-afdad0cc1795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 246, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.063606', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daa45f92-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '7bf053ab99891d0d9c386ed466922532cb4e58ff291f7e7144d3162ef5d05cb7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.063606', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daa467c6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'f8d2d48a5655ee1fc9ca4a2d97c426409a95a6ac9ae0081be261a37d470864f8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.063606', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daa4719e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '656423c7ea84197f3a77c2de7cd5d919bf60079d670e3c6dc5a499b687da2d3c'}]}, 'timestamp': '2025-10-08 15:38:36.064332', '_unique_id': '5e5b04bfebf247519f397b9c59288774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.064 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.065 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.065 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '977bdb68-111f-4e14-8496-082c76936ac8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.065666', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daa4b0fa-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '04b060b893b7630bd14f0b1dbc022b5490c15c4dcb6b57a3c023f9e2e526644d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.065666', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daa4b960-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '2be7686e2ff477e9186444d33cc3e6b54e3d3a3612cd22f23f2ed3c7227f92f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.065666', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daa4c572-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'acccdd2047a7e431c07bcb651605b093eaf7a2f2d558020068fa0ab250bfcff5'}]}, 'timestamp': '2025-10-08 15:38:36.066473', '_unique_id': 'c49593b35a21453e9b99db71475c99e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.066 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.093 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.latency volume: 3631460274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.094 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.114 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.latency volume: 12765511 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.115 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df97c04c-410d-4944-9c24-fefad3a067e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3631460274, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.067790', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daa90646-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '4c6d02abf167698158ef364642e1d86aa28441a6a70d66d46a0884e05fd5db96'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.067790', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daa9144c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '74f976b94b1dd3a877fab668379f424191464b8bd8697cd970a8f3012565dee2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12765511, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.067790', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daac3e56-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '4738a9b59e63817f83356dffcce42385f2b9363baee9d1857258db218073324c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.067790', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daac4d38-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '1c615bfc54e223418666e334e370841280da6a8a4b0c5df2dc860f0b15325ea0'}]}, 'timestamp': '2025-10-08 15:38:36.115847', 
'_unique_id': 'a48b8e1b7fce45f29de5950853dacf3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.118 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.118 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>]
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.118 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.bytes volume: 136168448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.119 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.119 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.119 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f21e7cbe-b43f-4dc5-82ef-923f8071ff39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136168448, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.118809', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daaccd6c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': 'e2ade09e12d5a8c378730872c1f363a831c59984de3e81521bf261b0c705b796'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.118809', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daacd9ec-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '2a02b6072c38216266f788d921c2eb1871a852db4ff753f5f37e3f24870984a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.118809', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daace446-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': 'ddd711bbae648071f8eb2dc35d83d2045cba3a7cc7713d732e54e1d844f10bc8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.118809', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daacee50-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': 'b5cf6452c4470f8054bbef1d48826d888e928a0c91100bffcd5bd2c12af043b1'}]}, 'timestamp': '2025-10-08 15:38:36.119946', 
'_unique_id': 'd11911a4e956431aa815e36ae9aea8fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.121 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.incoming.packets volume: 226 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.121 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.122 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dae7f237-3a12-47c1-a5f0-bd3ba68071ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 226, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.121577', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daad3978-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': 'f15ed23603b7056fb2aa7d95ded4df020c1226234cf0016fcb3a88f380b488c0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 1, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.121577', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daad447c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '601d03e9befecb4ff16d349df9c0ed75b2e2a2035613dbae2116644577ce3d03'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.121577', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daad502a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'f9d5e87424707f2f1606b07dfc1430d0adabddb839761e01fa16bba7d3b14272'}]}, 'timestamp': '2025-10-08 15:38:36.122459', '_unique_id': '359a3fad33184bb48c910a022fb26a04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.123 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>]
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.124 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.latency volume: 7590501359 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.124 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.latency volume: 46692068 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.124 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.latency volume: 92452128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.125 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.latency volume: 668191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbcfc9f5-9f74-4eb8-93df-2b896d4ab64a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7590501359, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.124366', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daada62e-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': 'c89e49bf0a8102a6f5f377c42bd4edb9eca26caaa98824f020ca2c1aa87a4aea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46692068, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.124366', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daadb0b0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '71e39e259beba7febf49b47fde6c1d3bd09b958695a9539ca70a34d8ceea684c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 92452128, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.124366', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daadbace-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '94c1d4b8dd5b753d43202977ee37ac16a20ff362df564b10444eb3a77c74f0a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 668191, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.124366', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daadc5dc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '4bc59e7aac660898618441928f14f356d7fe0714bb6fd79860f6b95a9c33c4d6'}]}, 'timestamp': '2025-10-08 
15:38:36.125462', '_unique_id': '512f2b6a7df04d91b6296e873756225e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.127 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.bytes volume: 330950144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.127 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.127 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.bytes volume: 2920960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.127 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7032bded-5ba0-47c0-9c7e-8702e318ad83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330950144, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.127127', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daae123a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '4d7aee0a7980fb3718e3c696a40c6d1dea01b4969e101f5b577010738194c14e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.127127', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daae1e06-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': 'a1e51ee0cdf184576e8ef711fafe12d9372a6c1f1616c699942bb9ddd3cf14a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2920960, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.127127', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 
'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daae2824-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '5031e33936f1f4ce13e813a1884906192c6fe992000066d9064a071ed2ee6c1a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.127127', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daae32f6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '50a6613712b9572a31e5c6b70e44e03c91da585c7d5604e964690ea4b37d1e0a'}]}, 'timestamp': '2025-10-08 15:38:36.128256', 
'_unique_id': '5997e06975ac47b695a452cd37e63777'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.129 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.130 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.130 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d46cfd-1e35-40cf-a367-be7a339b297d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.129841', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'daae7c70-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': 'b005003b9b0a267c4dfdc2d1fa22089f153d9db132f1e5482ba07db3e2a6ffe7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.129841', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'daae88c8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'a7773398ca36f7a933d5c825e78dadc2fbb80937c483546337424191b8713e92'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.129841', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 
'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'daae93a4-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'a777a49b49aabbcb69fd29c6a492d02d41a74e65bf565eeda8c5ea96f8962cb8'}]}, 'timestamp': '2025-10-08 15:38:36.130741', '_unique_id': '6a74ca2173f947908a9e409d93713bda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.132 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.requests volume: 749 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.132 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.132 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.133 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8ae97bb-e15a-434b-b161-332f6032afb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 749, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.132291', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daaedbb6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '3ef983563db0008d6deeef08ff2dbd618e5ddfc503a84b4201034f277e10973e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.132291', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daaee638-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '6f7c9b99fdd80537071e5eac2c8e2059ddaedb0cb69b12f8721bad6ead05b53f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.132291', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'daaef060-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': 'b841a573a274fe0e305ed2bc89b47c0f178f95926c0bf9fd9a740ad87cf6156e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.132291', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'daaefb3c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': 'dbc384fccaf848e89eb8fa9ffb775a82dc213830213ab8c61cb1ae032bebe615'}]}, 'timestamp': '2025-10-08 
15:38:36.133382', '_unique_id': '069273bfc46749d6af7710b52f131699'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:38:36 np0005476733 nova_compute[192580]: 2025-10-08 15:38:36.139 2 INFO nova.compute.manager [None req-ac57c02d-8c76-44f9-8b25-b25e6ba1cc68 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output
Oct  8 11:38:36 np0005476733 nova_compute[192580]: 2025-10-08 15:38:36.154 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.162 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.163 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.175 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.176 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc5ff8e6-7590-4496-81d9-6de5a2acdb43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.134857', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab399c6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': 'f44209cd8d3d7f82200577492ec4372862777d5c3b98f5313e42a0064c6d2973'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.134857', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab3a858-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': '2f546fb68cf0ec10c0f6600a022cc95912fe8df47ad634a924406311791b0bf8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.134857', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab586dc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': '128b05d7e41affd0ad5ac8230f9ef5fdfef91bda716b18a0437a0fe5efe2912b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.134857', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab59442-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': '9cb6d34479729937c90d8283b333562babaadd88391fe5feacaad098f592b7f5'}]}, 'timestamp': '2025-10-08 15:38:36.176650', '_unique_id': 
'3028af4c5c7e4750896044c96b92771f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.177 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.179 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.179 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>]
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.179 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.outgoing.bytes volume: 52563 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.179 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.180 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4b4e1ec-1935-471d-9553-fbfefd8b84ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 52563, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.179472', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'dab60f3a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '7b020e95e814a2a4780f4cb491c3055bc30a8d7aaf0bf5ec2f5c9f54353d4dbd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.179472', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'dab61a7a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '18eebab4cc50042ae74a1fdd75271db40e296def2396d1c8497e3bb56b0ecdaa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.179472', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 
'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'dab6263c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '19a60f9798dae92772ead14aa70c137f1f09bd5ec4d3ebcee955e5d2219dcf23'}]}, 'timestamp': '2025-10-08 15:38:36.180371', '_unique_id': '25a3739464604116a165126cf63897d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.181 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.incoming.bytes volume: 38471 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.182 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.182 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '159bd9a8-122b-4d76-a65b-2c4d44dc85ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 38471, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.181923', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'dab66ef8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '071fed4ea2cb03052965e593ab29e4cb26e8964920de00a299037b81306d0085'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 110, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.181923', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'dab67af6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '78335c870fb122a63be78db5c67d69f5d71dbbaa318791196a9fa6e9b7ac1c5c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.181923', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 
'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'dab685a0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'ef6619d18856ea0f1f48101f553bef71ff981dc383460fe6478207109f98dc66'}]}, 'timestamp': '2025-10-08 15:38:36.182809', '_unique_id': 'ae30a805f2664ce5a0b96cf8dff8145f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.184 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.usage volume: 152371200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.184 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.184 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.185 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e850e3fe-4397-444c-8f2b-d54ec1c642a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152371200, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.184323', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab6cc2c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': 'b4fdeea43da88ccf289c1c2e3846eb3378567bda29af7f217e8dfddff8d01d55'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.184323', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab6d6c2-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': 'c718f093da723a1fdf8a239524a28d3309e5cd8f211e9ec3ba76aabf2c2f5ca2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.184323', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab6e248-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': '3f2fcc258ddb2ec2214ec8c289e8470f5a48c45e2ea3445958f66d370c2737e2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.184323', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab6ed60-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': '886b941cc31f6c84351d58af830d2506bc98527278b971524af73af656fe7f54'}]}, 'timestamp': '2025-10-08 15:38:36.185453', '_unique_id': '06b93d53daef41b4b27a16630129499f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_east_west-1191916229>, <NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-1>]
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.allocation volume: 152571904 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.187 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.188 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c49239fc-7514-4a14-922b-924fa68749cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152571904, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.187407', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab74530-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': '420b5299550941bae5b7b408f4d3b3272e44ac46d9847d904286e205adb22c82'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.187407', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab750ac-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.857861854, 'message_signature': 'b04b2f5349885de2eb1e71d3f4f3fd5e39363cddcae2a4bdba356a48fff48f0c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.187407', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab75c14-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': 'a4a01a591036314d0d2ccdcd93b94198228827f17a31c66f333e917634920ee7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.187407', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab76678-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.887046394, 'message_signature': '4b7563513639ad748edbee159655ab03f8f25dc0bd83c6b5ca3366c2ad858895'}]}, 'timestamp': '2025-10-08 15:38:36.188562', '_unique_id': '9fb6764cc7944a59aa0cc0cc3e254132'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.190 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.requests volume: 11690 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.190 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.190 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.requests volume: 181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.191 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '990afd48-baac-4a33-85c5-623a619d7746', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11690, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-vda', 'timestamp': '2025-10-08T15:38:36.190354', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab7b7d6-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '03676c37170288def4fd86b49a40ac9a5c4940c1e2ff82646bc70ddf52e47041'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698-sda', 'timestamp': '2025-10-08T15:38:36.190354', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'instance-0000003d', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab7c23a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.79077481, 'message_signature': '942ef7752d3255b2b3778cca38e2b852577750627a32c0739c4e744c4931ca68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 181, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-vda', 'timestamp': '2025-10-08T15:38:36.190354', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 
1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'dab7cc6c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '4c77ae15e9529e70072a0fde0d9733af236276aa95be9e2ebeb8592a7ac03053'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-sda', 'timestamp': '2025-10-08T15:38:36.190354', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'instance-0000003f', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'dab7d86a-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.817682698, 'message_signature': '3cdf9c315c44af113c9ffc786855fa9b028b7cc2486105510ff041d5a7356d7c'}]}, 'timestamp': '2025-10-08 
15:38:36.191471', '_unique_id': '3ef09951e15e4c9f8bd2c1cf107f87ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.192 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.193 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.193 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cca0c39b-b89e-493b-9019-20749cffcfb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.192946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'dab81e9c-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '23846aae83b54bf696050517b6da447ad80dc2106a9443d4a4da48fa4be547ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.192946', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'dab82a18-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '65efa1b217c6fb1a1c47dab8eacfe19bf1d2d01659b14c7d29598002e2020d22'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.192946', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 
'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'dab834b8-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'f904d054fae570afbf3158ff7df3df471eb29d6f2a0282ccc31b0a52dde77a6e'}]}, 'timestamp': '2025-10-08 15:38:36.193846', '_unique_id': '0adc3726fd984634b91a5b484abb2268'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.194 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.195 12 DEBUG ceilometer.compute.pollsters [-] af5ca3d2-d7df-40e0-88f8-b90191a73698/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.195 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 DEBUG ceilometer.compute.pollsters [-] cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1179c21d-06bd-497c-8a8e-1a29c253448c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-0000003d-af5ca3d2-d7df-40e0-88f8-b90191a73698-tap38c374da-b5', 'timestamp': '2025-10-08T15:38:36.195530', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_east_west-1191916229', 'name': 'tap38c374da-b5', 'instance_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c4:ae:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap38c374da-b5'}, 'message_id': 'dab881fc-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.774750373, 'message_signature': '954f8d3bce22ac714047e4bfdb3c3543a7613b636268b1da2511e9bb725540ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tap674868d6-48', 'timestamp': '2025-10-08T15:38:36.195530', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tap674868d6-48', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:32:84:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap674868d6-48'}, 'message_id': 'dab88cb0-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': 'ded5368e41ceb116ff390a4b15546313a7514cdead8caac85ad33ad0b4dfb201'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-0000003f-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-tapfd3c26a3-b3', 'timestamp': '2025-10-08T15:38:36.195530', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'name': 'tapfd3c26a3-b3', 'instance_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 
'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e5:76:46', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfd3c26a3-b3'}, 'message_id': 'dab89836-a45c-11f0-9274-fa163ef67048', 'monotonic_time': 4819.777564774, 'message_signature': '33c8fd79804249a3fd14dad39f34f48b3ed7184c6095b8d22becf47fd357f3f2'}]}, 'timestamp': '2025-10-08 15:38:36.196393', '_unique_id': '696f2fca9cd349d18b0d51a13b5b9add'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:38:36.196 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:38:36 np0005476733 nova_compute[192580]: 2025-10-08 15:38:36.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:40 np0005476733 nova_compute[192580]: 2025-10-08 15:38:40.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:40 np0005476733 podman[236838]: 2025-10-08 15:38:40.257156126 +0000 UTC m=+0.069181991 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:38:40 np0005476733 podman[236837]: 2025-10-08 15:38:40.258166759 +0000 UTC m=+0.071849967 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:38:41 np0005476733 nova_compute[192580]: 2025-10-08 15:38:41.318 2 INFO nova.compute.manager [None req-4be97949-1b29-43eb-8d60-2de79c47249a ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:38:41 np0005476733 nova_compute[192580]: 2025-10-08 15:38:41.324 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:38:41 np0005476733 nova_compute[192580]: 2025-10-08 15:38:41.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:45 np0005476733 nova_compute[192580]: 2025-10-08 15:38:45.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:46 np0005476733 nova_compute[192580]: 2025-10-08 15:38:46.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:46 np0005476733 nova_compute[192580]: 2025-10-08 15:38:46.458 2 INFO nova.compute.manager [None req-d4aa1cfd-b64b-4550-9662-343bb8af6cd7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:38:46 np0005476733 nova_compute[192580]: 2025-10-08 15:38:46.467 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:38:47 np0005476733 nova_compute[192580]: 2025-10-08 15:38:47.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:49 np0005476733 podman[236880]: 2025-10-08 15:38:49.247563051 +0000 UTC m=+0.072546470 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:38:50 np0005476733 nova_compute[192580]: 2025-10-08 15:38:50.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:51 np0005476733 nova_compute[192580]: 2025-10-08 15:38:51.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:51 np0005476733 nova_compute[192580]: 2025-10-08 15:38:51.825 2 INFO nova.compute.manager [None req-d382b536-b4ca-4b09-82e9-008ea59233d5 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:38:51 np0005476733 nova_compute[192580]: 2025-10-08 15:38:51.830 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:38:52 np0005476733 nova_compute[192580]: 2025-10-08 15:38:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:52 np0005476733 nova_compute[192580]: 2025-10-08 15:38:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:53 np0005476733 podman[236907]: 2025-10-08 15:38:53.262491585 +0000 UTC m=+0.078509723 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Oct  8 11:38:53 np0005476733 podman[236906]: 2025-10-08 15:38:53.303222928 +0000 UTC m=+0.121268631 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.695 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.774 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.776 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.838 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.850 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.916 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.917 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:38:53 np0005476733 nova_compute[192580]: 2025-10-08 15:38:53.973 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.178 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.180 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12527MB free_disk=111.19442367553711GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.180 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.181 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.281 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance af5ca3d2-d7df-40e0-88f8-b90191a73698 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.281 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.281 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.282 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.349 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.364 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.384 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:38:54 np0005476733 nova_compute[192580]: 2025-10-08 15:38:54.385 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:38:55 np0005476733 nova_compute[192580]: 2025-10-08 15:38:55.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:56 np0005476733 nova_compute[192580]: 2025-10-08 15:38:56.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:38:56 np0005476733 nova_compute[192580]: 2025-10-08 15:38:56.993 2 INFO nova.compute.manager [None req-5e9037f0-b12d-4a76-a29d-461d3e5fceef ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:38:56 np0005476733 nova_compute[192580]: 2025-10-08 15:38:56.997 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:38:58 np0005476733 nova_compute[192580]: 2025-10-08 15:38:58.376 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:58 np0005476733 nova_compute[192580]: 2025-10-08 15:38:58.377 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:58 np0005476733 nova_compute[192580]: 2025-10-08 15:38:58.377 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:38:59 np0005476733 nova_compute[192580]: 2025-10-08 15:38:59.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:38:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:59Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:84:b7 10.100.0.12
Oct  8 11:38:59 np0005476733 ovn_controller[94857]: 2025-10-08T15:38:59Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:84:b7 10.100.0.12
Oct  8 11:39:00 np0005476733 nova_compute[192580]: 2025-10-08 15:39:00.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:01 np0005476733 podman[236964]: 2025-10-08 15:39:01.245734238 +0000 UTC m=+0.063992785 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:39:01 np0005476733 podman[236963]: 2025-10-08 15:39:01.253305313 +0000 UTC m=+0.076507849 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:39:01 np0005476733 podman[236965]: 2025-10-08 15:39:01.261550079 +0000 UTC m=+0.075374312 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.791 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.791 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.791 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:39:01 np0005476733 nova_compute[192580]: 2025-10-08 15:39:01.792 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid af5ca3d2-d7df-40e0-88f8-b90191a73698 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:39:02 np0005476733 nova_compute[192580]: 2025-10-08 15:39:02.153 2 INFO nova.compute.manager [None req-a381faae-bdbb-45dd-942b-fda2124e4644 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:39:02 np0005476733 nova_compute[192580]: 2025-10-08 15:39:02.159 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:39:02 np0005476733 nova_compute[192580]: 2025-10-08 15:39:02.982 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [{"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:03 np0005476733 nova_compute[192580]: 2025-10-08 15:39:03.002 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-af5ca3d2-d7df-40e0-88f8-b90191a73698" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:39:03 np0005476733 nova_compute[192580]: 2025-10-08 15:39:03.003 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:39:05 np0005476733 nova_compute[192580]: 2025-10-08 15:39:05.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:06 np0005476733 nova_compute[192580]: 2025-10-08 15:39:06.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:06 np0005476733 nova_compute[192580]: 2025-10-08 15:39:06.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:07 np0005476733 nova_compute[192580]: 2025-10-08 15:39:07.346 2 INFO nova.compute.manager [None req-55f8c4b6-3fd7-411b-bbd1-3fcefb9e9e58 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:39:07 np0005476733 nova_compute[192580]: 2025-10-08 15:39:07.351 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:39:07 np0005476733 nova_compute[192580]: 2025-10-08 15:39:07.355 2 INFO nova.virt.libvirt.driver [None req-55f8c4b6-3fd7-411b-bbd1-3fcefb9e9e58 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Truncated console log returned, 4475 bytes ignored#033[00m
Oct  8 11:39:10 np0005476733 nova_compute[192580]: 2025-10-08 15:39:10.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:11 np0005476733 podman[237042]: 2025-10-08 15:39:11.243840573 +0000 UTC m=+0.060616606 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:39:11 np0005476733 podman[237041]: 2025-10-08 15:39:11.273411956 +0000 UTC m=+0.089219238 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:39:11 np0005476733 nova_compute[192580]: 2025-10-08 15:39:11.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:12 np0005476733 nova_compute[192580]: 2025-10-08 15:39:12.498 2 INFO nova.compute.manager [None req-10752943-012e-40ea-8113-a245ceb5ab18 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Get console output#033[00m
Oct  8 11:39:12 np0005476733 nova_compute[192580]: 2025-10-08 15:39:12.504 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:39:12 np0005476733 nova_compute[192580]: 2025-10-08 15:39:12.510 2 INFO nova.virt.libvirt.driver [None req-10752943-012e-40ea-8113-a245ceb5ab18 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Truncated console log returned, 4695 bytes ignored#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:13.064 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:13.068 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.993 2 DEBUG nova.compute.manager [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-changed-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.995 2 DEBUG nova.compute.manager [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing instance network info cache due to event network-changed-674868d6-4804-4150-b307-9cd31e0509c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.996 2 DEBUG oslo_concurrency.lockutils [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.997 2 DEBUG oslo_concurrency.lockutils [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:39:13 np0005476733 nova_compute[192580]: 2025-10-08 15:39:13.997 2 DEBUG nova.network.neutron [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Refreshing network info cache for port 674868d6-4804-4150-b307-9cd31e0509c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:39:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:15.071 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:15 np0005476733 nova_compute[192580]: 2025-10-08 15:39:15.091 2 DEBUG nova.network.neutron [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updated VIF entry in instance network info cache for port 674868d6-4804-4150-b307-9cd31e0509c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:39:15 np0005476733 nova_compute[192580]: 2025-10-08 15:39:15.092 2 DEBUG nova.network.neutron [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updating instance_info_cache with network_info: [{"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:15 np0005476733 nova_compute[192580]: 2025-10-08 15:39:15.125 2 DEBUG oslo_concurrency.lockutils [req-d9ae6d3f-36dd-423c-a8a1-c52a5b6a696d req-6f3466c1-cf93-4338-929b-a002b22b1935 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:39:15 np0005476733 nova_compute[192580]: 2025-10-08 15:39:15.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:16 np0005476733 nova_compute[192580]: 2025-10-08 15:39:16.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 podman[237082]: 2025-10-08 15:39:20.239805257 +0000 UTC m=+0.063798339 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.410 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.411 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.411 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.411 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.412 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.413 2 INFO nova.compute.manager [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Terminating instance#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.415 2 DEBUG nova.compute.manager [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:39:20 np0005476733 kernel: tap674868d6-48 (unregistering): left promiscuous mode
Oct  8 11:39:20 np0005476733 NetworkManager[51699]: <info>  [1759937960.4429] device (tap674868d6-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00508|binding|INFO|Releasing lport 674868d6-4804-4150-b307-9cd31e0509c3 from this chassis (sb_readonly=0)
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00509|binding|INFO|Setting lport 674868d6-4804-4150-b307-9cd31e0509c3 down in Southbound
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00510|pinctrl|WARN|Dropped 1399 log messages in last 59 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00511|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00512|binding|INFO|Removing iface tap674868d6-48 ovn-installed in OVS
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.474 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:84:b7 10.100.0.12'], port_security=['fa:16:3e:32:84:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=674868d6-4804-4150-b307-9cd31e0509c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.477 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 674868d6-4804-4150-b307-9cd31e0509c3 in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac unbound from our chassis#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.480 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:39:20 np0005476733 kernel: tapfd3c26a3-b3 (unregistering): left promiscuous mode
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.483 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[74198561-61ec-402f-b679-0a73e340b6d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.485 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace which is not needed anymore#033[00m
Oct  8 11:39:20 np0005476733 NetworkManager[51699]: <info>  [1759937960.4884] device (tapfd3c26a3-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00513|binding|INFO|Releasing lport fd3c26a3-b30e-4a03-9339-139f5a106536 from this chassis (sb_readonly=0)
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00514|binding|INFO|Setting lport fd3c26a3-b30e-4a03-9339-139f5a106536 down in Southbound
Oct  8 11:39:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:20Z|00515|binding|INFO|Removing iface tapfd3c26a3-b3 ovn-installed in OVS
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.530 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:76:46 2001:db8::35'], port_security=['fa:16:3e:e5:76:46 2001:db8::35'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '2001:db8::35/64', 'neutron:device_id': 'cfe5bc3f-449a-4dd6-922b-4e750f8a94d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6b45266-4ada-4ad9-aceb-a504546d6cbe', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=988a4c86-cf44-4adf-b0dc-dd1646b531fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fd3c26a3-b30e-4a03-9339-139f5a106536) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:20 np0005476733 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Oct  8 11:39:20 np0005476733 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000003f.scope: Consumed 47.439s CPU time.
Oct  8 11:39:20 np0005476733 systemd-machined[152624]: Machine qemu-36-instance-0000003f terminated.
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [NOTICE]   (236825) : haproxy version is 2.8.14-c23fe91
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [NOTICE]   (236825) : path to executable is /usr/sbin/haproxy
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [WARNING]  (236825) : Exiting Master process...
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [WARNING]  (236825) : Exiting Master process...
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [ALERT]    (236825) : Current worker (236828) exited with code 143 (Terminated)
Oct  8 11:39:20 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[236818]: [WARNING]  (236825) : All workers exited. Exiting... (0)
Oct  8 11:39:20 np0005476733 systemd[1]: libpod-2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596.scope: Deactivated successfully.
Oct  8 11:39:20 np0005476733 podman[237129]: 2025-10-08 15:39:20.625375501 +0000 UTC m=+0.045424016 container died 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:39:20 np0005476733 NetworkManager[51699]: <info>  [1759937960.6470] manager: (tapfd3c26a3-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Oct  8 11:39:20 np0005476733 systemd[1]: var-lib-containers-storage-overlay-e62f6450988b3fdef84f7698f5bf8de742afe1c5697c6dbf9762ad7b6170c325-merged.mount: Deactivated successfully.
Oct  8 11:39:20 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596-userdata-shm.mount: Deactivated successfully.
Oct  8 11:39:20 np0005476733 podman[237129]: 2025-10-08 15:39:20.676875143 +0000 UTC m=+0.096923648 container cleanup 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.698 2 INFO nova.virt.libvirt.driver [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Instance destroyed successfully.#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.699 2 DEBUG nova.objects.instance [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'resources' on Instance uuid cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:39:20 np0005476733 systemd[1]: libpod-conmon-2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596.scope: Deactivated successfully.
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.721 2 DEBUG nova.virt.libvirt.vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:38:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:38:34Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": 
"tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.722 2 DEBUG nova.network.os_vif_util [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "674868d6-4804-4150-b307-9cd31e0509c3", "address": "fa:16:3e:32:84:b7", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap674868d6-48", "ovs_interfaceid": "674868d6-4804-4150-b307-9cd31e0509c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.722 2 DEBUG nova.network.os_vif_util [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.722 2 DEBUG os_vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.724 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap674868d6-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.733 2 INFO os_vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:84:b7,bridge_name='br-int',has_traffic_filtering=True,id=674868d6-4804-4150-b307-9cd31e0509c3,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap674868d6-48')#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.733 2 DEBUG nova.virt.libvirt.vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=63,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:38:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-86y0r1pj',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:38:34Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=cfe5bc3f-449a-4dd6-922b-4e750f8a94d1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": 
"second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.734 2 DEBUG nova.network.os_vif_util [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "fd3c26a3-b30e-4a03-9339-139f5a106536", "address": "fa:16:3e:e5:76:46", "network": {"id": "a6b45266-4ada-4ad9-aceb-a504546d6cbe", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::35", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd3c26a3-b3", "ovs_interfaceid": "fd3c26a3-b30e-4a03-9339-139f5a106536", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.734 2 DEBUG nova.network.os_vif_util [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.734 2 DEBUG os_vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.736 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd3c26a3-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.741 2 INFO os_vif [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:76:46,bridge_name='br-int',has_traffic_filtering=True,id=fd3c26a3-b30e-4a03-9339-139f5a106536,network=Network(a6b45266-4ada-4ad9-aceb-a504546d6cbe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfd3c26a3-b3')#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.742 2 INFO nova.virt.libvirt.driver [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Deleting instance files /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1_del#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.743 2 INFO nova.virt.libvirt.driver [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Deletion of /var/lib/nova/instances/cfe5bc3f-449a-4dd6-922b-4e750f8a94d1_del complete#033[00m
Oct  8 11:39:20 np0005476733 podman[237188]: 2025-10-08 15:39:20.745149754 +0000 UTC m=+0.042988838 container remove 2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.750 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d31475-4abb-46a8-bc33-46f74b5bf869]: (4, ('Wed Oct  8 03:39:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596)\n2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596\nWed Oct  8 03:39:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596)\n2ca71e268b40aeb8ef0c1e4df444edb7f5bff6dffabf66b352e0eb2c103df596\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.752 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6062e71b-b476-46b0-9493-47efff593197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.753 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:20 np0005476733 kernel: tap2bf87bc3-30: left promiscuous mode
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.770 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[34a4e171-d33c-4a81-ad2b-dd9df1158d77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.799 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0043b6-feac-43d3-9fa4-a1112cf0562e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.801 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf96c5f-2611-484c-9907-0259d568fc9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.807 2 INFO nova.compute.manager [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.808 2 DEBUG oslo.service.loopingcall [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.809 2 DEBUG nova.compute.manager [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:39:20 np0005476733 nova_compute[192580]: 2025-10-08 15:39:20.809 2 DEBUG nova.network.neutron [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.828 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa64474-4636-472d-9b4d-150e75651f00]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 481598, 'reachable_time': 16668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237205, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 systemd[1]: run-netns-ovnmeta\x2d2bf87bc3\x2d3d0a\x2d4d8a\x2db41e\x2d00010e6b47ac.mount: Deactivated successfully.
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.833 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.834 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0146fb-cf8a-4258-a895-56854cb85b72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.835 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fd3c26a3-b30e-4a03-9339-139f5a106536 in datapath a6b45266-4ada-4ad9-aceb-a504546d6cbe unbound from our chassis#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.837 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a6b45266-4ada-4ad9-aceb-a504546d6cbe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:39:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:20.838 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[125cb843-0203-408d-bf1b-efbccc15be54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.024 2 DEBUG nova.compute.manager [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-unplugged-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.024 2 DEBUG oslo_concurrency.lockutils [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.025 2 DEBUG oslo_concurrency.lockutils [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.025 2 DEBUG oslo_concurrency.lockutils [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.025 2 DEBUG nova.compute.manager [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No waiting events found dispatching network-vif-unplugged-674868d6-4804-4150-b307-9cd31e0509c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:21 np0005476733 nova_compute[192580]: 2025-10-08 15:39:21.025 2 DEBUG nova.compute.manager [req-bd3da276-0ed4-4716-b210-a4ce336e51e8 req-3cad9d69-a177-439b-945d-ec59a40685fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-unplugged-674868d6-4804-4150-b307-9cd31e0509c3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.411 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.412 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.413 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.413 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.414 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No waiting events found dispatching network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.414 2 WARNING nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received unexpected event network-vif-plugged-674868d6-4804-4150-b307-9cd31e0509c3 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.415 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-unplugged-fd3c26a3-b30e-4a03-9339-139f5a106536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.415 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.416 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.416 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.416 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No waiting events found dispatching network-vif-unplugged-fd3c26a3-b30e-4a03-9339-139f5a106536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.417 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-unplugged-fd3c26a3-b30e-4a03-9339-139f5a106536 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.417 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.418 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.418 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.418 2 DEBUG oslo_concurrency.lockutils [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.419 2 DEBUG nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] No waiting events found dispatching network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.419 2 WARNING nova.compute.manager [req-eaa0bc9f-d3a2-41c3-a2e2-506655a1c627 req-9535cc8a-98d3-411b-bb61-f937d5c07213 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Received unexpected event network-vif-plugged-fd3c26a3-b30e-4a03-9339-139f5a106536 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.666 2 DEBUG nova.network.neutron [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.845 2 INFO nova.compute.manager [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Took 3.04 seconds to deallocate network for instance.#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.940 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:23 np0005476733 nova_compute[192580]: 2025-10-08 15:39:23.940 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:24 np0005476733 nova_compute[192580]: 2025-10-08 15:39:24.025 2 DEBUG nova.compute.provider_tree [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:39:24 np0005476733 nova_compute[192580]: 2025-10-08 15:39:24.051 2 DEBUG nova.scheduler.client.report [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:39:24 np0005476733 nova_compute[192580]: 2025-10-08 15:39:24.073 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:24 np0005476733 nova_compute[192580]: 2025-10-08 15:39:24.116 2 INFO nova.scheduler.client.report [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Deleted allocations for instance cfe5bc3f-449a-4dd6-922b-4e750f8a94d1#033[00m
Oct  8 11:39:24 np0005476733 nova_compute[192580]: 2025-10-08 15:39:24.194 2 DEBUG oslo_concurrency.lockutils [None req-935457e6-f01c-4a15-93b1-36713b3f8768 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "cfe5bc3f-449a-4dd6-922b-4e750f8a94d1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:24 np0005476733 podman[237213]: 2025-10-08 15:39:24.276783502 +0000 UTC m=+0.086812271 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251001)
Oct  8 11:39:24 np0005476733 podman[237212]: 2025-10-08 15:39:24.281737842 +0000 UTC m=+0.102425845 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_id=ovn_controller)
Oct  8 11:39:25 np0005476733 nova_compute[192580]: 2025-10-08 15:39:25.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:25 np0005476733 nova_compute[192580]: 2025-10-08 15:39:25.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:26.325 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:26.325 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:26.326 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:30 np0005476733 nova_compute[192580]: 2025-10-08 15:39:30.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:30 np0005476733 nova_compute[192580]: 2025-10-08 15:39:30.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 podman[237260]: 2025-10-08 15:39:32.230589077 +0000 UTC m=+0.055016375 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:39:32 np0005476733 podman[237261]: 2025-10-08 15:39:32.237536081 +0000 UTC m=+0.060749710 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct  8 11:39:32 np0005476733 podman[237259]: 2025-10-08 15:39:32.257373341 +0000 UTC m=+0.087524264 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.415 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.415 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.416 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.416 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.416 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.417 2 INFO nova.compute.manager [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Terminating instance#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.418 2 DEBUG nova.compute.manager [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:39:32 np0005476733 kernel: tap38c374da-b5 (unregistering): left promiscuous mode
Oct  8 11:39:32 np0005476733 NetworkManager[51699]: <info>  [1759937972.4485] device (tap38c374da-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:39:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:32Z|00516|binding|INFO|Releasing lport 38c374da-b5bd-4c7d-9352-3fe6186df2b8 from this chassis (sb_readonly=0)
Oct  8 11:39:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:32Z|00517|binding|INFO|Setting lport 38c374da-b5bd-4c7d-9352-3fe6186df2b8 down in Southbound
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:32Z|00518|binding|INFO|Removing iface tap38c374da-b5 ovn-installed in OVS
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.470 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:ae:37 192.168.5.132'], port_security=['fa:16:3e:c4:ae:37 192.168.5.132'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.5.132/24', 'neutron:device_id': 'af5ca3d2-d7df-40e0-88f8-b90191a73698', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.227'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e1c7f6d-e9b9-40a2-8243-4254ff88872e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=38c374da-b5bd-4c7d-9352-3fe6186df2b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.473 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 38c374da-b5bd-4c7d-9352-3fe6186df2b8 in datapath 42f8c4b4-9578-47d9-8732-7b9267b6fb6d unbound from our chassis#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.477 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42f8c4b4-9578-47d9-8732-7b9267b6fb6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.481 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2b7d4c-353d-4be8-a5fc-e272fd539f16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.482 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d namespace which is not needed anymore#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Oct  8 11:39:32 np0005476733 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000003d.scope: Consumed 50.695s CPU time.
Oct  8 11:39:32 np0005476733 systemd-machined[152624]: Machine qemu-35-instance-0000003d terminated.
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [NOTICE]   (236123) : haproxy version is 2.8.14-c23fe91
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [NOTICE]   (236123) : path to executable is /usr/sbin/haproxy
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [WARNING]  (236123) : Exiting Master process...
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [WARNING]  (236123) : Exiting Master process...
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [ALERT]    (236123) : Current worker (236125) exited with code 143 (Terminated)
Oct  8 11:39:32 np0005476733 neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d[236119]: [WARNING]  (236123) : All workers exited. Exiting... (0)
Oct  8 11:39:32 np0005476733 systemd[1]: libpod-6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063.scope: Deactivated successfully.
Oct  8 11:39:32 np0005476733 podman[237347]: 2025-10-08 15:39:32.644282249 +0000 UTC m=+0.058648272 container died 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.672 2 DEBUG nova.compute.manager [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-unplugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.672 2 DEBUG oslo_concurrency.lockutils [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.672 2 DEBUG oslo_concurrency.lockutils [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.674 2 DEBUG oslo_concurrency.lockutils [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.674 2 DEBUG nova.compute.manager [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] No waiting events found dispatching network-vif-unplugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.674 2 DEBUG nova.compute.manager [req-fd836088-8fab-41a6-ac4c-e6ef30a5f1e0 req-0706c3a6-2086-4e7b-afbf-f2e9fd1fc5bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-unplugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:39:32 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063-userdata-shm.mount: Deactivated successfully.
Oct  8 11:39:32 np0005476733 systemd[1]: var-lib-containers-storage-overlay-20f36882416f6454216ffa7d98abc1b1d6cfc29e363ffda0eea4dd1e0ad9932c-merged.mount: Deactivated successfully.
Oct  8 11:39:32 np0005476733 podman[237347]: 2025-10-08 15:39:32.688867147 +0000 UTC m=+0.103233150 container cleanup 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.698 2 INFO nova.virt.libvirt.driver [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Instance destroyed successfully.#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.698 2 DEBUG nova.objects.instance [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid af5ca3d2-d7df-40e0-88f8-b90191a73698 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:39:32 np0005476733 systemd[1]: libpod-conmon-6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063.scope: Deactivated successfully.
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.733 2 DEBUG nova.virt.libvirt.vif [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:36:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_marking_east_west-1191916229',display_name='tempest-test_dscp_marking_east_west-1191916229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-east-west-1191916229',id=61,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:37:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqeru994',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:37:02Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=af5ca3d2-d7df-40e0-88f8-b90191a73698,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.734 2 DEBUG nova.network.os_vif_util [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "address": "fa:16:3e:c4:ae:37", "network": {"id": "42f8c4b4-9578-47d9-8732-7b9267b6fb6d", "bridge": "br-int", "label": "tempest-test-network--971798211", "subnets": [{"cidr": "192.168.5.0/24", "dns": [], "gateway": {"address": "192.168.5.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.5.132", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38c374da-b5", "ovs_interfaceid": "38c374da-b5bd-4c7d-9352-3fe6186df2b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.736 2 DEBUG nova.network.os_vif_util [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.737 2 DEBUG os_vif [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38c374da-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.748 2 INFO os_vif [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:ae:37,bridge_name='br-int',has_traffic_filtering=True,id=38c374da-b5bd-4c7d-9352-3fe6186df2b8,network=Network(42f8c4b4-9578-47d9-8732-7b9267b6fb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap38c374da-b5')#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.748 2 INFO nova.virt.libvirt.driver [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Deleting instance files /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698_del#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.749 2 INFO nova.virt.libvirt.driver [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Deletion of /var/lib/nova/instances/af5ca3d2-d7df-40e0-88f8-b90191a73698_del complete#033[00m
Oct  8 11:39:32 np0005476733 podman[237395]: 2025-10-08 15:39:32.757613374 +0000 UTC m=+0.047561355 container remove 6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.765 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[67f98655-71b7-4492-a96c-e859439b7466]: (4, ('Wed Oct  8 03:39:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d (6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063)\n6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063\nWed Oct  8 03:39:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d (6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063)\n6e9b5dca7671fc7bc4be5f5a4ddb5d358f486c884895285219b9a86da2b0e063\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.767 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[291714db-cf9d-4a79-a406-c006db585146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.768 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42f8c4b4-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 kernel: tap42f8c4b4-90: left promiscuous mode
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.786 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[362a0152-ca84-424a-bef7-66f126be1b1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.827 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[36442993-fdb7-49de-a007-2fbdff8eac1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.830 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d7184a97-5efe-4365-8c9f-fb81c73a7d3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.832 2 INFO nova.compute.manager [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.833 2 DEBUG oslo.service.loopingcall [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.833 2 DEBUG nova.compute.manager [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:39:32 np0005476733 nova_compute[192580]: 2025-10-08 15:39:32.833 2 DEBUG nova.network.neutron [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.847 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d110c547-d8c7-4cf9-bf53-5ed064413f43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472387, 'reachable_time': 18615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237410, 'error': None, 'target': 'ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.849 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42f8c4b4-9578-47d9-8732-7b9267b6fb6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:39:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:32.850 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[85964986-9100-4c93-878f-bea05596cdfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:32 np0005476733 systemd[1]: run-netns-ovnmeta\x2d42f8c4b4\x2d9578\x2d47d9\x2d8732\x2d7b9267b6fb6d.mount: Deactivated successfully.
Oct  8 11:39:33 np0005476733 nova_compute[192580]: 2025-10-08 15:39:33.905 2 DEBUG nova.network.neutron [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:33 np0005476733 nova_compute[192580]: 2025-10-08 15:39:33.928 2 INFO nova.compute.manager [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Took 1.09 seconds to deallocate network for instance.#033[00m
Oct  8 11:39:33 np0005476733 nova_compute[192580]: 2025-10-08 15:39:33.976 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:33 np0005476733 nova_compute[192580]: 2025-10-08 15:39:33.977 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:33 np0005476733 nova_compute[192580]: 2025-10-08 15:39:33.990 2 DEBUG nova.compute.manager [req-ab016a24-60b5-4799-8933-45d78730a496 req-e2deebf5-5aee-45ee-9405-f480c14cc187 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-deleted-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.041 2 DEBUG nova.compute.provider_tree [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.067 2 DEBUG nova.scheduler.client.report [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.131 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.199 2 INFO nova.scheduler.client.report [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance af5ca3d2-d7df-40e0-88f8-b90191a73698#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.311 2 DEBUG oslo_concurrency.lockutils [None req-7371041e-b5c3-47cd-86e9-0d4dac3c9045 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.945 2 DEBUG nova.compute.manager [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.946 2 DEBUG oslo_concurrency.lockutils [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.946 2 DEBUG oslo_concurrency.lockutils [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.946 2 DEBUG oslo_concurrency.lockutils [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "af5ca3d2-d7df-40e0-88f8-b90191a73698-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.946 2 DEBUG nova.compute.manager [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] No waiting events found dispatching network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:34 np0005476733 nova_compute[192580]: 2025-10-08 15:39:34.947 2 WARNING nova.compute.manager [req-14c03352-9da3-48c2-b1aa-b046632cc0ea req-7561a45f-66bf-422c-b6c5-47e3068ae894 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Received unexpected event network-vif-plugged-38c374da-b5bd-4c7d-9352-3fe6186df2b8 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:39:35 np0005476733 nova_compute[192580]: 2025-10-08 15:39:35.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:35 np0005476733 nova_compute[192580]: 2025-10-08 15:39:35.698 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937960.69629, cfe5bc3f-449a-4dd6-922b-4e750f8a94d1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:39:35 np0005476733 nova_compute[192580]: 2025-10-08 15:39:35.699 2 INFO nova.compute.manager [-] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:39:35 np0005476733 nova_compute[192580]: 2025-10-08 15:39:35.739 2 DEBUG nova.compute.manager [None req-19151229-e1c0-4ce7-90c2-b9e720325e35 - - - - - -] [instance: cfe5bc3f-449a-4dd6-922b-4e750f8a94d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:37 np0005476733 nova_compute[192580]: 2025-10-08 15:39:37.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:40 np0005476733 nova_compute[192580]: 2025-10-08 15:39:40.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:42 np0005476733 podman[237412]: 2025-10-08 15:39:42.257617223 +0000 UTC m=+0.057288840 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:39:42 np0005476733 podman[237411]: 2025-10-08 15:39:42.285280925 +0000 UTC m=+0.091237543 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  8 11:39:42 np0005476733 nova_compute[192580]: 2025-10-08 15:39:42.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:45 np0005476733 nova_compute[192580]: 2025-10-08 15:39:45.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.415 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.415 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.453 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.563 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.564 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.575 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.575 2 INFO nova.compute.claims [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.702 2 DEBUG nova.compute.provider_tree [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.719 2 DEBUG nova.scheduler.client.report [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.743 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.744 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.800 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.801 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.824 2 INFO nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.846 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.953 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.955 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.956 2 INFO nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Creating image(s)#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.957 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.958 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.959 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:46 np0005476733 nova_compute[192580]: 2025-10-08 15:39:46.988 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.056 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.057 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.057 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.068 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.148 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.149 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.193 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk 10737418240" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.194 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.194 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.257 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.259 2 DEBUG nova.objects.instance [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'migration_context' on Instance uuid ccf8be13-2e93-495d-ac4a-2cff54baa4fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.277 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.277 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Ensure instance console log exists: /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.278 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.278 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.278 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.307 2 DEBUG nova.policy [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.698 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759937972.695614, af5ca3d2-d7df-40e0-88f8-b90191a73698 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.699 2 INFO nova.compute.manager [-] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.727 2 DEBUG nova.compute.manager [None req-22f2955f-3fa7-4ab5-900b-0b72636d1b5a - - - - - -] [instance: af5ca3d2-d7df-40e0-88f8-b90191a73698] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:47 np0005476733 nova_compute[192580]: 2025-10-08 15:39:47.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:48 np0005476733 nova_compute[192580]: 2025-10-08 15:39:48.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.047 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Successfully updated port: d61bf8bf-d254-433d-b32a-427cbb791a7f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.429 2 DEBUG nova.compute.manager [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-changed-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.429 2 DEBUG nova.compute.manager [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing instance network info cache due to event network-changed-d61bf8bf-d254-433d-b32a-427cbb791a7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.430 2 DEBUG oslo_concurrency.lockutils [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.430 2 DEBUG oslo_concurrency.lockutils [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.430 2 DEBUG nova.network.neutron [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing network info cache for port d61bf8bf-d254-433d-b32a-427cbb791a7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:39:49 np0005476733 nova_compute[192580]: 2025-10-08 15:39:49.903 2 DEBUG nova.network.neutron [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.256 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Successfully updated port: 3e387217-655c-4ce1-9a54-18395a63adb4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.287 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.292 2 DEBUG nova.network.neutron [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.324 2 DEBUG oslo_concurrency.lockutils [req-992177be-a173-4f10-b397-2ea1cc924eb1 req-b5c901a5-552d-4e9c-b709-8e9d1662a77e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.325 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquired lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.325 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:39:50 np0005476733 nova_compute[192580]: 2025-10-08 15:39:50.517 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:39:51 np0005476733 podman[237463]: 2025-10-08 15:39:51.248420169 +0000 UTC m=+0.074267145 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:39:51 np0005476733 nova_compute[192580]: 2025-10-08 15:39:51.510 2 DEBUG nova.compute.manager [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-changed-3e387217-655c-4ce1-9a54-18395a63adb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:51 np0005476733 nova_compute[192580]: 2025-10-08 15:39:51.510 2 DEBUG nova.compute.manager [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing instance network info cache due to event network-changed-3e387217-655c-4ce1-9a54-18395a63adb4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:39:51 np0005476733 nova_compute[192580]: 2025-10-08 15:39:51.511 2 DEBUG oslo_concurrency.lockutils [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.055 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.570 2 DEBUG nova.network.neutron [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.597 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Releasing lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.597 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance network_info: |[{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.598 2 DEBUG oslo_concurrency.lockutils [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.598 2 DEBUG nova.network.neutron [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing network info cache for port 3e387217-655c-4ce1-9a54-18395a63adb4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.602 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Start _get_guest_xml network_info=[{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.608 2 WARNING nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.621 2 DEBUG nova.virt.libvirt.host [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.622 2 DEBUG nova.virt.libvirt.host [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.629 2 DEBUG nova.virt.libvirt.host [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.630 2 DEBUG nova.virt.libvirt.host [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.630 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.630 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.631 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.631 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.631 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.632 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.632 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.632 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.632 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.632 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.633 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.633 2 DEBUG nova.virt.hardware [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.636 2 DEBUG nova.virt.libvirt.vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:39:46Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.637 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.638 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.639 2 DEBUG nova.virt.libvirt.vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:39:46Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": 
{"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.639 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.640 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.640 2 DEBUG nova.objects.instance [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'pci_devices' on Instance uuid ccf8be13-2e93-495d-ac4a-2cff54baa4fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.655 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <uuid>ccf8be13-2e93-495d-ac4a-2cff54baa4fb</uuid>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <name>instance-00000040</name>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:name>server-tempest-MultiPortVlanTransparencyTest-2097740166-0</nova:name>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:39:52</nova:creationTime>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:user uuid="ec8fd4ab84244ebb88e5af7fcd3ce92b">tempest-MultiPortVlanTransparencyTest-198310335-project-member</nova:user>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:project uuid="357683d0efd54df8878ddcfaabe6d388">tempest-MultiPortVlanTransparencyTest-198310335</nova:project>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:port uuid="d61bf8bf-d254-433d-b32a-427cbb791a7f">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        <nova:port uuid="3e387217-655c-4ce1-9a54-18395a63adb4">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="2001:db9::344" ipVersion="6"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="serial">ccf8be13-2e93-495d-ac4a-2cff54baa4fb</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="uuid">ccf8be13-2e93-495d-ac4a-2cff54baa4fb</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.config"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:4a:a8:1d"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <target dev="tapd61bf8bf-d2"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:cd:58:9a"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <target dev="tap3e387217-65"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/console.log" append="off"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:39:52 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:39:52 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:39:52 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:39:52 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.656 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Preparing to wait for external event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.657 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.657 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.657 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.657 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Preparing to wait for external event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.658 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.658 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.658 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.659 2 DEBUG nova.virt.libvirt.vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:39:46Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.659 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.660 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.660 2 DEBUG os_vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.661 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.661 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd61bf8bf-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd61bf8bf-d2, col_values=(('external_ids', {'iface-id': 'd61bf8bf-d254-433d-b32a-427cbb791a7f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:a8:1d', 'vm-uuid': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 NetworkManager[51699]: <info>  [1759937992.6682] manager: (tapd61bf8bf-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.672 2 INFO os_vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2')#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.673 2 DEBUG nova.virt.libvirt.vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:39:46Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], 
"gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.673 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.674 2 DEBUG nova.network.os_vif_util [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.674 2 DEBUG os_vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.675 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.675 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e387217-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e387217-65, col_values=(('external_ids', {'iface-id': '3e387217-655c-4ce1-9a54-18395a63adb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:58:9a', 'vm-uuid': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:52 np0005476733 NetworkManager[51699]: <info>  [1759937992.6809] manager: (tap3e387217-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.689 2 INFO os_vif [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65')#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.759 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.760 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.760 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:4a:a8:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.760 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:cd:58:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:39:52 np0005476733 nova_compute[192580]: 2025-10-08 15:39:52.760 2 INFO nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Using config drive#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.072 2 INFO nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Creating config drive at /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.config#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.079 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr23u8za6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.210 2 DEBUG oslo_concurrency.processutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr23u8za6" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:53 np0005476733 kernel: tapd61bf8bf-d2: entered promiscuous mode
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.2847] manager: (tapd61bf8bf-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00519|binding|INFO|Claiming lport d61bf8bf-d254-433d-b32a-427cbb791a7f for this chassis.
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00520|binding|INFO|d61bf8bf-d254-433d-b32a-427cbb791a7f: Claiming fa:16:3e:4a:a8:1d 10.100.0.5
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.303 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:a8:1d 10.100.0.5'], port_security=['fa:16:3e:4a:a8:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=d61bf8bf-d254-433d-b32a-427cbb791a7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00521|binding|INFO|Setting lport d61bf8bf-d254-433d-b32a-427cbb791a7f ovn-installed in OVS
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00522|binding|INFO|Setting lport d61bf8bf-d254-433d-b32a-427cbb791a7f up in Southbound
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.304 103739 INFO neutron.agent.ovn.metadata.agent [-] Port d61bf8bf-d254-433d-b32a-427cbb791a7f in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac bound to our chassis#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.307 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 systemd-udevd[237505]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 kernel: tap3e387217-65: entered promiscuous mode
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.3133] manager: (tap3e387217-65): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Oct  8 11:39:53 np0005476733 systemd-udevd[237509]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00523|binding|INFO|Claiming lport 3e387217-655c-4ce1-9a54-18395a63adb4 for this chassis.
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00524|binding|INFO|3e387217-655c-4ce1-9a54-18395a63adb4: Claiming fa:16:3e:cd:58:9a 2001:db9::344
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.3227] device (tapd61bf8bf-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.3234] device (tapd61bf8bf-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.325 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[19b70f64-1ac2-445b-8c50-89aac8060fba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.327 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2bf87bc3-31 in ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.3310] device (tap3e387217-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.3317] device (tap3e387217-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.336 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:58:9a 2001:db9::344'], port_security=['fa:16:3e:cd:58:9a 2001:db9::344'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:cidrs': '2001:db9::344/64', 'neutron:device_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5557e28c-0838-4cc2-bde1-1d8616c6ce66', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b29998f5-0e10-45e3-9be1-b1005e0d6bad, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3e387217-655c-4ce1-9a54-18395a63adb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.337 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2bf87bc3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.337 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[154fdd77-d767-4d80-a9d9-8ed8e4aa2c83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.338 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[02a3a46d-899f-434e-96b7-d067f1fef14b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.352 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3912b1e2-0b65-473d-b9f0-d100e73edd5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00525|binding|INFO|Setting lport 3e387217-655c-4ce1-9a54-18395a63adb4 ovn-installed in OVS
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00526|binding|INFO|Setting lport 3e387217-655c-4ce1-9a54-18395a63adb4 up in Southbound
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.376 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[daef1111-2839-4723-8c3d-86f7f3bbcf7c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 systemd-machined[152624]: New machine qemu-37-instance-00000040.
Oct  8 11:39:53 np0005476733 systemd[1]: Started Virtual Machine qemu-37-instance-00000040.
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.413 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ea41ed3a-b282-4857-a026-d54e1c0b4b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.4244] manager: (tap2bf87bc3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.423 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1c46deee-8864-4f6c-9380-21c59202cbe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.466 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a38c4096-1284-4dce-b70d-b545f58c564b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.470 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c0855f63-6317-4ef3-9277-d06d2efe1e73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.5077] device (tap2bf87bc3-30): carrier: link connected
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.517 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a36c2137-4a36-4b78-ab62-e8a52f4207db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.549 2 DEBUG nova.compute.manager [req-bfc46e99-09d0-491c-8ac6-15562ac2ffc7 req-b69aff17-922e-40bc-bdf8-0f24fcac4ec2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.549 2 DEBUG oslo_concurrency.lockutils [req-bfc46e99-09d0-491c-8ac6-15562ac2ffc7 req-b69aff17-922e-40bc-bdf8-0f24fcac4ec2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.549 2 DEBUG oslo_concurrency.lockutils [req-bfc46e99-09d0-491c-8ac6-15562ac2ffc7 req-b69aff17-922e-40bc-bdf8-0f24fcac4ec2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.549 2 DEBUG oslo_concurrency.lockutils [req-bfc46e99-09d0-491c-8ac6-15562ac2ffc7 req-b69aff17-922e-40bc-bdf8-0f24fcac4ec2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.550 2 DEBUG nova.compute.manager [req-bfc46e99-09d0-491c-8ac6-15562ac2ffc7 req-b69aff17-922e-40bc-bdf8-0f24fcac4ec2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Processing event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.551 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55a38bca-6b55-4ad9-8c79-56f2450b2c14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489717, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237544, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.568 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ce96d950-0222-4b73-a9ba-48e0849f5a7e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:eeb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489717, 'tstamp': 489717}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237545, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.589 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d249b8-f75e-4669-9a6f-c44a92125021]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489717, 'reachable_time': 28814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237546, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.623 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3c2bbc-5b29-47d3-8112-d1ba66cf3f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.694 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.697 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[87f6874f-361c-4318-ac4b-6ad91f3762c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.698 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.698 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.699 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2bf87bc3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:53 np0005476733 kernel: tap2bf87bc3-30: entered promiscuous mode
Oct  8 11:39:53 np0005476733 NetworkManager[51699]: <info>  [1759937993.7033] manager: (tap2bf87bc3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.705 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2bf87bc3-30, col_values=(('external_ids', {'iface-id': '88d655c9-33da-4a0f-a7f9-84973702cdd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:39:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:39:53Z|00527|binding|INFO|Releasing lport 88d655c9-33da-4a0f-a7f9-84973702cdd7 from this chassis (sb_readonly=0)
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.709 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.710 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec01fd0-5843-4f52-99fc-0a0fbf68925e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.710 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:39:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:53.711 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'env', 'PROCESS_TAG=haproxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.779 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.780 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:39:53 np0005476733 nova_compute[192580]: 2025-10-08 15:39:53.846 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.035 2 DEBUG nova.network.neutron [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updated VIF entry in instance network info cache for port 3e387217-655c-4ce1-9a54-18395a63adb4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.037 2 DEBUG nova.network.neutron [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.049 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.050 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13756MB free_disk=111.34024429321289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.051 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.051 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.059 2 DEBUG oslo_concurrency.lockutils [req-2d8f3ce6-a21a-4b44-95ed-5cbede0bc310 req-f1b4179c-d84c-4bae-867c-769835283a6b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:39:54 np0005476733 podman[237592]: 2025-10-08 15:39:54.197338784 +0000 UTC m=+0.082573014 container create 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.209 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ccf8be13-2e93-495d-ac4a-2cff54baa4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.212 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.212 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:39:54 np0005476733 podman[237592]: 2025-10-08 15:39:54.137728661 +0000 UTC m=+0.022962901 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:39:54 np0005476733 systemd[1]: Started libpod-conmon-9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4.scope.
Oct  8 11:39:54 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:39:54 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87e0e5fdb0d9999b8ae0ef304ca19c06c78635ad6f94410546b91350cdd22a77/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:39:54 np0005476733 podman[237592]: 2025-10-08 15:39:54.273934495 +0000 UTC m=+0.159168735 container init 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:39:54 np0005476733 podman[237592]: 2025-10-08 15:39:54.27906107 +0000 UTC m=+0.164295290 container start 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:39:54 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [NOTICE]   (237612) : New worker (237614) forked
Oct  8 11:39:54 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [NOTICE]   (237612) : Loading success.
Oct  8 11:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:54.342 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3e387217-655c-4ce1-9a54-18395a63adb4 in datapath 5557e28c-0838-4cc2-bde1-1d8616c6ce66 unbound from our chassis#033[00m
Oct  8 11:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:54.346 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5557e28c-0838-4cc2-bde1-1d8616c6ce66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:39:54.348 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7d55e3d4-7318-4ccb-8c80-5e13bea60208]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.386 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937994.3865795, ccf8be13-2e93-495d-ac4a-2cff54baa4fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.387 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] VM Started (Lifecycle Event)#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.390 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.417 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.422 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.425 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937994.3867202, ccf8be13-2e93-495d-ac4a-2cff54baa4fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.425 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.456 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.457 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.406s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.459 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.463 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:39:54 np0005476733 nova_compute[192580]: 2025-10-08 15:39:54.486 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:39:55 np0005476733 podman[237624]: 2025-10-08 15:39:55.274031818 +0000 UTC m=+0.092969490 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:55 np0005476733 podman[237623]: 2025-10-08 15:39:55.312435826 +0000 UTC m=+0.124686473 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.460 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.669 2 DEBUG nova.compute.manager [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.669 2 DEBUG oslo_concurrency.lockutils [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.669 2 DEBUG oslo_concurrency.lockutils [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.669 2 DEBUG oslo_concurrency.lockutils [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.670 2 DEBUG nova.compute.manager [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No event matching network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f in dict_keys([('network-vif-plugged', '3e387217-655c-4ce1-9a54-18395a63adb4')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.670 2 WARNING nova.compute.manager [req-8a75d07e-5117-4ba5-a36d-6d416b439b7f req-e2b032b3-f00f-4774-b2d5-2ae9a2e47391 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received unexpected event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.795 2 DEBUG nova.compute.manager [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.795 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.795 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.796 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.796 2 DEBUG nova.compute.manager [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Processing event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.796 2 DEBUG nova.compute.manager [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.797 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.797 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.797 2 DEBUG oslo_concurrency.lockutils [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.797 2 DEBUG nova.compute.manager [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No waiting events found dispatching network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.798 2 WARNING nova.compute.manager [req-7d81453e-2067-434d-b7cb-bf35276ab151 req-594e7b7d-0b1d-42c3-b165-64dfa2c4c51a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received unexpected event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.798 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.806 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759937995.8064263, ccf8be13-2e93-495d-ac4a-2cff54baa4fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.807 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.808 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.812 2 INFO nova.virt.libvirt.driver [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance spawned successfully.#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.813 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.838 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.844 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.847 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.847 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.847 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.848 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.848 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.848 2 DEBUG nova.virt.libvirt.driver [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.879 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.915 2 INFO nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Took 8.96 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:39:55 np0005476733 nova_compute[192580]: 2025-10-08 15:39:55.915 2 DEBUG nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:39:56 np0005476733 nova_compute[192580]: 2025-10-08 15:39:56.158 2 INFO nova.compute.manager [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Took 9.64 seconds to build instance.#033[00m
Oct  8 11:39:56 np0005476733 nova_compute[192580]: 2025-10-08 15:39:56.176 2 DEBUG oslo_concurrency.lockutils [None req-74c30402-981f-407d-963a-24bcb6cab326 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:39:57 np0005476733 nova_compute[192580]: 2025-10-08 15:39:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:57 np0005476733 nova_compute[192580]: 2025-10-08 15:39:57.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:39:57 np0005476733 nova_compute[192580]: 2025-10-08 15:39:57.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:39:59 np0005476733 nova_compute[192580]: 2025-10-08 15:39:59.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:59 np0005476733 nova_compute[192580]: 2025-10-08 15:39:59.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:39:59 np0005476733 nova_compute[192580]: 2025-10-08 15:39:59.998 2 INFO nova.compute.manager [None req-39df6428-624c-4003-b1b1-eb1e100d57a6 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:00 np0005476733 nova_compute[192580]: 2025-10-08 15:40:00.007 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:00 np0005476733 nova_compute[192580]: 2025-10-08 15:40:00.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:01 np0005476733 nova_compute[192580]: 2025-10-08 15:40:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:01 np0005476733 nova_compute[192580]: 2025-10-08 15:40:01.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:40:01 np0005476733 nova_compute[192580]: 2025-10-08 15:40:01.709 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:40:02 np0005476733 nova_compute[192580]: 2025-10-08 15:40:02.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:03 np0005476733 podman[237668]: 2025-10-08 15:40:03.242812133 +0000 UTC m=+0.060895075 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc.)
Oct  8 11:40:03 np0005476733 podman[237666]: 2025-10-08 15:40:03.245706896 +0000 UTC m=+0.068194289 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 11:40:03 np0005476733 podman[237667]: 2025-10-08 15:40:03.256063676 +0000 UTC m=+0.067856777 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:40:05 np0005476733 nova_compute[192580]: 2025-10-08 15:40:05.188 2 INFO nova.compute.manager [None req-be7c9bf9-3c28-409d-9cfd-ac33364a808a ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:05 np0005476733 nova_compute[192580]: 2025-10-08 15:40:05.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:05 np0005476733 nova_compute[192580]: 2025-10-08 15:40:05.701 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:06 np0005476733 nova_compute[192580]: 2025-10-08 15:40:06.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:07 np0005476733 nova_compute[192580]: 2025-10-08 15:40:07.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:10 np0005476733 nova_compute[192580]: 2025-10-08 15:40:10.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:10 np0005476733 nova_compute[192580]: 2025-10-08 15:40:10.423 2 INFO nova.compute.manager [None req-d2772779-8924-4deb-87c1-5c06835d5223 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:10 np0005476733 nova_compute[192580]: 2025-10-08 15:40:10.428 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:11 np0005476733 nova_compute[192580]: 2025-10-08 15:40:11.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:11 np0005476733 nova_compute[192580]: 2025-10-08 15:40:11.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:40:12 np0005476733 nova_compute[192580]: 2025-10-08 15:40:12.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:13 np0005476733 podman[237733]: 2025-10-08 15:40:13.252407425 +0000 UTC m=+0.066171305 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:40:13 np0005476733 podman[237732]: 2025-10-08 15:40:13.259401428 +0000 UTC m=+0.073990263 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:40:14 np0005476733 nova_compute[192580]: 2025-10-08 15:40:14.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:15 np0005476733 nova_compute[192580]: 2025-10-08 15:40:15.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:15 np0005476733 nova_compute[192580]: 2025-10-08 15:40:15.565 2 INFO nova.compute.manager [None req-134f7101-2756-4906-8d79-a8ac4a628598 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:15 np0005476733 nova_compute[192580]: 2025-10-08 15:40:15.570 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:17 np0005476733 nova_compute[192580]: 2025-10-08 15:40:17.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:20 np0005476733 nova_compute[192580]: 2025-10-08 15:40:20.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:20 np0005476733 nova_compute[192580]: 2025-10-08 15:40:20.752 2 INFO nova.compute.manager [None req-1c904f67-3e3a-4be9-9910-8c1db4349dd4 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:20 np0005476733 nova_compute[192580]: 2025-10-08 15:40:20.757 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:22 np0005476733 podman[237777]: 2025-10-08 15:40:22.24350605 +0000 UTC m=+0.065548014 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  8 11:40:22 np0005476733 nova_compute[192580]: 2025-10-08 15:40:22.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:23Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:a8:1d 10.100.0.5
Oct  8 11:40:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:23Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:a8:1d 10.100.0.5
Oct  8 11:40:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:23Z|00528|pinctrl|WARN|Dropped 2745 log messages in last 63 seconds (most recently, 28 seconds ago) due to excessive rate
Oct  8 11:40:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:23Z|00529|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:40:23 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:23Z|00530|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.654 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.656 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.679 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.946 2 INFO nova.compute.manager [None req-5862272e-50e5-4f54-8484-146bf86d22ce ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:25 np0005476733 nova_compute[192580]: 2025-10-08 15:40:25.954 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:26 np0005476733 podman[237797]: 2025-10-08 15:40:26.238763491 +0000 UTC m=+0.064762900 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:40:26 np0005476733 podman[237796]: 2025-10-08 15:40:26.314993865 +0000 UTC m=+0.136615363 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:26.327 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:26.328 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:26.329 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:27 np0005476733 nova_compute[192580]: 2025-10-08 15:40:27.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:30 np0005476733 nova_compute[192580]: 2025-10-08 15:40:30.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:31 np0005476733 nova_compute[192580]: 2025-10-08 15:40:31.119 2 INFO nova.compute.manager [None req-565daaca-e3b5-46cf-b2ec-470e35179551 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:31 np0005476733 nova_compute[192580]: 2025-10-08 15:40:31.129 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:31 np0005476733 nova_compute[192580]: 2025-10-08 15:40:31.132 2 INFO nova.virt.libvirt.driver [None req-565daaca-e3b5-46cf-b2ec-470e35179551 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Truncated console log returned, 4403 bytes ignored#033[00m
Oct  8 11:40:32 np0005476733 nova_compute[192580]: 2025-10-08 15:40:32.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:34 np0005476733 podman[237842]: 2025-10-08 15:40:34.250031383 +0000 UTC m=+0.079206710 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:40:34 np0005476733 podman[237844]: 2025-10-08 15:40:34.253833975 +0000 UTC m=+0.071487694 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7)
Oct  8 11:40:34 np0005476733 podman[237843]: 2025-10-08 15:40:34.273386459 +0000 UTC m=+0.097811265 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:40:35 np0005476733 nova_compute[192580]: 2025-10-08 15:40:35.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.014 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '357683d0efd54df8878ddcfaabe6d388', 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'hostId': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.015 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.015 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>]
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.035 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.bytes volume: 273138688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.036 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d9b0c59-e56e-4c10-b8e7-ebe686de3076', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 273138688, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.016650', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2226aaa0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '1453ce959b3aad67dca97f5f3b99c5024e27fafc762bcf9b5eb0e6d9e8b7238e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.016650', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2226bf0e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '0a16dc07c8d61c7d0d3cfa56c4a43bc28a29e75d62380ae8b5d639750ec26247'}]}, 'timestamp': '2025-10-08 15:40:36.036544', '_unique_id': '871886cba3dd4fadbbb62957e7f5d425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.038 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.040 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.requests volume: 158 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.041 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7396af93-149a-417b-b7c5-db0e1cb99637', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 158, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.040656', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '22277a3e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': 'fa37fafddfcd7e27a2def62ecf9064e26db66161eb92aedb698d30c8f051e947'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': 
'357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.040656', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '222789a2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '19cbdda0ca59273e48b1ce3dbe6bd504a4f70bbc69a5fb994cbc66fd464c4772'}]}, 'timestamp': '2025-10-08 15:40:36.041723', '_unique_id': 'f606b32815dd4ad291fdd06df77e1b0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.043 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.048 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ccf8be13-2e93-495d-ac4a-2cff54baa4fb / tapd61bf8bf-d2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.050 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ccf8be13-2e93-495d-ac4a-2cff54baa4fb / tap3e387217-65 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.050 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.051 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2dc943-8d77-47de-8ad4-709bb5930749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.044733', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '2228f922-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': 'f9a7609be3f768cb2d9058b7ca2fbf8b5bd1df25f802beb74c3039ef3940f79e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.044733', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '22290b42-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '0dc6b41fc5d5ae07e0c624ccbe7f84fd2f34fd2a2be0ea93414c3db29aca67c7'}]}, 'timestamp': '2025-10-08 15:40:36.051606', '_unique_id': 'e2de2dc733f3477386a7ed033a84c2da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.053 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.054 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.bytes volume: 2588 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.055 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.bytes volume: 702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa14f36-d13e-437b-b52b-0f5b7a850672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2588, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.054929', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '22299bfc-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': 'dba1adc04fbbf7c4636348ef5e14e77827eec64582425a2f5028e5eb7a08f7d3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 702, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.054929', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '2229a8cc-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '45695ad5fc0b866dc0770be4011f4b2740f04c69bdbaf5e175fd339230bca3cb'}]}, 'timestamp': '2025-10-08 15:40:36.055614', '_unique_id': '0071d509a6c64f709c9b290a2fde4c3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.057 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.058 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e0f3037-6600-4362-8723-58e6572eed2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.057802', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '222a0b28-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '14e4a1f7e83868e6726a044b8593f9d10e371fd21b507693a7e3d55317e0757b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.057802', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '222a18c0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '6febd0971d4b6a6fa3cf2a0767e97e3c7da8447c6d3a46fbf464948602a3d411'}]}, 'timestamp': '2025-10-08 15:40:36.058472', '_unique_id': 'a3a8b0763603470089ea007a92b0c911'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.059 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.061 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.061 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '398d5b89-def5-4990-b45b-bcb0944f5295', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.061141', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '222a8d82-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '820d716fc51f360b0cb69fa29a8c8be5d00afbb4236dc24ec12704c379dab5c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.061141', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '222a970a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '8f5937bcfb21bbaad6030f238861341fa3d82740fed2f7d840144866034ee5fb'}]}, 'timestamp': '2025-10-08 15:40:36.061674', '_unique_id': '85d9e161c19b491381cb9216c0fd17d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.062 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.requests volume: 11132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce43f574-c4ca-4303-b760-a177e236d78e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11132, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.062970', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '222ad38c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': 'cc98aca46b6f16af77932b411d8e21f655daaa1a3daa0fcd446f3968a3c051d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': 
'357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.062970', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '222adbca-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '0a2a3168904f79372bc4d74e3ec49b1eea6e1151a9630516e646b275443188f6'}]}, 'timestamp': '2025-10-08 15:40:36.063425', '_unique_id': '1e25b7eb88a54b4c86fca17fd5a208a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.063 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.064 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.064 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72a8df41-e61c-49ca-be6d-08f368744b03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.064590', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '222b11ee-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '1b9eb021a7a7f588eb89a89c63b68fe24fc170a2911748f29e8abf21b4b51272'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.064590', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '222b1b30-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '330b234c64650840cb4df6625b5f5732faf9db89fc930572da3f44c12afe95c6'}]}, 'timestamp': '2025-10-08 15:40:36.065058', '_unique_id': '0bde6569d04f46e480bda4fc8ea833d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.066 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.066 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>]
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.066 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.067 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d6ca675-8c6d-4b83-b8ca-01454605eb0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.066893', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '222b6d06-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '884ad54b8835fb79dd3be5daae1cfb8e3599c60be6c848d308651e1fddc14e84'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.066893', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '222b7c24-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '70f91256431db9c546a96bf85b34a79787c7516d4d9f0dc81cdd2e72c930ab02'}]}, 'timestamp': '2025-10-08 15:40:36.067539', '_unique_id': '9fbe6ca55a9c4ea0841be9b49f36bf7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.068 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.084 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.allocation volume: 20127744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.085 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f7de8d9-2ec9-47ea-b461-c69b3f4d3769', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 20127744, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.069003', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '222e2bea-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': '48039b753e143214f68d22dcd99e269b2cb6b24db04cb084f8890cb3c58f76ae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.069003', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '222e413e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': '6c0ed820e84e78998a15e56a8b66bad36875a21e9b4f7087769b0cdd25202b64'}]}, 'timestamp': '2025-10-08 15:40:36.085768', '_unique_id': '7c77bb8e44184e62934a2aff2ab54b4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.088 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.088 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>]
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.103 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/memory.usage volume: 255.25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fbc27ba-ef47-4ef0-bb93-16beafbefd00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 255.25, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'timestamp': '2025-10-08T15:40:36.088945', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '22310a90-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.826342911, 'message_signature': '02f94eb4a95517172513f08bc2f8a8acd31e0318785c9e3cdd8523c909548603'}]}, 'timestamp': '2025-10-08 15:40:36.104069', '_unique_id': '43023d9c230844658a244069c9151fc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.105 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.usage volume: 19660800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.106 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c00597d3-82f2-4242-920c-2b953272893f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 19660800, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.105828', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '22315e78-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': '5167e0125dc01cdfec6992efa8b5aee753d7f3c42c54e85c1638f48f486f5557'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.105828', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '22316b3e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': 'cfa74cc1d5033f2930cfc61f2897e464a7a59c2afeeb18fd5b781f9d84fa30c0'}]}, 'timestamp': '2025-10-08 15:40:36.106461', '_unique_id': '5316c04aac184293bacbbd37496e45e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.108 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.108 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '173da8a3-59c8-483b-8b3d-d16cf830e23e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.107980', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '2231b382-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '8002a5a588275cc326903fe1a82082d2c4a3f0254a92ee10da3322d91f195710'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.107980', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '2231bf9e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '135f6e5bd2e5754ab695879a01c9b89f47f6c6afefb34e306ceadacbac30fdf0'}]}, 'timestamp': '2025-10-08 15:40:36.108618', '_unique_id': 'a7f9156cc5604b79b06160f10a0327ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.110 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.bytes volume: 18116608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.110 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20db7247-3e6e-404e-8932-51e05d1ba82d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18116608, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.110152', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '22320738-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': 'ab72d0d8c51bf0957d9a24f7aba2ba2e14ae7482d23d4101b37d603dde1f2b06'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.110152', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '22320fb2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': 'de94471c51891b0b9e79d81c7d14d43141a32f0cad74c1359c73fdc07363dcfe'}]}, 'timestamp': '2025-10-08 15:40:36.110627', '_unique_id': '8789ec72ad304ac3bf4361f438826740'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.111 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.latency volume: 6165296044 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df7e45b1-5d07-4619-927c-5367293e464a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6165296044, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.111793', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '22324586-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '90ff4061dcc1f79cc1ac7c43cd11c86323881225c07d9ac7bf9dad51ffdb5136'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': 
'357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.111793', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '22324e96-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '6322805450150bd4440c68167b78ae73915fd15e6b239f4e627d8fdd5415907a'}]}, 'timestamp': '2025-10-08 15:40:36.112238', '_unique_id': '3628197265ba4304b75f0ee6e3f8f061'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.113 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.113 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f70b99f0-c66e-454e-b4a4-3d8dc207d461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.113355', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '223282bc-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': 'fb31afcbd58cb4367779574b70b9719440a521cb15bd2d96b7baa01fdd22f837'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.113355', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '22328ae6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '13d6a7b7af8ca3e059c66cb1aee489d39240238067d85c876ad813eb9a9cf14a'}]}, 'timestamp': '2025-10-08 15:40:36.113795', '_unique_id': '16d39b17b2cd40c1808cd3af4c8667de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.114 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.bytes volume: 2064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.bytes volume: 832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c7936e4-6a80-47d9-816a-463283690dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2064, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.114907', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '2232bf3e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '16529b3d973630bc27119be3bd3dd11e5919e461b398a9ec744bdcec7d01202b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 832, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.114907', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '2232c88a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '603ee1c478110342e6eb6a1b356efb98476db86b6b4c5f5c2008470133fbf224'}]}, 'timestamp': '2025-10-08 15:40:36.115364', '_unique_id': 'c3cabb7450ca4d81bcc10349a5c1d173'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.116 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.116 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8068c6e6-b257-44c8-b394-160243cf9e80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tapd61bf8bf-d2', 'timestamp': '2025-10-08T15:40:36.116449', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tapd61bf8bf-d2', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:4a:a8:1d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd61bf8bf-d2'}, 'message_id': '2232fb7a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': '457d7f101577fc111271cd348ad03e3721bf88bee6f1464533ef8d68f7863560'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'instance-00000040-ccf8be13-2e93-495d-ac4a-2cff54baa4fb-tap3e387217-65', 'timestamp': '2025-10-08T15:40:36.116449', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'tap3e387217-65', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:cd:58:9a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e387217-65'}, 'message_id': '2233037c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.76777036, 'message_signature': 'fe35b99f1d9288665b16952357c0e214f2e8076b43ef119a4461c357bd0b4396'}]}, 'timestamp': '2025-10-08 15:40:36.116898', '_unique_id': 'b4b3f8e261bd442893e22abe757fad34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.118 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.118 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: server-tempest-MultiPortVlanTransparencyTest-2097740166-0>]
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.118 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.latency volume: 10460162005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.118 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.read.latency volume: 91530863 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d8e3560-47a0-43ea-ac4c-88629ec9556b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10460162005, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.118338', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '22334558-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '1071434262a41c2b49b384e5ac9c6d2be49297842989f09abd628f20591c951e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 91530863, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': 
'357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.118338', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '22334d50-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.739619581, 'message_signature': '9f10a5bceb9fa79811b49bff599037ec670b7d7c28304a38f9482c2b4bdc2798'}]}, 'timestamp': '2025-10-08 15:40:36.118756', '_unique_id': '14fe203d25bc405db8513c4e012723e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.119 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/cpu volume: 34790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c82defbd-2360-48fa-985a-498d35ee5879', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 34790000000, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'timestamp': '2025-10-08T15:40:36.119842', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '22337fdc-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.826342911, 'message_signature': '1d2929100f6db62c7bbba70bc3b69f7431d2c71d58bd8ae40d0621d38443b718'}]}, 'timestamp': '2025-10-08 15:40:36.120056', '_unique_id': '3e61dacfefe046a0a7b325ac75cb0340'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.121 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.121 12 DEBUG ceilometer.compute.pollsters [-] ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98ca149e-fb54-44b7-b0f4-f9417a710b39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-vda', 'timestamp': '2025-10-08T15:40:36.121152', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '2233b326-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': '15e2bafa439656a6456180a4132b6fbffd5b1257501e4ec0c77e9bcddcc1dfb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_name': None, 'project_id': 
'357683d0efd54df8878ddcfaabe6d388', 'project_name': None, 'resource_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb-sda', 'timestamp': '2025-10-08T15:40:36.121152', 'resource_metadata': {'display_name': 'server-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'name': 'instance-00000040', 'instance_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'instance_type': 'custom_neutron_guest', 'host': 'dc3740b7a37875d497db281141d72c1666d359299b7a1d552b890243', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '2233bba0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 4939.792027526, 'message_signature': '77f2ca2f9da7db2af7d40357d4047397446d00dfdf50ce8c3de6b10b836c3c80'}]}, 'timestamp': '2025-10-08 15:40:36.121605', '_unique_id': '4ce7e84b74ee41668a58f6d8a36fca86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:40:36.122 12 ERROR oslo_messaging.notify.messaging 
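Annotation: the traceback above shows two chained exceptions — the socket-level `ConnectionRefusedError` (errno 111, the broker port is closed) is re-raised by kombu's `_reraise_as_library_errors` as `kombu.exceptions.OperationalError` via `raise ... from exc`, which is why the log prints "The above exception was the direct cause of the following exception". A minimal stdlib-only sketch of that pattern (class and function names here are illustrative stand-ins, not kombu's actual internals):

```python
from contextlib import contextmanager

class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

@contextmanager
def reraise_as_library_errors():
    try:
        yield
    except ConnectionRefusedError as exc:
        # Chained re-raise: the original errno-111 error becomes __cause__,
        # producing the two-traceback layout seen in the log above.
        raise OperationalError(str(exc)) from exc

def connect():
    # Simulate the broker socket refusing the connection (errno 111).
    raise ConnectionRefusedError(111, "Connection refused")

try:
    with reraise_as_library_errors():
        connect()
except OperationalError as err:
    # The wrapped message matches the log line:
    # "kombu.exceptions.OperationalError: [Errno 111] Connection refused"
    print(type(err).__name__, "::", err)
    print("caused by", type(err.__cause__).__name__)
```

The practical takeaway for this log: the ceilometer agent itself is healthy — the failure is purely that nothing is listening on the RabbitMQ endpoint at 15:40:36, so the polled samples are dropped at the notifier.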
Oct  8 11:40:36 np0005476733 nova_compute[192580]: 2025-10-08 15:40:36.263 2 INFO nova.compute.manager [None req-336ec6e1-ac8b-459b-b64d-41469a0dda95 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Get console output#033[00m
Oct  8 11:40:36 np0005476733 nova_compute[192580]: 2025-10-08 15:40:36.267 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:36 np0005476733 nova_compute[192580]: 2025-10-08 15:40:36.272 2 INFO nova.virt.libvirt.driver [None req-336ec6e1-ac8b-459b-b64d-41469a0dda95 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Truncated console log returned, 4696 bytes ignored#033[00m
Oct  8 11:40:37 np0005476733 nova_compute[192580]: 2025-10-08 15:40:37.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:37.861 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:37.862 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:40:37 np0005476733 nova_compute[192580]: 2025-10-08 15:40:37.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:39 np0005476733 nova_compute[192580]: 2025-10-08 15:40:39.345 2 DEBUG nova.compute.manager [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-changed-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:40:39 np0005476733 nova_compute[192580]: 2025-10-08 15:40:39.346 2 DEBUG nova.compute.manager [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing instance network info cache due to event network-changed-d61bf8bf-d254-433d-b32a-427cbb791a7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:40:39 np0005476733 nova_compute[192580]: 2025-10-08 15:40:39.346 2 DEBUG oslo_concurrency.lockutils [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:40:39 np0005476733 nova_compute[192580]: 2025-10-08 15:40:39.347 2 DEBUG oslo_concurrency.lockutils [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:40:39 np0005476733 nova_compute[192580]: 2025-10-08 15:40:39.347 2 DEBUG nova.network.neutron [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Refreshing network info cache for port d61bf8bf-d254-433d-b32a-427cbb791a7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:40:40 np0005476733 nova_compute[192580]: 2025-10-08 15:40:40.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:40.864 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:41 np0005476733 nova_compute[192580]: 2025-10-08 15:40:41.026 2 DEBUG nova.network.neutron [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updated VIF entry in instance network info cache for port d61bf8bf-d254-433d-b32a-427cbb791a7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:40:41 np0005476733 nova_compute[192580]: 2025-10-08 15:40:41.027 2 DEBUG nova.network.neutron [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:40:41 np0005476733 nova_compute[192580]: 2025-10-08 15:40:41.057 2 DEBUG oslo_concurrency.lockutils [req-eac915ba-5a36-48d9-bf10-87133c1be5ea req-d8ee3b9c-bfdb-4bb0-823a-9a52e5f1eb65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:40:42 np0005476733 nova_compute[192580]: 2025-10-08 15:40:42.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:44 np0005476733 nova_compute[192580]: 2025-10-08 15:40:44.051 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:44 np0005476733 nova_compute[192580]: 2025-10-08 15:40:44.083 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid ccf8be13-2e93-495d-ac4a-2cff54baa4fb _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:40:44 np0005476733 nova_compute[192580]: 2025-10-08 15:40:44.084 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:44 np0005476733 nova_compute[192580]: 2025-10-08 15:40:44.085 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:44 np0005476733 nova_compute[192580]: 2025-10-08 15:40:44.121 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
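Annotation: the `_sync_power_states` lines above show the oslo.concurrency lock protocol — acquire with a "waited Ns" measurement, then release with a "held Ns" measurement (0.036s here). A stdlib-only sketch of that timing pattern, assuming illustrative names (this is not oslo's actual implementation):

```python
import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock: threading.Lock, name: str):
    # Measure time spent waiting for the lock, as in
    # 'Lock "..." acquired by "..." :: waited 0.000s'.
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    print(f'Lock "{name}" acquired :: waited {t1 - t0:.3f}s')
    try:
        yield
    finally:
        # Measure time the lock was held, as in
        # 'Lock "..." "released" :: held 0.036s'.
        lock.release()
        print(f'Lock "{name}" released :: held {time.monotonic() - t1:.3f}s')

lock = threading.Lock()
with timed_lock(lock, "ccf8be13-2e93-495d-ac4a-2cff54baa4fb"):
    pass  # critical section: query driver power state and sync
```

The "waited" figure is the useful signal when reading these logs: a long wait on `compute_resources` or a per-instance UUID lock indicates contention between periodic tasks and build/delete requests.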
Oct  8 11:40:44 np0005476733 podman[237933]: 2025-10-08 15:40:44.236722273 +0000 UTC m=+0.055562636 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:40:44 np0005476733 podman[237932]: 2025-10-08 15:40:44.254449589 +0000 UTC m=+0.070968558 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.schema-version=1.0)
Oct  8 11:40:45 np0005476733 nova_compute[192580]: 2025-10-08 15:40:45.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:47 np0005476733 nova_compute[192580]: 2025-10-08 15:40:47.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:49 np0005476733 nova_compute[192580]: 2025-10-08 15:40:49.623 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.324 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.325 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.347 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.436 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.436 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.445 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.445 2 INFO nova.compute.claims [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.527 2 DEBUG nova.scheduler.client.report [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.544 2 DEBUG nova.scheduler.client.report [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.545 2 DEBUG nova.compute.provider_tree [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.580 2 DEBUG nova.scheduler.client.report [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.601 2 DEBUG nova.scheduler.client.report [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.662 2 DEBUG nova.compute.provider_tree [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.677 2 DEBUG nova.scheduler.client.report [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.697 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.698 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.755 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.756 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.776 2 INFO nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.797 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.895 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.897 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.897 2 INFO nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Creating image(s)#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.899 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.899 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.901 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.927 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.952 2 DEBUG nova.policy [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.989 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.990 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:50 np0005476733 nova_compute[192580]: 2025-10-08 15:40:50.991 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.002 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.059 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.061 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.099 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk 10737418240" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.101 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.101 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.159 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.160 2 DEBUG nova.objects.instance [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid f00f4363-87ff-45bb-b619-95a364353d0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.179 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.179 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Ensure instance console log exists: /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.180 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.180 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.180 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:51 np0005476733 nova_compute[192580]: 2025-10-08 15:40:51.676 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Successfully created port: 84e96b77-4119-4da7-ad84-5cf394586e03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.595 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Successfully updated port: 84e96b77-4119-4da7-ad84-5cf394586e03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.618 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.618 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.619 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.687 2 DEBUG nova.compute.manager [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-changed-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.688 2 DEBUG nova.compute.manager [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Refreshing instance network info cache due to event network-changed-84e96b77-4119-4da7-ad84-5cf394586e03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.688 2 DEBUG oslo_concurrency.lockutils [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:52 np0005476733 nova_compute[192580]: 2025-10-08 15:40:52.802 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:40:53 np0005476733 podman[238013]: 2025-10-08 15:40:53.23318806 +0000 UTC m=+0.051692813 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.695 2 DEBUG nova.network.neutron [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.707 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.735 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.737 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Instance network_info: |[{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.738 2 DEBUG oslo_concurrency.lockutils [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.739 2 DEBUG nova.network.neutron [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Refreshing network info cache for port 84e96b77-4119-4da7-ad84-5cf394586e03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.746 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Start _get_guest_xml network_info=[{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.755 2 WARNING nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.763 2 DEBUG nova.virt.libvirt.host [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.764 2 DEBUG nova.virt.libvirt.host [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.775 2 DEBUG nova.virt.libvirt.host [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.776 2 DEBUG nova.virt.libvirt.host [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.776 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.777 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.778 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.778 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.779 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.779 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.779 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.780 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.780 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.781 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.782 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.782 2 DEBUG nova.virt.hardware [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.788 2 DEBUG nova.virt.libvirt.vif [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:40:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_tenant_network-1397692204',display_name='tempest-test_dscp_marking_tenant_network-1397692204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-tenant-network-1397692204',id=66,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-toxce72r',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:40:50Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=f00f4363-87ff-45bb-b619-95a364353d0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.789 2 DEBUG nova.network.os_vif_util [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.789 2 DEBUG nova.network.os_vif_util [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.790 2 DEBUG nova.objects.instance [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid f00f4363-87ff-45bb-b619-95a364353d0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.793 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.794 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.818 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <uuid>f00f4363-87ff-45bb-b619-95a364353d0b</uuid>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <name>instance-00000042</name>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dscp_marking_tenant_network-1397692204</nova:name>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:40:53</nova:creationTime>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        <nova:port uuid="84e96b77-4119-4da7-ad84-5cf394586e03">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.6.153" ipVersion="4"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="serial">f00f4363-87ff-45bb-b619-95a364353d0b</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="uuid">f00f4363-87ff-45bb-b619-95a364353d0b</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.config"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:53:2d:ec"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <target dev="tap84e96b77-41"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/console.log" append="off"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:40:53 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:40:53 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:40:53 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:40:53 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.819 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Preparing to wait for external event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.820 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.820 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.821 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.822 2 DEBUG nova.virt.libvirt.vif [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:40:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_tenant_network-1397692204',display_name='tempest-test_dscp_marking_tenant_network-1397692204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-tenant-network-1397692204',id=66,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-toxce72r',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:40:50Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=f00f4363-87ff-45bb-b619-95a364353d0b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.822 2 DEBUG nova.network.os_vif_util [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.823 2 DEBUG nova.network.os_vif_util [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.823 2 DEBUG os_vif [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.825 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.825 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.828 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84e96b77-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap84e96b77-41, col_values=(('external_ids', {'iface-id': '84e96b77-4119-4da7-ad84-5cf394586e03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:2d:ec', 'vm-uuid': 'f00f4363-87ff-45bb-b619-95a364353d0b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:53 np0005476733 NetworkManager[51699]: <info>  [1759938053.8315] manager: (tap84e96b77-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.838 2 INFO os_vif [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41')#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.868 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.891 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.891 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.891 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:53:2d:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:40:53 np0005476733 nova_compute[192580]: 2025-10-08 15:40:53.892 2 INFO nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Using config drive#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.070 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.071 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12966MB free_disk=111.18193435668945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.071 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.163 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ccf8be13-2e93-495d-ac4a-2cff54baa4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.163 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f00f4363-87ff-45bb-b619-95a364353d0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.164 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.164 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.234 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.253 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.266 2 INFO nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Creating config drive at /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.config#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.277 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv15xd__y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.305 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.305 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.412 2 DEBUG oslo_concurrency.processutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv15xd__y" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:40:54 np0005476733 kernel: tap84e96b77-41: entered promiscuous mode
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.4992] manager: (tap84e96b77-41): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Oct  8 11:40:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:54Z|00531|binding|INFO|Claiming lport 84e96b77-4119-4da7-ad84-5cf394586e03 for this chassis.
Oct  8 11:40:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:54Z|00532|binding|INFO|84e96b77-4119-4da7-ad84-5cf394586e03: Claiming fa:16:3e:53:2d:ec 192.168.6.153
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.512 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:2d:ec 192.168.6.153'], port_security=['fa:16:3e:53:2d:ec 192.168.6.153'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.6.153/24', 'neutron:device_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43ff2ef3-241a-4d61-bde9-cb2730e8ed48, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=84e96b77-4119-4da7-ad84-5cf394586e03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.514 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 84e96b77-4119-4da7-ad84-5cf394586e03 in datapath 87537fad-af8f-4eae-8420-dce1a4fd9a36 bound to our chassis#033[00m
Oct  8 11:40:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:54Z|00533|binding|INFO|Setting lport 84e96b77-4119-4da7-ad84-5cf394586e03 ovn-installed in OVS
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:54Z|00534|binding|INFO|Setting lport 84e96b77-4119-4da7-ad84-5cf394586e03 up in Southbound
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.516 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 87537fad-af8f-4eae-8420-dce1a4fd9a36#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.530 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f440b5b1-93bf-497e-9d01-3aa82752a0d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.531 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap87537fad-a1 in ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.532 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap87537fad-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.532 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cd35b7bc-2edd-4eb8-a798-438056113fde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.533 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[71804c3f-10ef-473b-b495-e4fab0413d2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 systemd-udevd[238057]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:40:54 np0005476733 systemd-machined[152624]: New machine qemu-38-instance-00000042.
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.5538] device (tap84e96b77-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.5549] device (tap84e96b77-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.556 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6c6761-0816-4194-8f5c-054a69e3271c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 systemd[1]: Started Virtual Machine qemu-38-instance-00000042.
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.573 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9c8080-a1b1-4141-a8ba-a301d742d6df]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.604 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fea54b18-9932-437f-b6b8-688ffb3f0a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.608 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b82ed1e3-a1b3-4c80-865b-b649cc3b35b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.6095] manager: (tap87537fad-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Oct  8 11:40:54 np0005476733 systemd-udevd[238061]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.645 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5829ea1d-ae22-4ce4-af9b-199195d3bbfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.649 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4444daf9-a88b-4f4f-98dc-7aea193b859f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.6696] device (tap87537fad-a0): carrier: link connected
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.674 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[6d57cfa2-8943-4182-a8dc-fb038e351e4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.692 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e76ccb-9bf6-4e2f-9ecf-7cc7fb5e234b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87537fad-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:de:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495833, 'reachable_time': 31199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238090, 'error': None, 'target': 'ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.708 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9cb860-b7e0-4b26-9b75-7b26d2312d96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7a:de29'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495833, 'tstamp': 495833}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238091, 'error': None, 'target': 'ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.723 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3e55643b-5c4d-4881-88aa-99d74d4f452a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap87537fad-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:de:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495833, 'reachable_time': 31199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238092, 'error': None, 'target': 'ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.755 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d55f898e-606b-4817-b895-a13005a6884a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.815 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d7055062-8bf5-46d5-91c3-ad74f01618ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.816 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87537fad-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.816 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.817 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87537fad-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 kernel: tap87537fad-a0: entered promiscuous mode
Oct  8 11:40:54 np0005476733 NetworkManager[51699]: <info>  [1759938054.8198] manager: (tap87537fad-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.821 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap87537fad-a0, col_values=(('external_ids', {'iface-id': 'eec8c715-0284-4882-81ad-ac0ec5c712a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 ovn_controller[94857]: 2025-10-08T15:40:54Z|00535|binding|INFO|Releasing lport eec8c715-0284-4882-81ad-ac0ec5c712a5 from this chassis (sb_readonly=0)
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.837 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/87537fad-af8f-4eae-8420-dce1a4fd9a36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/87537fad-af8f-4eae-8420-dce1a4fd9a36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.838 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[785307aa-ae6e-4a00-9351-61eb62dc518f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.839 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-87537fad-af8f-4eae-8420-dce1a4fd9a36
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/87537fad-af8f-4eae-8420-dce1a4fd9a36.pid.haproxy
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 87537fad-af8f-4eae-8420-dce1a4fd9a36
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:40:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:40:54.840 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'env', 'PROCESS_TAG=haproxy-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/87537fad-af8f-4eae-8420-dce1a4fd9a36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.898 2 DEBUG nova.compute.manager [req-cc7813f8-5f6c-4f73-92aa-b1f4afe25809 req-41fb3097-dd31-4fca-aaee-8cacfdaf25fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.898 2 DEBUG oslo_concurrency.lockutils [req-cc7813f8-5f6c-4f73-92aa-b1f4afe25809 req-41fb3097-dd31-4fca-aaee-8cacfdaf25fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.899 2 DEBUG oslo_concurrency.lockutils [req-cc7813f8-5f6c-4f73-92aa-b1f4afe25809 req-41fb3097-dd31-4fca-aaee-8cacfdaf25fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.899 2 DEBUG oslo_concurrency.lockutils [req-cc7813f8-5f6c-4f73-92aa-b1f4afe25809 req-41fb3097-dd31-4fca-aaee-8cacfdaf25fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.899 2 DEBUG nova.compute.manager [req-cc7813f8-5f6c-4f73-92aa-b1f4afe25809 req-41fb3097-dd31-4fca-aaee-8cacfdaf25fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Processing event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.954 2 DEBUG nova.network.neutron [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updated VIF entry in instance network info cache for port 84e96b77-4119-4da7-ad84-5cf394586e03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.955 2 DEBUG nova.network.neutron [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:40:54 np0005476733 nova_compute[192580]: 2025-10-08 15:40:54.973 2 DEBUG oslo_concurrency.lockutils [req-b4aafe93-1216-4661-a157-b609c5048586 req-5fc97174-d15f-4c4d-83fe-4fc5e84860f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:40:55 np0005476733 podman[238131]: 2025-10-08 15:40:55.234229074 +0000 UTC m=+0.056020110 container create 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:40:55 np0005476733 systemd[1]: Started libpod-conmon-40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17.scope.
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.291 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938055.2907825, f00f4363-87ff-45bb-b619-95a364353d0b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.291 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] VM Started (Lifecycle Event)#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.293 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.297 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:40:55 np0005476733 podman[238131]: 2025-10-08 15:40:55.204775583 +0000 UTC m=+0.026566669 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.302 2 INFO nova.virt.libvirt.driver [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Instance spawned successfully.#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.302 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.309 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.312 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:40:55 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:40:55 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2593a17f894e3b904518a1439c4c4419c51daa064012b625d0e6f1040d46370a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.330 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.331 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.331 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.331 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.332 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.332 2 DEBUG nova.virt.libvirt.driver [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:40:55 np0005476733 podman[238131]: 2025-10-08 15:40:55.334724713 +0000 UTC m=+0.156515769 container init 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.335 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.336 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938055.2933264, f00f4363-87ff-45bb-b619-95a364353d0b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.336 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:40:55 np0005476733 podman[238131]: 2025-10-08 15:40:55.341778178 +0000 UTC m=+0.163569214 container start 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:40:55 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [NOTICE]   (238150) : New worker (238152) forked
Oct  8 11:40:55 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [NOTICE]   (238150) : Loading success.
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.368 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.370 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938055.2969096, f00f4363-87ff-45bb-b619-95a364353d0b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.370 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.399 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.402 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.418 2 INFO nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Took 4.52 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.418 2 DEBUG nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.430 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.486 2 INFO nova.compute.manager [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Took 5.09 seconds to build instance.#033[00m
Oct  8 11:40:55 np0005476733 nova_compute[192580]: 2025-10-08 15:40:55.511 2 DEBUG oslo_concurrency.lockutils [None req-a9adb994-20cf-434a-abb0-1ee483055090 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:56 np0005476733 nova_compute[192580]: 2025-10-08 15:40:56.306 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:57 np0005476733 podman[238162]: 2025-10-08 15:40:57.237746277 +0000 UTC m=+0.061836796 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 11:40:57 np0005476733 podman[238161]: 2025-10-08 15:40:57.271221946 +0000 UTC m=+0.087873527 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.002 2 DEBUG nova.compute.manager [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.002 2 DEBUG oslo_concurrency.lockutils [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.003 2 DEBUG oslo_concurrency.lockutils [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.003 2 DEBUG oslo_concurrency.lockutils [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.003 2 DEBUG nova.compute.manager [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] No waiting events found dispatching network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.004 2 WARNING nova.compute.manager [req-5fac3c75-65c9-4c81-ac42-a1f180fe42b7 req-030cf702-2b2e-40bb-a8f3-3bc651650272 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received unexpected event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.080 2 INFO nova.compute.manager [None req-f4992cec-a024-48a4-b029-4693127d10dc d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.089 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:40:58 np0005476733 nova_compute[192580]: 2025-10-08 15:40:58.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:40:59 np0005476733 nova_compute[192580]: 2025-10-08 15:40:59.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:00 np0005476733 nova_compute[192580]: 2025-10-08 15:41:00.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.868 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.869 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.869 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:41:01 np0005476733 nova_compute[192580]: 2025-10-08 15:41:01.869 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ccf8be13-2e93-495d-ac4a-2cff54baa4fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:41:03 np0005476733 nova_compute[192580]: 2025-10-08 15:41:03.209 2 INFO nova.compute.manager [None req-0ce04fe1-8abe-4845-9c54-f475d820c466 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:03 np0005476733 nova_compute[192580]: 2025-10-08 15:41:03.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:05 np0005476733 podman[238208]: 2025-10-08 15:41:05.23617962 +0000 UTC m=+0.060614336 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:41:05 np0005476733 podman[238210]: 2025-10-08 15:41:05.239859548 +0000 UTC m=+0.058412246 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9)
Oct  8 11:41:05 np0005476733 podman[238209]: 2025-10-08 15:41:05.24400461 +0000 UTC m=+0.065221633 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:41:05 np0005476733 nova_compute[192580]: 2025-10-08 15:41:05.309 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [{"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:41:05 np0005476733 nova_compute[192580]: 2025-10-08 15:41:05.333 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-ccf8be13-2e93-495d-ac4a-2cff54baa4fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:41:05 np0005476733 nova_compute[192580]: 2025-10-08 15:41:05.333 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:41:05 np0005476733 nova_compute[192580]: 2025-10-08 15:41:05.335 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:05 np0005476733 nova_compute[192580]: 2025-10-08 15:41:05.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:08 np0005476733 nova_compute[192580]: 2025-10-08 15:41:08.358 2 INFO nova.compute.manager [None req-96dafbb4-5fbf-438f-ac18-185f16fe0e27 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:08 np0005476733 nova_compute[192580]: 2025-10-08 15:41:08.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:08 np0005476733 nova_compute[192580]: 2025-10-08 15:41:08.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:10 np0005476733 nova_compute[192580]: 2025-10-08 15:41:10.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:13 np0005476733 nova_compute[192580]: 2025-10-08 15:41:13.533 2 INFO nova.compute.manager [None req-4b91e8fd-2397-4b41-ae32-5bfabedf87a3 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:13 np0005476733 nova_compute[192580]: 2025-10-08 15:41:13.538 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:41:13 np0005476733 nova_compute[192580]: 2025-10-08 15:41:13.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:15 np0005476733 podman[238275]: 2025-10-08 15:41:15.226985632 +0000 UTC m=+0.052674394 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:41:15 np0005476733 podman[238276]: 2025-10-08 15:41:15.227009833 +0000 UTC m=+0.048385907 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:41:15 np0005476733 nova_compute[192580]: 2025-10-08 15:41:15.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:18 np0005476733 nova_compute[192580]: 2025-10-08 15:41:18.687 2 INFO nova.compute.manager [None req-75d315f2-363a-40d0-b61d-c16b16751299 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:18 np0005476733 nova_compute[192580]: 2025-10-08 15:41:18.691 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:41:18 np0005476733 nova_compute[192580]: 2025-10-08 15:41:18.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:19Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:2d:ec 192.168.6.153
Oct  8 11:41:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:19Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:2d:ec 192.168.6.153
Oct  8 11:41:20 np0005476733 nova_compute[192580]: 2025-10-08 15:41:20.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:23 np0005476733 nova_compute[192580]: 2025-10-08 15:41:23.828 2 INFO nova.compute.manager [None req-2fcb73f8-f1b0-4fb2-91f5-e2c803afba67 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:23 np0005476733 nova_compute[192580]: 2025-10-08 15:41:23.832 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:41:23 np0005476733 nova_compute[192580]: 2025-10-08 15:41:23.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:24 np0005476733 podman[238315]: 2025-10-08 15:41:24.230964297 +0000 UTC m=+0.055315157 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 11:41:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:24Z|00536|pinctrl|WARN|Dropped 1511 log messages in last 61 seconds (most recently, 23 seconds ago) due to excessive rate
Oct  8 11:41:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:24Z|00537|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:41:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:24Z|00538|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Oct  8 11:41:25 np0005476733 nova_compute[192580]: 2025-10-08 15:41:25.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:26.329 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:26.330 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:26.331 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:28 np0005476733 podman[238335]: 2025-10-08 15:41:28.234966128 +0000 UTC m=+0.060468952 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 11:41:28 np0005476733 podman[238334]: 2025-10-08 15:41:28.253278563 +0000 UTC m=+0.085955517 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:41:28 np0005476733 nova_compute[192580]: 2025-10-08 15:41:28.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:29 np0005476733 nova_compute[192580]: 2025-10-08 15:41:29.006 2 INFO nova.compute.manager [None req-fdf0ee0f-5393-428d-9c67-130904b3f701 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:29 np0005476733 nova_compute[192580]: 2025-10-08 15:41:29.011 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:41:29 np0005476733 nova_compute[192580]: 2025-10-08 15:41:29.013 2 INFO nova.virt.libvirt.driver [None req-fdf0ee0f-5393-428d-9c67-130904b3f701 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Truncated console log returned, 3153 bytes ignored#033[00m
Oct  8 11:41:30 np0005476733 nova_compute[192580]: 2025-10-08 15:41:30.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:33 np0005476733 nova_compute[192580]: 2025-10-08 15:41:33.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:34 np0005476733 nova_compute[192580]: 2025-10-08 15:41:34.300 2 INFO nova.compute.manager [None req-cd78569f-a333-42ab-abf1-92531c0cd26e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Get console output#033[00m
Oct  8 11:41:34 np0005476733 nova_compute[192580]: 2025-10-08 15:41:34.306 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:41:34 np0005476733 nova_compute[192580]: 2025-10-08 15:41:34.310 2 INFO nova.virt.libvirt.driver [None req-cd78569f-a333-42ab-abf1-92531c0cd26e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Truncated console log returned, 3367 bytes ignored#033[00m
Oct  8 11:41:35 np0005476733 nova_compute[192580]: 2025-10-08 15:41:35.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:36 np0005476733 podman[238399]: 2025-10-08 15:41:36.223458674 +0000 UTC m=+0.050354789 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:41:36 np0005476733 podman[238400]: 2025-10-08 15:41:36.242047997 +0000 UTC m=+0.064915664 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64)
Oct  8 11:41:36 np0005476733 podman[238398]: 2025-10-08 15:41:36.251322283 +0000 UTC m=+0.081989919 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:41:37 np0005476733 nova_compute[192580]: 2025-10-08 15:41:37.385 2 DEBUG nova.compute.manager [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-changed-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:41:37 np0005476733 nova_compute[192580]: 2025-10-08 15:41:37.386 2 DEBUG nova.compute.manager [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Refreshing instance network info cache due to event network-changed-84e96b77-4119-4da7-ad84-5cf394586e03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:41:37 np0005476733 nova_compute[192580]: 2025-10-08 15:41:37.386 2 DEBUG oslo_concurrency.lockutils [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:41:37 np0005476733 nova_compute[192580]: 2025-10-08 15:41:37.386 2 DEBUG oslo_concurrency.lockutils [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:41:37 np0005476733 nova_compute[192580]: 2025-10-08 15:41:37.387 2 DEBUG nova.network.neutron [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Refreshing network info cache for port 84e96b77-4119-4da7-ad84-5cf394586e03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:41:38 np0005476733 nova_compute[192580]: 2025-10-08 15:41:38.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:40 np0005476733 nova_compute[192580]: 2025-10-08 15:41:40.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:40 np0005476733 nova_compute[192580]: 2025-10-08 15:41:40.913 2 DEBUG nova.network.neutron [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updated VIF entry in instance network info cache for port 84e96b77-4119-4da7-ad84-5cf394586e03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:41:40 np0005476733 nova_compute[192580]: 2025-10-08 15:41:40.913 2 DEBUG nova.network.neutron [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:41:40 np0005476733 nova_compute[192580]: 2025-10-08 15:41:40.935 2 DEBUG oslo_concurrency.lockutils [req-df3dd9b9-7201-4175-8a67-ed7a07175db7 req-5035c4fe-89e2-42c5-a22b-6e1c987d9ba5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:41:43 np0005476733 nova_compute[192580]: 2025-10-08 15:41:43.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:45 np0005476733 nova_compute[192580]: 2025-10-08 15:41:45.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:46 np0005476733 podman[238465]: 2025-10-08 15:41:46.229479471 +0000 UTC m=+0.054180552 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:41:46 np0005476733 podman[238464]: 2025-10-08 15:41:46.251951688 +0000 UTC m=+0.080661467 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 11:41:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:47.258 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:41:47 np0005476733 nova_compute[192580]: 2025-10-08 15:41:47.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:47.262 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:41:48 np0005476733 nova_compute[192580]: 2025-10-08 15:41:48.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:50 np0005476733 nova_compute[192580]: 2025-10-08 15:41:50.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:50 np0005476733 nova_compute[192580]: 2025-10-08 15:41:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.266 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.495 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.496 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.496 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.497 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.497 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.498 2 INFO nova.compute.manager [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Terminating instance#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.499 2 DEBUG nova.compute.manager [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:41:53 np0005476733 kernel: tapd61bf8bf-d2 (unregistering): left promiscuous mode
Oct  8 11:41:53 np0005476733 NetworkManager[51699]: <info>  [1759938113.5385] device (tapd61bf8bf-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00539|binding|INFO|Releasing lport d61bf8bf-d254-433d-b32a-427cbb791a7f from this chassis (sb_readonly=0)
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00540|binding|INFO|Setting lport d61bf8bf-d254-433d-b32a-427cbb791a7f down in Southbound
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00541|binding|INFO|Removing iface tapd61bf8bf-d2 ovn-installed in OVS
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.558 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:a8:1d 10.100.0.5'], port_security=['fa:16:3e:4a:a8:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=d61bf8bf-d254-433d-b32a-427cbb791a7f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.559 103739 INFO neutron.agent.ovn.metadata.agent [-] Port d61bf8bf-d254-433d-b32a-427cbb791a7f in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac unbound from our chassis#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.561 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.562 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df317a44-9246-4cb0-9ac2-0749e89d5bd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.564 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace which is not needed anymore#033[00m
Oct  8 11:41:53 np0005476733 kernel: tap3e387217-65 (unregistering): left promiscuous mode
Oct  8 11:41:53 np0005476733 NetworkManager[51699]: <info>  [1759938113.5884] device (tap3e387217-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00542|binding|INFO|Releasing lport 3e387217-655c-4ce1-9a54-18395a63adb4 from this chassis (sb_readonly=0)
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00543|binding|INFO|Setting lport 3e387217-655c-4ce1-9a54-18395a63adb4 down in Southbound
Oct  8 11:41:53 np0005476733 ovn_controller[94857]: 2025-10-08T15:41:53Z|00544|binding|INFO|Removing iface tap3e387217-65 ovn-installed in OVS
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.601 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:58:9a 2001:db9::344'], port_security=['fa:16:3e:cd:58:9a 2001:db9::344'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:cidrs': '2001:db9::344/64', 'neutron:device_id': 'ccf8be13-2e93-495d-ac4a-2cff54baa4fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5557e28c-0838-4cc2-bde1-1d8616c6ce66', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-0', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b29998f5-0e10-45e3-9be1-b1005e0d6bad, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=3e387217-655c-4ce1-9a54-18395a63adb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000040.scope: Deactivated successfully.
Oct  8 11:41:53 np0005476733 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000040.scope: Consumed 50.347s CPU time.
Oct  8 11:41:53 np0005476733 systemd-machined[152624]: Machine qemu-37-instance-00000040 terminated.
Oct  8 11:41:53 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [NOTICE]   (237612) : haproxy version is 2.8.14-c23fe91
Oct  8 11:41:53 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [NOTICE]   (237612) : path to executable is /usr/sbin/haproxy
Oct  8 11:41:53 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [WARNING]  (237612) : Exiting Master process...
Oct  8 11:41:53 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [ALERT]    (237612) : Current worker (237614) exited with code 143 (Terminated)
Oct  8 11:41:53 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[237608]: [WARNING]  (237612) : All workers exited. Exiting... (0)
Oct  8 11:41:53 np0005476733 systemd[1]: libpod-9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4.scope: Deactivated successfully.
Oct  8 11:41:53 np0005476733 podman[238552]: 2025-10-08 15:41:53.709948013 +0000 UTC m=+0.046416324 container died 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 NetworkManager[51699]: <info>  [1759938113.7330] manager: (tap3e387217-65): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4-userdata-shm.mount: Deactivated successfully.
Oct  8 11:41:53 np0005476733 systemd[1]: var-lib-containers-storage-overlay-87e0e5fdb0d9999b8ae0ef304ca19c06c78635ad6f94410546b91350cdd22a77-merged.mount: Deactivated successfully.
Oct  8 11:41:53 np0005476733 podman[238552]: 2025-10-08 15:41:53.760695002 +0000 UTC m=+0.097163283 container cleanup 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 11:41:53 np0005476733 systemd[1]: libpod-conmon-9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4.scope: Deactivated successfully.
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.781 2 INFO nova.virt.libvirt.driver [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Instance destroyed successfully.#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.782 2 DEBUG nova.objects.instance [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'resources' on Instance uuid ccf8be13-2e93-495d-ac4a-2cff54baa4fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.796 2 DEBUG nova.virt.libvirt.vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:39:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:39:56Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": 
"tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.798 2 DEBUG nova.network.os_vif_util [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "address": "fa:16:3e:4a:a8:1d", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd61bf8bf-d2", "ovs_interfaceid": "d61bf8bf-d254-433d-b32a-427cbb791a7f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.799 2 DEBUG nova.network.os_vif_util [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.799 2 DEBUG os_vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.802 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd61bf8bf-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.818 2 INFO os_vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:a8:1d,bridge_name='br-int',has_traffic_filtering=True,id=d61bf8bf-d254-433d-b32a-427cbb791a7f,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd61bf8bf-d2')#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.819 2 DEBUG nova.virt.libvirt.vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:39:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-0',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-0',id=64,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:39:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-zarp8bd9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:39:56Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=ccf8be13-2e93-495d-ac4a-2cff54baa4fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": 
"second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.819 2 DEBUG nova.network.os_vif_util [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "3e387217-655c-4ce1-9a54-18395a63adb4", "address": "fa:16:3e:cd:58:9a", "network": {"id": "5557e28c-0838-4cc2-bde1-1d8616c6ce66", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "2001:db9::/64", "dns": [], "gateway": {"address": "2001:db9::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db9::344", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e387217-65", "ovs_interfaceid": "3e387217-655c-4ce1-9a54-18395a63adb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.820 2 DEBUG nova.network.os_vif_util [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.820 2 DEBUG os_vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.821 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e387217-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.825 2 INFO os_vif [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:58:9a,bridge_name='br-int',has_traffic_filtering=True,id=3e387217-655c-4ce1-9a54-18395a63adb4,network=Network(5557e28c-0838-4cc2-bde1-1d8616c6ce66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3e387217-65')#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.825 2 INFO nova.virt.libvirt.driver [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Deleting instance files /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb_del#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.826 2 INFO nova.virt.libvirt.driver [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Deletion of /var/lib/nova/instances/ccf8be13-2e93-495d-ac4a-2cff54baa4fb_del complete#033[00m
Oct  8 11:41:53 np0005476733 podman[238606]: 2025-10-08 15:41:53.834946254 +0000 UTC m=+0.048657095 container remove 9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.840 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6f8721-b53b-4bb2-9dcc-0785798ff9e9]: (4, ('Wed Oct  8 03:41:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4)\n9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4\nWed Oct  8 03:41:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4)\n9ff65f44bb9ae9ef77c0e9f8e5c41094db9267bfa8107124b97751a477ec92e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.842 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5570c1c9-0664-4da9-a11b-8c7cc940d3e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.842 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 kernel: tap2bf87bc3-30: left promiscuous mode
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.850 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ada07a-5ff4-4a6d-ac16-c2a6dd45219c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.878 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3648c061-1d29-46f3-a0d9-040920e8295a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.880 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7743916c-6d23-4a16-90cb-61301c6a55fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.901 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[20c70039-572b-4797-8efb-36dbaa127b7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489706, 'reachable_time': 29543, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238623, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.904 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:41:53 np0005476733 systemd[1]: run-netns-ovnmeta\x2d2bf87bc3\x2d3d0a\x2d4d8a\x2db41e\x2d00010e6b47ac.mount: Deactivated successfully.
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.904 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[cd174531-8436-4042-900d-39503bfc8733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.907 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 3e387217-655c-4ce1-9a54-18395a63adb4 in datapath 5557e28c-0838-4cc2-bde1-1d8616c6ce66 unbound from our chassis#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.908 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5557e28c-0838-4cc2-bde1-1d8616c6ce66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:41:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:41:53.909 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4afb871e-ed4b-423b-94e7-2dae8cd977f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.911 2 INFO nova.compute.manager [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.911 2 DEBUG oslo.service.loopingcall [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.912 2 DEBUG nova.compute.manager [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:41:53 np0005476733 nova_compute[192580]: 2025-10-08 15:41:53.912 2 DEBUG nova.network.neutron [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.674 2 DEBUG nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-unplugged-3e387217-655c-4ce1-9a54-18395a63adb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.674 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.675 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.675 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.676 2 DEBUG nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No waiting events found dispatching network-vif-unplugged-3e387217-655c-4ce1-9a54-18395a63adb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.676 2 DEBUG nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-unplugged-3e387217-655c-4ce1-9a54-18395a63adb4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.677 2 DEBUG nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.677 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.677 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.678 2 DEBUG oslo_concurrency.lockutils [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.678 2 DEBUG nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No waiting events found dispatching network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.678 2 WARNING nova.compute.manager [req-f3eaea6c-c924-4040-9332-8aad823c5cf3 req-8b7e84a5-e695-4c74-b31d-0aeacc3a20ac 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received unexpected event network-vif-plugged-3e387217-655c-4ce1-9a54-18395a63adb4 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.744 2 DEBUG nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-unplugged-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.744 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.744 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.745 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.745 2 DEBUG nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No waiting events found dispatching network-vif-unplugged-d61bf8bf-d254-433d-b32a-427cbb791a7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.745 2 DEBUG nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-unplugged-d61bf8bf-d254-433d-b32a-427cbb791a7f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.746 2 DEBUG nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.746 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.746 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.747 2 DEBUG oslo_concurrency.lockutils [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.747 2 DEBUG nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] No waiting events found dispatching network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:41:54 np0005476733 nova_compute[192580]: 2025-10-08 15:41:54.747 2 WARNING nova.compute.manager [req-cf7205f4-46b8-4bc2-9aed-c1a8e048594e req-d9e481b6-3b7b-4c32-87f6-3a6d8e7914bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Received unexpected event network-vif-plugged-d61bf8bf-d254-433d-b32a-427cbb791a7f for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:41:55 np0005476733 podman[238624]: 2025-10-08 15:41:55.281431298 +0000 UTC m=+0.103390953 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.615 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.687 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.749 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.749 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.813 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.974 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.976 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12993MB free_disk=111.18992614746094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.976 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:55 np0005476733 nova_compute[192580]: 2025-10-08 15:41:55.976 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.182 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ccf8be13-2e93-495d-ac4a-2cff54baa4fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.183 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f00f4363-87ff-45bb-b619-95a364353d0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.183 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.183 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.249 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.274 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.301 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.301 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.469 2 DEBUG nova.network.neutron [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.490 2 INFO nova.compute.manager [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Took 2.58 seconds to deallocate network for instance.#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.524 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.525 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.586 2 DEBUG nova.compute.provider_tree [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.602 2 DEBUG nova.scheduler.client.report [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.622 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.658 2 INFO nova.scheduler.client.report [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Deleted allocations for instance ccf8be13-2e93-495d-ac4a-2cff54baa4fb#033[00m
Oct  8 11:41:56 np0005476733 nova_compute[192580]: 2025-10-08 15:41:56.719 2 DEBUG oslo_concurrency.lockutils [None req-7708c68d-ebfc-4dcb-841b-66febab5d1d1 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "ccf8be13-2e93-495d-ac4a-2cff54baa4fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:41:58 np0005476733 nova_compute[192580]: 2025-10-08 15:41:58.301 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:58 np0005476733 nova_compute[192580]: 2025-10-08 15:41:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:41:58 np0005476733 nova_compute[192580]: 2025-10-08 15:41:58.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:41:58 np0005476733 nova_compute[192580]: 2025-10-08 15:41:58.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:41:59 np0005476733 podman[238653]: 2025-10-08 15:41:59.232979474 +0000 UTC m=+0.058459748 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  8 11:41:59 np0005476733 podman[238652]: 2025-10-08 15:41:59.25416169 +0000 UTC m=+0.082894208 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Oct  8 11:41:59 np0005476733 nova_compute[192580]: 2025-10-08 15:41:59.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:00 np0005476733 nova_compute[192580]: 2025-10-08 15:42:00.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:01 np0005476733 nova_compute[192580]: 2025-10-08 15:42:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:01 np0005476733 nova_compute[192580]: 2025-10-08 15:42:01.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:42:01 np0005476733 nova_compute[192580]: 2025-10-08 15:42:01.884 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:42:01 np0005476733 nova_compute[192580]: 2025-10-08 15:42:01.885 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:42:01 np0005476733 nova_compute[192580]: 2025-10-08 15:42:01.885 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:42:03 np0005476733 nova_compute[192580]: 2025-10-08 15:42:03.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:04 np0005476733 nova_compute[192580]: 2025-10-08 15:42:04.102 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:42:04 np0005476733 nova_compute[192580]: 2025-10-08 15:42:04.140 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:42:04 np0005476733 nova_compute[192580]: 2025-10-08 15:42:04.141 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:42:04 np0005476733 nova_compute[192580]: 2025-10-08 15:42:04.142 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:05 np0005476733 nova_compute[192580]: 2025-10-08 15:42:05.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:07 np0005476733 podman[238700]: 2025-10-08 15:42:07.238556244 +0000 UTC m=+0.054044128 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:42:07 np0005476733 podman[238699]: 2025-10-08 15:42:07.24094168 +0000 UTC m=+0.058993606 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:42:07 np0005476733 podman[238701]: 2025-10-08 15:42:07.248912624 +0000 UTC m=+0.062435565 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container)
Oct  8 11:42:08 np0005476733 nova_compute[192580]: 2025-10-08 15:42:08.780 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938113.779217, ccf8be13-2e93-495d-ac4a-2cff54baa4fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:42:08 np0005476733 nova_compute[192580]: 2025-10-08 15:42:08.780 2 INFO nova.compute.manager [-] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:42:08 np0005476733 nova_compute[192580]: 2025-10-08 15:42:08.798 2 DEBUG nova.compute.manager [None req-d0cbf453-4339-483e-b65e-c309790fbabe - - - - - -] [instance: ccf8be13-2e93-495d-ac4a-2cff54baa4fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:42:08 np0005476733 nova_compute[192580]: 2025-10-08 15:42:08.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:10 np0005476733 nova_compute[192580]: 2025-10-08 15:42:10.134 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:10 np0005476733 nova_compute[192580]: 2025-10-08 15:42:10.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:10 np0005476733 nova_compute[192580]: 2025-10-08 15:42:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:13 np0005476733 nova_compute[192580]: 2025-10-08 15:42:13.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:15 np0005476733 nova_compute[192580]: 2025-10-08 15:42:15.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:17 np0005476733 podman[238766]: 2025-10-08 15:42:17.245123749 +0000 UTC m=+0.060871985 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:42:17 np0005476733 podman[238765]: 2025-10-08 15:42:17.251640406 +0000 UTC m=+0.074772168 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 11:42:18 np0005476733 nova_compute[192580]: 2025-10-08 15:42:18.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:20 np0005476733 nova_compute[192580]: 2025-10-08 15:42:20.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:23 np0005476733 nova_compute[192580]: 2025-10-08 15:42:23.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:25 np0005476733 nova_compute[192580]: 2025-10-08 15:42:25.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:42:25Z|00545|pinctrl|WARN|Dropped 2281 log messages in last 61 seconds (most recently, 13 seconds ago) due to excessive rate
Oct  8 11:42:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:42:25Z|00546|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:42:26 np0005476733 podman[238806]: 2025-10-08 15:42:26.216283705 +0000 UTC m=+0.046950390 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:42:26.329 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:42:26.330 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:42:26.330 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:42:28 np0005476733 nova_compute[192580]: 2025-10-08 15:42:28.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:30 np0005476733 podman[238826]: 2025-10-08 15:42:30.257523316 +0000 UTC m=+0.077779865 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:42:30 np0005476733 podman[238825]: 2025-10-08 15:42:30.280954923 +0000 UTC m=+0.113461014 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 11:42:30 np0005476733 nova_compute[192580]: 2025-10-08 15:42:30.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:33 np0005476733 nova_compute[192580]: 2025-10-08 15:42:33.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:35 np0005476733 nova_compute[192580]: 2025-10-08 15:42:35.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.013 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000042', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.045 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/memory.usage volume: 259.0546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd13da35-35af-4fb9-8023-98210f53f2c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 259.0546875, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'timestamp': '2025-10-08T15:42:36.014907', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '69aec3c6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.767959393, 'message_signature': '7d3162a8f8fa099a9ae27f9fedb375488840d915784aa740188df30a83c58529'}]}, 'timestamp': '2025-10-08 15:42:36.046429', '_unique_id': 'b0d770859f2342f98699f881ecb07bbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.087 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.requests volume: 11687 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.088 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fddd4f5e-58e0-4901-b5bc-7f35bf84999b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11687, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.049810', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69b536f2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'd9f77fa9cd37b3275ee2968de25dd448062156ce19e337076e0736e8c6c7f2cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.049810', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69b551aa-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': '45629671c95c16f30e6cbfb7061494b8799bf801112f027c67a583509a2a4c04'}]}, 'timestamp': '2025-10-08 15:42:36.089337', '_unique_id': '7d679bcba66242e8bf304672eecbf9dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.092 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.093 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>]
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.097 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f00f4363-87ff-45bb-b619-95a364353d0b / tap84e96b77-41 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.098 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85ee8b2c-acad-4b0b-8a77-b6b756480917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.093814', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69b6d1c4-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '4a088bd29aa3f51f35fa72ee13e0005c594941d7aff73c8c286761a7834ed717'}]}, 'timestamp': '2025-10-08 15:42:36.099259', '_unique_id': 'c198f22cf0bd4d58ae0987d15e11a5b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.102 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.latency volume: 7364878508 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.103 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.latency volume: 53291198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d39675b-3518-46fd-8052-acc02167f45f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7364878508, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.102338', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69b76a8a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'c64538a8dc60124849bae7cdd7dd53b295d92557709112f41b6c1303535ed3d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53291198, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.102338', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69b78786-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': '1f6b114af8cc521d2ab8d9f651074e317df98f743102b37b18425a3a357a5dbf'}]}, 'timestamp': '2025-10-08 15:42:36.103797', '_unique_id': '06a7c6d8c0db43ee916ea4bb2fe530d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.123 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.124 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6336ea70-1e2f-4b10-9432-f283db7b0b79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.107943', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69baa646-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': 'd545b849408b02fdac7d110dda6c6ebaa0b220cf175143724e54ab706c470dab'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.107943', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bab604-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': '7316c2a4af8d8678ebe2d8a065cfbae81b426c0b3f8e2340ac3d00c33139ec98'}]}, 'timestamp': '2025-10-08 15:42:36.124430', '_unique_id': 'c902ae2784114efd8ba78fbe2be07eb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78a3bba7-3ce3-4a83-b52f-fedd38352fdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.126139', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bb0348-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': 'ef6e28b1ce4ca720571d830879deec7e9a16a38957c1c220a87489280472771c'}]}, 'timestamp': '2025-10-08 15:42:36.126406', '_unique_id': 'c20d57ce45ce4430ab82b294e98dbb31'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.127 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70b44f69-8242-4757-9f5b-94c552438db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.127630', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bb3c3c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '6001f11e8f51d062bfd1ccb3336d783c558fdb90065ebc9ebba4a456844bdfe5'}]}, 'timestamp': '2025-10-08 15:42:36.127857', '_unique_id': '52aece3848ca40c08f0b82084c1c24ff'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.128 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.incoming.packets volume: 296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e4162de-b531-49f5-b30d-390c78736079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 296, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.128971', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bb71a2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '260130f6287bd9f132f5fd3e44eb89270a57ec06da3a3ccecd046c282dcadc64'}]}, 'timestamp': '2025-10-08 15:42:36.129246', '_unique_id': 'd609769204e149df9d212b78f16ce3a1'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.130 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.outgoing.bytes volume: 66703 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e3c8441-db99-4c28-a0bc-a00e3ac358a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 66703, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.130447', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bbaa46-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '74ea07905a51719a137cffb54056c15750ea886f1698001c888246990a8cd962'}]}, 'timestamp': '2025-10-08 15:42:36.130672', '_unique_id': '783357b13e7e4a309a73a6cdd1370f2c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.131 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98439001-a0da-43e4-9d68-2229c8d6eb58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.131763', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bbddb8-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '3e82b525f720099dadcd6c65ca77d952f59758081eb495e8799897fb4a1daf8e'}]}, 'timestamp': '2025-10-08 15:42:36.131989', '_unique_id': '4f523f3346fc46898de822994cae4fa5'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d077bd5-89c3-40f1-b098-a78187247d80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.133059', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bc124c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '7ba383a0ebab9d9c3543c314a50397f95f69b6956ba163a407f1792322e94e67'}]}, 'timestamp': '2025-10-08 15:42:36.133353', '_unique_id': '59702edf902d466f8136e0e4a22c9663'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.134 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.134 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd660bf54-d007-4e65-8140-812af0ec8d32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.134444', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69bc485c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': '5507db2e8a7850862b8c278278885c16019e55ed4a7ea770877d793914f2635a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.134444', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bc50e0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': '70f7c18178b867a66d0a268abebad86cf294e882a976604b7821071588849e95'}]}, 'timestamp': '2025-10-08 15:42:36.134952', '_unique_id': 'c806019a0c00441cb687c7f90cd277c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.136 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.136 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>]
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.136 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.incoming.bytes volume: 51145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79259959-34ee-490e-a68a-0783c483c4e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 51145, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.136534', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bc98a2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': 'e7cb08367a8a2f0f3e7ded66af70fcc1d7d2f7820217c373bf66f8a53d9e954e'}]}, 'timestamp': '2025-10-08 15:42:36.136791', '_unique_id': '13e19295cd504b9698880ce8387a50dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>]
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.bytes volume: 330950144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.138 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb660261-7715-4b8f-9c56-6979883e5183', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330950144, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.138504', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69bce4f6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': '2d8f6b938b24ed8b68207301faa21dea82b4dff1243094c5fe358740c8ec0707'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.138504', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bced2a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'deb1ac4a02f19229a8435a581c5e7943c7c3494ada269b67525eb5feedb6ad39'}]}, 'timestamp': '2025-10-08 15:42:36.138926', '_unique_id': '264a6b46956149fca6cb006ed4b71253'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.outgoing.packets volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97dd0163-b83c-4bba-bcc8-05123e2b691e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 320, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.140058', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69bd233a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': '68d53e158ba304b68f3d22bfc2087cee7bde5ec529f8c8c1556fc4313a2c69bf'}]}, 'timestamp': '2025-10-08 15:42:36.140383', '_unique_id': '49b3f4a107e64b979272d65a2df3c13b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_tenant_network-1397692204>]
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.requests volume: 785 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.141 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6432e3a6-8f13-44d5-8d0b-dac5327658d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 785, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.141778', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69bd64e4-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'ed498b9f0a26fd7028c5416345ba2c321dc90837b8b3935af6ae92e9e8effe79'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.141778', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bd6d5e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': '7ef1445ba8481ae4f5a05be51700246d0cdef740f8f27d460510f473309f3e89'}]}, 'timestamp': '2025-10-08 15:42:36.142223', '_unique_id': 'b50500f575f8413a8d1cf1f0fca2b617'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.143 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.bytes volume: 136335360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.143 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c73a3f8c-c9d7-4f1d-8aed-b714a5e1b76f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136335360, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.143337', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69bda1a2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'bf1430e0c8887da8574f5b5529154463fbe820bc380797a75aba52cbe6e76b27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.143337', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bda972-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'baa86d402fe9a25c82102e88a7080bdb59e7eeba9547abca5232734abd97aa1e'}]}, 'timestamp': '2025-10-08 15:42:36.143747', '_unique_id': '7fadbfc2e3264df28937d4483c49749c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.144 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.latency volume: 8256200119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5244994c-0b63-4b1d-8b5d-2690770cb4f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8256200119, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.144866', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69bddd52-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': 'bbb76cfd1f3485fe62c8e8288488fd8388ed632f44403e43977fa32a45b86706'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.144866', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69bde644-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.772805147, 'message_signature': '51da4777ca7bf01f571d9d3fe908cd20e3dbe67d3e4ad7abd9557399bfe8af9d'}]}, 'timestamp': '2025-10-08 15:42:36.145322', '_unique_id': '1ee52ec0a84d46ec8f995f3fedebbf4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.146 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c93119db-256a-4d2f-9426-f79f6bffeb3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000042-f00f4363-87ff-45bb-b619-95a364353d0b-tap84e96b77-41', 'timestamp': '2025-10-08T15:42:36.146412', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'tap84e96b77-41', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:53:2d:ec', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap84e96b77-41'}, 'message_id': '69be1a10-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.816859674, 'message_signature': 'c87847f241d880ec37b395bda594da1b7ead7e886de85672ce4f40f930dee76f'}]}, 'timestamp': '2025-10-08 15:42:36.146640', '_unique_id': 'c1531fb9c8924acf83bcce4d05272c75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.usage volume: 153026560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.147 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8f62221-d7f2-4825-b73c-a81585165398', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153026560, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-vda', 'timestamp': '2025-10-08T15:42:36.147693', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '69be4bc0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': '4e03d63543ac2df01e62029840217f71cfb499c54211f554b6283ec51b531118'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b-sda', 'timestamp': '2025-10-08T15:42:36.147693', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '69be539a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.830984625, 'message_signature': '66a4d69cfbd8f00f97389834d23331dbc1c7347060ced9dfffc68eb811b7e71a'}]}, 'timestamp': '2025-10-08 15:42:36.148128', '_unique_id': 'b593f099e7e44da48c4fc18c252e1138'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 DEBUG ceilometer.compute.pollsters [-] f00f4363-87ff-45bb-b619-95a364353d0b/cpu volume: 42780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2620d151-c487-48ac-9f1a-213ef556c941', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42780000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'timestamp': '2025-10-08T15:42:36.149237', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_tenant_network-1397692204', 'name': 'instance-00000042', 'instance_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '69be887e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5059.767959393, 'message_signature': '14f963c31972d264585ec6ead408587260aa1d468348c63a403a08d87eb4401c'}]}, 'timestamp': '2025-10-08 15:42:36.149461', '_unique_id': '227e40e9028c493aba6b7666d27e6334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:42:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:42:38 np0005476733 podman[238870]: 2025-10-08 15:42:38.231860409 +0000 UTC m=+0.055325098 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:42:38 np0005476733 podman[238871]: 2025-10-08 15:42:38.242155508 +0000 UTC m=+0.059120229 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:42:38 np0005476733 podman[238869]: 2025-10-08 15:42:38.242175579 +0000 UTC m=+0.067148566 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:42:38 np0005476733 nova_compute[192580]: 2025-10-08 15:42:38.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:40 np0005476733 nova_compute[192580]: 2025-10-08 15:42:40.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:43 np0005476733 nova_compute[192580]: 2025-10-08 15:42:43.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:45 np0005476733 nova_compute[192580]: 2025-10-08 15:42:45.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:48 np0005476733 podman[238931]: 2025-10-08 15:42:48.237198394 +0000 UTC m=+0.060609047 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  8 11:42:48 np0005476733 podman[238932]: 2025-10-08 15:42:48.25149495 +0000 UTC m=+0.067993832 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:42:48 np0005476733 nova_compute[192580]: 2025-10-08 15:42:48.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:50 np0005476733 nova_compute[192580]: 2025-10-08 15:42:50.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:50 np0005476733 nova_compute[192580]: 2025-10-08 15:42:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:53 np0005476733 nova_compute[192580]: 2025-10-08 15:42:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:53 np0005476733 nova_compute[192580]: 2025-10-08 15:42:53.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:55 np0005476733 nova_compute[192580]: 2025-10-08 15:42:55.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.622 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.702 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.761 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.762 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.838 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:42:56 np0005476733 nova_compute[192580]: 2025-10-08 15:42:56.999 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.000 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12996MB free_disk=111.18994522094727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.000 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.000 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.087 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f00f4363-87ff-45bb-b619-95a364353d0b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.088 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.088 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.166 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.180 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.198 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:42:57 np0005476733 nova_compute[192580]: 2025-10-08 15:42:57.198 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:42:57 np0005476733 podman[238983]: 2025-10-08 15:42:57.240349154 +0000 UTC m=+0.067105924 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  8 11:42:58 np0005476733 nova_compute[192580]: 2025-10-08 15:42:58.198 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:58 np0005476733 nova_compute[192580]: 2025-10-08 15:42:58.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:42:58 np0005476733 nova_compute[192580]: 2025-10-08 15:42:58.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:42:58 np0005476733 nova_compute[192580]: 2025-10-08 15:42:58.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:00 np0005476733 nova_compute[192580]: 2025-10-08 15:43:00.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:00 np0005476733 nova_compute[192580]: 2025-10-08 15:43:00.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:01 np0005476733 podman[239002]: 2025-10-08 15:43:01.255711916 +0000 UTC m=+0.078455877 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:43:01 np0005476733 podman[239003]: 2025-10-08 15:43:01.269755064 +0000 UTC m=+0.086963608 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.919 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.919 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.920 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:43:02 np0005476733 nova_compute[192580]: 2025-10-08 15:43:02.920 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f00f4363-87ff-45bb-b619-95a364353d0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:43:03 np0005476733 nova_compute[192580]: 2025-10-08 15:43:03.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:04.502 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:43:04 np0005476733 nova_compute[192580]: 2025-10-08 15:43:04.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:04.504 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:43:05 np0005476733 nova_compute[192580]: 2025-10-08 15:43:05.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:05 np0005476733 nova_compute[192580]: 2025-10-08 15:43:05.627 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [{"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:43:05 np0005476733 nova_compute[192580]: 2025-10-08 15:43:05.654 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-f00f4363-87ff-45bb-b619-95a364353d0b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:43:05 np0005476733 nova_compute[192580]: 2025-10-08 15:43:05.655 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:43:05 np0005476733 nova_compute[192580]: 2025-10-08 15:43:05.655 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:08 np0005476733 nova_compute[192580]: 2025-10-08 15:43:08.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:09 np0005476733 podman[239049]: 2025-10-08 15:43:09.237471147 +0000 UTC m=+0.051541167 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:43:09 np0005476733 podman[239048]: 2025-10-08 15:43:09.254237922 +0000 UTC m=+0.071165153 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:43:09 np0005476733 podman[239050]: 2025-10-08 15:43:09.279206219 +0000 UTC m=+0.087499894 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, vcs-type=git)
Oct  8 11:43:10 np0005476733 nova_compute[192580]: 2025-10-08 15:43:10.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:11.508 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:12 np0005476733 nova_compute[192580]: 2025-10-08 15:43:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:13 np0005476733 nova_compute[192580]: 2025-10-08 15:43:13.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:15 np0005476733 nova_compute[192580]: 2025-10-08 15:43:15.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.564 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.565 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.565 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.566 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.566 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.567 2 INFO nova.compute.manager [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Terminating instance#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.568 2 DEBUG nova.compute.manager [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:43:17 np0005476733 kernel: tap84e96b77-41 (unregistering): left promiscuous mode
Oct  8 11:43:17 np0005476733 NetworkManager[51699]: <info>  [1759938197.6151] device (tap84e96b77-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:43:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:17Z|00547|binding|INFO|Releasing lport 84e96b77-4119-4da7-ad84-5cf394586e03 from this chassis (sb_readonly=0)
Oct  8 11:43:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:17Z|00548|binding|INFO|Setting lport 84e96b77-4119-4da7-ad84-5cf394586e03 down in Southbound
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:17Z|00549|binding|INFO|Removing iface tap84e96b77-41 ovn-installed in OVS
Oct  8 11:43:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:17.629 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:2d:ec 192.168.6.153'], port_security=['fa:16:3e:53:2d:ec 192.168.6.153'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.6.153/24', 'neutron:device_id': 'f00f4363-87ff-45bb-b619-95a364353d0b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.239'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43ff2ef3-241a-4d61-bde9-cb2730e8ed48, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=84e96b77-4119-4da7-ad84-5cf394586e03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:43:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:17.631 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 84e96b77-4119-4da7-ad84-5cf394586e03 in datapath 87537fad-af8f-4eae-8420-dce1a4fd9a36 unbound from our chassis#033[00m
Oct  8 11:43:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:17.633 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 87537fad-af8f-4eae-8420-dce1a4fd9a36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:43:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:17.634 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa750fe-2a1f-40d9-a2cf-8fa6fd86655a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:17.635 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36 namespace which is not needed anymore#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000042.scope: Deactivated successfully.
Oct  8 11:43:17 np0005476733 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000042.scope: Consumed 48.202s CPU time.
Oct  8 11:43:17 np0005476733 systemd-machined[152624]: Machine qemu-38-instance-00000042 terminated.
Oct  8 11:43:17 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [NOTICE]   (238150) : haproxy version is 2.8.14-c23fe91
Oct  8 11:43:17 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [NOTICE]   (238150) : path to executable is /usr/sbin/haproxy
Oct  8 11:43:17 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [WARNING]  (238150) : Exiting Master process...
Oct  8 11:43:17 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [ALERT]    (238150) : Current worker (238152) exited with code 143 (Terminated)
Oct  8 11:43:17 np0005476733 neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36[238146]: [WARNING]  (238150) : All workers exited. Exiting... (0)
Oct  8 11:43:17 np0005476733 systemd[1]: libpod-40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17.scope: Deactivated successfully.
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 podman[239143]: 2025-10-08 15:43:17.79357651 +0000 UTC m=+0.072011160 container died 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.840 2 INFO nova.virt.libvirt.driver [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Instance destroyed successfully.#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.840 2 DEBUG nova.objects.instance [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid f00f4363-87ff-45bb-b619-95a364353d0b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.884 2 DEBUG nova.compute.manager [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-unplugged-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.884 2 DEBUG oslo_concurrency.lockutils [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.885 2 DEBUG oslo_concurrency.lockutils [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.885 2 DEBUG oslo_concurrency.lockutils [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.885 2 DEBUG nova.compute.manager [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] No waiting events found dispatching network-vif-unplugged-84e96b77-4119-4da7-ad84-5cf394586e03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.886 2 DEBUG nova.compute.manager [req-12692c6e-0a93-4fb6-a819-e3006c013c83 req-98f96338-c7d5-4246-b17f-71b7e1418517 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-unplugged-84e96b77-4119-4da7-ad84-5cf394586e03 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.887 2 DEBUG nova.virt.libvirt.vif [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:40:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_marking_tenant_network-1397692204',display_name='tempest-test_dscp_marking_tenant_network-1397692204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-tenant-network-1397692204',id=66,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:40:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-toxce72r',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:40:55Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=f00f4363-87ff-45bb-b619-95a364353d0b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.888 2 DEBUG nova.network.os_vif_util [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "84e96b77-4119-4da7-ad84-5cf394586e03", "address": "fa:16:3e:53:2d:ec", "network": {"id": "87537fad-af8f-4eae-8420-dce1a4fd9a36", "bridge": "br-int", "label": "tempest-test-network--927179751", "subnets": [{"cidr": "192.168.6.0/24", "dns": [], "gateway": {"address": "192.168.6.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.6.153", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84e96b77-41", "ovs_interfaceid": "84e96b77-4119-4da7-ad84-5cf394586e03", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.888 2 DEBUG nova.network.os_vif_util [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.889 2 DEBUG os_vif [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.891 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84e96b77-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:43:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17-userdata-shm.mount: Deactivated successfully.
Oct  8 11:43:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay-2593a17f894e3b904518a1439c4c4419c51daa064012b625d0e6f1040d46370a-merged.mount: Deactivated successfully.
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.899 2 INFO os_vif [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:2d:ec,bridge_name='br-int',has_traffic_filtering=True,id=84e96b77-4119-4da7-ad84-5cf394586e03,network=Network(87537fad-af8f-4eae-8420-dce1a4fd9a36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84e96b77-41')#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.899 2 INFO nova.virt.libvirt.driver [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Deleting instance files /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b_del#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.900 2 INFO nova.virt.libvirt.driver [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Deletion of /var/lib/nova/instances/f00f4363-87ff-45bb-b619-95a364353d0b_del complete#033[00m
Oct  8 11:43:17 np0005476733 podman[239143]: 2025-10-08 15:43:17.959880982 +0000 UTC m=+0.238315632 container cleanup 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.968 2 INFO nova.compute.manager [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.969 2 DEBUG oslo.service.loopingcall [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.969 2 DEBUG nova.compute.manager [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:43:17 np0005476733 nova_compute[192580]: 2025-10-08 15:43:17.970 2 DEBUG nova.network.neutron [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:43:17 np0005476733 systemd[1]: libpod-conmon-40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17.scope: Deactivated successfully.
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.112 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.113 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:18 np0005476733 podman[239188]: 2025-10-08 15:43:18.12140943 +0000 UTC m=+0.120977955 container remove 40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.131 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2867d220-064a-4ad4-9936-e1b9c95ffaea]: (4, ('Wed Oct  8 03:43:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36 (40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17)\n40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17\nWed Oct  8 03:43:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36 (40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17)\n40f6166777c22bcd81f9372473fe973a329a3129b1f399403b500b0089569c17\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.133 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fed1c557-359a-406f-be6f-221e9c14c242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.135 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87537fad-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:18 np0005476733 kernel: tap87537fad-a0: left promiscuous mode
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.155 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.183 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1749ac6d-9207-4707-885f-ee14c1de55a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.232 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e51d2ce9-3f78-4df8-a284-3bac73d15869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.234 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.234 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d141b59d-ead0-48d5-9431-ce0e8149e525]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.234 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.245 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.245 2 INFO nova.compute.claims [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.258 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eb99d585-9804-48ac-826f-a08dceebc142]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495826, 'reachable_time': 35312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239203, 'error': None, 'target': 'ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.261 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-87537fad-af8f-4eae-8420-dce1a4fd9a36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:43:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:18.262 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3c383fc7-179e-4dfb-899d-9ee869212b5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:18 np0005476733 systemd[1]: run-netns-ovnmeta\x2d87537fad\x2daf8f\x2d4eae\x2d8420\x2ddce1a4fd9a36.mount: Deactivated successfully.
Oct  8 11:43:18 np0005476733 podman[239204]: 2025-10-08 15:43:18.376693583 +0000 UTC m=+0.065212794 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:43:18 np0005476733 podman[239205]: 2025-10-08 15:43:18.39381977 +0000 UTC m=+0.083806328 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.393 2 DEBUG nova.compute.provider_tree [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.411 2 DEBUG nova.scheduler.client.report [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.443 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.444 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.497 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.497 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.521 2 INFO nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.543 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.639 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.642 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.643 2 INFO nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Creating image(s)#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.643 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.644 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.645 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.669 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.730 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.731 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.732 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.747 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.799 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.800 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.838 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk 10737418240" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.838 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.839 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.895 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.896 2 DEBUG nova.objects.instance [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a9c56c2-9c61-4740-95cb-34aebd44fb1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.913 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.913 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Ensure instance console log exists: /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.914 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.914 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.915 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.916 2 DEBUG nova.network.neutron [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.942 2 INFO nova.compute.manager [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Took 0.97 seconds to deallocate network for instance.#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.986 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:18 np0005476733 nova_compute[192580]: 2025-10-08 15:43:18.987 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.053 2 DEBUG nova.compute.provider_tree [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.073 2 DEBUG nova.scheduler.client.report [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.098 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.127 2 INFO nova.scheduler.client.report [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance f00f4363-87ff-45bb-b619-95a364353d0b#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.201 2 DEBUG nova.policy [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec8fd4ab84244ebb88e5af7fcd3ce92b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '357683d0efd54df8878ddcfaabe6d388', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:43:19 np0005476733 nova_compute[192580]: 2025-10-08 15:43:19.208 2 DEBUG oslo_concurrency.lockutils [None req-252f212d-4b54-4fb5-81c1-652c4e59a43b d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.295 2 DEBUG nova.compute.manager [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.296 2 DEBUG oslo_concurrency.lockutils [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.296 2 DEBUG oslo_concurrency.lockutils [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.297 2 DEBUG oslo_concurrency.lockutils [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f00f4363-87ff-45bb-b619-95a364353d0b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.297 2 DEBUG nova.compute.manager [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] No waiting events found dispatching network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.297 2 WARNING nova.compute.manager [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received unexpected event network-vif-plugged-84e96b77-4119-4da7-ad84-5cf394586e03 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.298 2 DEBUG nova.compute.manager [req-81a7223c-cf3e-40ec-91ad-0b3f91d6ded4 req-7967c9fe-4c12-4bb0-a57e-2e9f362c1176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Received event network-vif-deleted-84e96b77-4119-4da7-ad84-5cf394586e03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:20Z|00550|pinctrl|WARN|Dropped 1143 log messages in last 55 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 11:43:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:20Z|00551|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:43:20 np0005476733 nova_compute[192580]: 2025-10-08 15:43:20.499 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Successfully updated port: 1ba30d61-83df-42ed-a559-81c0f7e89a5d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.244 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Successfully updated port: e1daf344-5b8d-4f3b-aebd-3abc590fa847 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.262 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.262 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquired lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.263 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.407 2 DEBUG nova.compute.manager [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-changed-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.407 2 DEBUG nova.compute.manager [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing instance network info cache due to event network-changed-1ba30d61-83df-42ed-a559-81c0f7e89a5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.408 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.476 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:43:22 np0005476733 nova_compute[192580]: 2025-10-08 15:43:22.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.291 2 DEBUG nova.network.neutron [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 
1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.373 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Releasing lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.373 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Instance network_info: |[{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.374 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.374 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing network info cache for port 1ba30d61-83df-42ed-a559-81c0f7e89a5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.379 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Start _get_guest_xml network_info=[{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.405 2 WARNING nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.412 2 DEBUG nova.virt.libvirt.host [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.413 2 DEBUG nova.virt.libvirt.host [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.420 2 DEBUG nova.virt.libvirt.host [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.421 2 DEBUG nova.virt.libvirt.host [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.422 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.422 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.423 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.423 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.424 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.424 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.425 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.425 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.426 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.427 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.427 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.428 2 DEBUG nova.virt.hardware [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.431 2 DEBUG nova.virt.libvirt.vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:43:18Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.431 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.432 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.433 2 DEBUG nova.virt.libvirt.vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:43:18Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": 
{"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.433 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.434 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.434 2 DEBUG nova.objects.instance [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a9c56c2-9c61-4740-95cb-34aebd44fb1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.618 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <uuid>7a9c56c2-9c61-4740-95cb-34aebd44fb1a</uuid>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <name>instance-00000045</name>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:name>server-tempest-MultiPortVlanTransparencyTest-2097740166-1</nova:name>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:43:25</nova:creationTime>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:user uuid="ec8fd4ab84244ebb88e5af7fcd3ce92b">tempest-MultiPortVlanTransparencyTest-198310335-project-member</nova:user>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:project uuid="357683d0efd54df8878ddcfaabe6d388">tempest-MultiPortVlanTransparencyTest-198310335</nova:project>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:port uuid="1ba30d61-83df-42ed-a559-81c0f7e89a5d">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        <nova:port uuid="e1daf344-5b8d-4f3b-aebd-3abc590fa847">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.100.140" ipVersion="4"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="serial">7a9c56c2-9c61-4740-95cb-34aebd44fb1a</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="uuid">7a9c56c2-9c61-4740-95cb-34aebd44fb1a</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.config"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:71:85:78"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <target dev="tap1ba30d61-83"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:dc:32:4f"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <target dev="tape1daf344-5b"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/console.log" append="off"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:43:25 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:43:25 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:43:25 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:43:25 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.619 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Preparing to wait for external event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.619 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.620 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.620 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.620 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Preparing to wait for external event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.621 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.621 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.621 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.622 2 DEBUG nova.virt.libvirt.vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:43:18Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.623 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.624 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.624 2 DEBUG os_vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.626 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.626 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.630 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ba30d61-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.631 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ba30d61-83, col_values=(('external_ids', {'iface-id': '1ba30d61-83df-42ed-a559-81c0f7e89a5d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:85:78', 'vm-uuid': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 NetworkManager[51699]: <info>  [1759938205.6361] manager: (tap1ba30d61-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.642 2 INFO os_vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83')#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.643 2 DEBUG nova.virt.libvirt.vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:43:18Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.644 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.644 2 DEBUG nova.network.os_vif_util [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.645 2 DEBUG os_vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1daf344-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape1daf344-5b, col_values=(('external_ids', {'iface-id': 'e1daf344-5b8d-4f3b-aebd-3abc590fa847', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:32:4f', 'vm-uuid': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:25 np0005476733 NetworkManager[51699]: <info>  [1759938205.6516] manager: (tape1daf344-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:25 np0005476733 nova_compute[192580]: 2025-10-08 15:43:25.664 2 INFO os_vif [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b')#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.087 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.089 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.090 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:71:85:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.090 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] No VIF found with MAC fa:16:3e:dc:32:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.092 2 INFO nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Using config drive#033[00m
Oct  8 11:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:26.330 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:26.332 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:26.333 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.626 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updated VIF entry in instance network info cache for port 1ba30d61-83df-42ed-a559-81c0f7e89a5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.627 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.663 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.664 2 DEBUG nova.compute.manager [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-changed-e1daf344-5b8d-4f3b-aebd-3abc590fa847 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.664 2 DEBUG nova.compute.manager [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing instance network info cache due to event network-changed-e1daf344-5b8d-4f3b-aebd-3abc590fa847. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.664 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.664 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.665 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing network info cache for port e1daf344-5b8d-4f3b-aebd-3abc590fa847 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.870 2 INFO nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Creating config drive at /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.config#033[00m
Oct  8 11:43:26 np0005476733 nova_compute[192580]: 2025-10-08 15:43:26.875 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4_1uudh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.005 2 DEBUG oslo_concurrency.processutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4_1uudh" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:27 np0005476733 kernel: tap1ba30d61-83: entered promiscuous mode
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.0961] manager: (tap1ba30d61-83): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00552|binding|INFO|Claiming lport 1ba30d61-83df-42ed-a559-81c0f7e89a5d for this chassis.
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00553|binding|INFO|1ba30d61-83df-42ed-a559-81c0f7e89a5d: Claiming fa:16:3e:71:85:78 10.100.0.7
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.110 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:85:78 10.100.0.7'], port_security=['fa:16:3e:71:85:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1ba30d61-83df-42ed-a559-81c0f7e89a5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.112 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1ba30d61-83df-42ed-a559-81c0f7e89a5d in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac bound to our chassis#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.114 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac#033[00m
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.1149] manager: (tape1daf344-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Oct  8 11:43:27 np0005476733 kernel: tape1daf344-5b: entered promiscuous mode
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00554|binding|INFO|Setting lport 1ba30d61-83df-42ed-a559-81c0f7e89a5d ovn-installed in OVS
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00555|binding|INFO|Setting lport 1ba30d61-83df-42ed-a559-81c0f7e89a5d up in Southbound
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00556|if_status|INFO|Dropped 2 log messages in last 976 seconds (most recently, 976 seconds ago) due to excessive rate
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00557|if_status|INFO|Not updating pb chassis for e1daf344-5b8d-4f3b-aebd-3abc590fa847 now as sb is readonly
Oct  8 11:43:27 np0005476733 systemd-udevd[239285]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00558|binding|INFO|Claiming lport e1daf344-5b8d-4f3b-aebd-3abc590fa847 for this chassis.
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00559|binding|INFO|e1daf344-5b8d-4f3b-aebd-3abc590fa847: Claiming fa:16:3e:dc:32:4f 192.168.100.140
Oct  8 11:43:27 np0005476733 systemd-udevd[239286]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.127 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[17e65f98-3219-41ee-98fb-9d6a2eb08d68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.131 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2bf87bc3-31 in ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.133 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2bf87bc3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.133 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0721848b-1c38-456f-bcf2-537a4345a3b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.134 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d1375654-cd89-41df-932a-8ebbe5cbbfcf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.1426] device (tap1ba30d61-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.1436] device (tap1ba30d61-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.1442] device (tape1daf344-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.1447] device (tape1daf344-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.145 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[53dbc86d-ba5f-41a9-bcf6-fe07858b0fcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 systemd-machined[152624]: New machine qemu-39-instance-00000045.
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.163 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:32:4f 192.168.100.140'], port_security=['fa:16:3e:dc:32:4f 192.168.100.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '192.168.100.140/24', 'neutron:device_id': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a609c25-5d86-41c0-a88c-f18743a289f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bf5c9fd-68d2-4a47-9a4f-e4d603f08f21, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e1daf344-5b8d-4f3b-aebd-3abc590fa847) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.179 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d81538bb-1af3-4e99-90cb-28e9153a87b7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 systemd[1]: Started Virtual Machine qemu-39-instance-00000045.
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00560|binding|INFO|Setting lport e1daf344-5b8d-4f3b-aebd-3abc590fa847 ovn-installed in OVS
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00561|binding|INFO|Setting lport e1daf344-5b8d-4f3b-aebd-3abc590fa847 up in Southbound
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.222 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a4c2ce-e5c1-4b52-8def-003667ec3d97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.2328] manager: (tap2bf87bc3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.231 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[304257ad-a6b0-4db7-889a-a8e62629ee9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.268 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbd1923-6974-4355-b833-652952ed5356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.272 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e0881f6a-4421-4765-911c-cee003288eda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.2963] device (tap2bf87bc3-30): carrier: link connected
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.301 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a2831ccb-5f0a-47df-bdce-ec9aa8ffbef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.317 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d7507833-01b6-4409-b973-557c2bbef3b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511096, 'reachable_time': 36226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239321, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.337 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb051ee-0093-4ce5-a3ec-f2b474e75d9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:eeb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511096, 'tstamp': 511096}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239322, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.355 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1540f792-9cf9-4013-bf27-22f309663cb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2bf87bc3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:ee:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511096, 'reachable_time': 36226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239323, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.391 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba7a9f3-8ce6-4d92-bb7e-ed4139cb052d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.464 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[31dd4eca-50b3-41b1-b1a7-e08c7d7ffc99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.465 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.465 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.466 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2bf87bc3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:27 np0005476733 kernel: tap2bf87bc3-30: entered promiscuous mode
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 NetworkManager[51699]: <info>  [1759938207.4686] manager: (tap2bf87bc3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.471 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2bf87bc3-30, col_values=(('external_ids', {'iface-id': '88d655c9-33da-4a0f-a7f9-84973702cdd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:27Z|00562|binding|INFO|Releasing lport 88d655c9-33da-4a0f-a7f9-84973702cdd7 from this chassis (sb_readonly=0)
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.475 2 DEBUG nova.compute.manager [req-2d9e8d52-0ca1-4bd9-93c4-bc89d2e3c1a5 req-9b15a2c2-4279-406a-b903-d2c8ddca0ae2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.476 2 DEBUG oslo_concurrency.lockutils [req-2d9e8d52-0ca1-4bd9-93c4-bc89d2e3c1a5 req-9b15a2c2-4279-406a-b903-d2c8ddca0ae2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.476 2 DEBUG oslo_concurrency.lockutils [req-2d9e8d52-0ca1-4bd9-93c4-bc89d2e3c1a5 req-9b15a2c2-4279-406a-b903-d2c8ddca0ae2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.476 2 DEBUG oslo_concurrency.lockutils [req-2d9e8d52-0ca1-4bd9-93c4-bc89d2e3c1a5 req-9b15a2c2-4279-406a-b903-d2c8ddca0ae2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.476 2 DEBUG nova.compute.manager [req-2d9e8d52-0ca1-4bd9-93c4-bc89d2e3c1a5 req-9b15a2c2-4279-406a-b903-d2c8ddca0ae2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Processing event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 nova_compute[192580]: 2025-10-08 15:43:27.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.485 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.486 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c7148d6f-550f-4eed-bb03-9c4924c39d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.488 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.pid.haproxy
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:43:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:27.489 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'env', 'PROCESS_TAG=haproxy-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:43:27 np0005476733 podman[239363]: 2025-10-08 15:43:27.897842685 +0000 UTC m=+0.058641374 container create 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  8 11:43:27 np0005476733 systemd[1]: Started libpod-conmon-7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d.scope.
Oct  8 11:43:27 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:43:27 np0005476733 podman[239363]: 2025-10-08 15:43:27.863288372 +0000 UTC m=+0.024087100 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:43:27 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a04b79dc57bd5b881aa3d6d73ac9980bfc845843488dc0f36d6f94b6869ab6cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:43:28 np0005476733 podman[239363]: 2025-10-08 15:43:28.008888922 +0000 UTC m=+0.169687610 container init 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:43:28 np0005476733 podman[239363]: 2025-10-08 15:43:28.01729412 +0000 UTC m=+0.178092818 container start 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:43:28 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [NOTICE]   (239392) : New worker (239394) forked
Oct  8 11:43:28 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [NOTICE]   (239392) : Loading success.
Oct  8 11:43:28 np0005476733 podman[239376]: 2025-10-08 15:43:28.077697229 +0000 UTC m=+0.135559870 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 11:43:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:28.099 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e1daf344-5b8d-4f3b-aebd-3abc590fa847 in datapath 6a609c25-5d86-41c0-a88c-f18743a289f2 unbound from our chassis#033[00m
Oct  8 11:43:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:28.101 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a609c25-5d86-41c0-a88c-f18743a289f2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:43:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:43:28.102 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94552b3a-de8f-4252-9c6d-7e408dc5a931]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.252 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938208.2516716, 7a9c56c2-9c61-4740-95cb-34aebd44fb1a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.253 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] VM Started (Lifecycle Event)#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.282 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.287 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938208.251849, 7a9c56c2-9c61-4740-95cb-34aebd44fb1a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.287 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.312 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.316 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.343 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.560 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updated VIF entry in instance network info cache for port e1daf344-5b8d-4f3b-aebd-3abc590fa847. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.560 2 DEBUG nova.network.neutron [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": 
"357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:43:28 np0005476733 nova_compute[192580]: 2025-10-08 15:43:28.583 2 DEBUG oslo_concurrency.lockutils [req-61c5e28d-cfca-4971-922d-7f21a74871e4 req-75ab1eaa-4a0f-4f09-b74d-27956bc0572e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.753 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.754 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.754 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.754 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.755 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No event matching network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d in dict_keys([('network-vif-plugged', 'e1daf344-5b8d-4f3b-aebd-3abc590fa847')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.755 2 WARNING nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received unexpected event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.755 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.755 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.755 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.756 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.756 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Processing event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.756 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.756 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.756 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.757 2 DEBUG oslo_concurrency.lockutils [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.757 2 DEBUG nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No waiting events found dispatching network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.757 2 WARNING nova.compute.manager [req-dbd38235-a603-4c4c-8678-b8e1a23bd8af req-87f79db2-96f8-45cc-a960-61c3eb5ddd26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received unexpected event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 for instance with vm_state building and task_state spawning.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.757 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.761 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938209.7610154, 7a9c56c2-9c61-4740-95cb-34aebd44fb1a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.761 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.762 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.767 2 INFO nova.virt.libvirt.driver [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Instance spawned successfully.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.767 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.785 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.789 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.789 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.789 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.790 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.790 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.791 2 DEBUG nova.virt.libvirt.driver [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.795 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.828 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.853 2 INFO nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Took 11.21 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.854 2 DEBUG nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.928 2 INFO nova.compute.manager [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Took 11.72 seconds to build instance.#033[00m
Oct  8 11:43:29 np0005476733 nova_compute[192580]: 2025-10-08 15:43:29.949 2 DEBUG oslo_concurrency.lockutils [None req-07a7b36e-0ac7-483e-99f3-046089c82d3b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:30 np0005476733 nova_compute[192580]: 2025-10-08 15:43:30.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:30 np0005476733 nova_compute[192580]: 2025-10-08 15:43:30.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:30 np0005476733 nova_compute[192580]: 2025-10-08 15:43:30.958 2 INFO nova.compute.manager [None req-8bdf74eb-7ca2-4357-b808-2f8a07dc4a31 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:30 np0005476733 nova_compute[192580]: 2025-10-08 15:43:30.966 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:43:32 np0005476733 podman[239414]: 2025-10-08 15:43:32.246664137 +0000 UTC m=+0.070035578 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:43:32 np0005476733 podman[239413]: 2025-10-08 15:43:32.344322506 +0000 UTC m=+0.163952447 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 11:43:32 np0005476733 nova_compute[192580]: 2025-10-08 15:43:32.839 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938197.838555, f00f4363-87ff-45bb-b619-95a364353d0b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:43:32 np0005476733 nova_compute[192580]: 2025-10-08 15:43:32.841 2 INFO nova.compute.manager [-] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:43:32 np0005476733 nova_compute[192580]: 2025-10-08 15:43:32.862 2 DEBUG nova.compute.manager [None req-9168a5db-9b1d-4ee1-ae8c-aeb9d410c6aa - - - - - -] [instance: f00f4363-87ff-45bb-b619-95a364353d0b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:43:35 np0005476733 nova_compute[192580]: 2025-10-08 15:43:35.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:35 np0005476733 nova_compute[192580]: 2025-10-08 15:43:35.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:36 np0005476733 nova_compute[192580]: 2025-10-08 15:43:36.115 2 INFO nova.compute.manager [None req-8f030cff-6fa8-4118-a3ae-6a32680e443e ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:36 np0005476733 nova_compute[192580]: 2025-10-08 15:43:36.119 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:43:40 np0005476733 podman[239456]: 2025-10-08 15:43:40.266328379 +0000 UTC m=+0.075197692 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:43:40 np0005476733 podman[239457]: 2025-10-08 15:43:40.27387732 +0000 UTC m=+0.082328079 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, architecture=x86_64, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public)
Oct  8 11:43:40 np0005476733 podman[239455]: 2025-10-08 15:43:40.274251892 +0000 UTC m=+0.089999195 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:43:40 np0005476733 nova_compute[192580]: 2025-10-08 15:43:40.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:40 np0005476733 nova_compute[192580]: 2025-10-08 15:43:40.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:41 np0005476733 nova_compute[192580]: 2025-10-08 15:43:41.281 2 INFO nova.compute.manager [None req-fb47b257-f9c8-454c-89a1-cb475544ca8b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:45 np0005476733 nova_compute[192580]: 2025-10-08 15:43:45.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:45 np0005476733 nova_compute[192580]: 2025-10-08 15:43:45.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:46 np0005476733 nova_compute[192580]: 2025-10-08 15:43:46.442 2 INFO nova.compute.manager [None req-936f9e55-d6e7-4bd0-8a4b-2b5d6fc8d4ba ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:49 np0005476733 podman[239524]: 2025-10-08 15:43:49.230522745 +0000 UTC m=+0.050630658 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:43:49 np0005476733 podman[239523]: 2025-10-08 15:43:49.266424502 +0000 UTC m=+0.088971292 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 11:43:50 np0005476733 nova_compute[192580]: 2025-10-08 15:43:50.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:50 np0005476733 nova_compute[192580]: 2025-10-08 15:43:50.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:51 np0005476733 nova_compute[192580]: 2025-10-08 15:43:51.605 2 INFO nova.compute.manager [None req-51abfc32-8344-484f-aca9-08bb68d437fd ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:51 np0005476733 nova_compute[192580]: 2025-10-08 15:43:51.612 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:43:52 np0005476733 nova_compute[192580]: 2025-10-08 15:43:52.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:53 np0005476733 nova_compute[192580]: 2025-10-08 15:43:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:55 np0005476733 nova_compute[192580]: 2025-10-08 15:43:55.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:55 np0005476733 nova_compute[192580]: 2025-10-08 15:43:55.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:43:56 np0005476733 nova_compute[192580]: 2025-10-08 15:43:56.780 2 INFO nova.compute.manager [None req-41c6c03a-ceee-4a46-9e12-10983aea944b ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:43:56 np0005476733 nova_compute[192580]: 2025-10-08 15:43:56.784 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:43:57 np0005476733 nova_compute[192580]: 2025-10-08 15:43:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:58 np0005476733 podman[239570]: 2025-10-08 15:43:58.222957913 +0000 UTC m=+0.053703006 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.624 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.705 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.766 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.767 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.823 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:43:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:58Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:85:78 10.100.0.7
Oct  8 11:43:58 np0005476733 ovn_controller[94857]: 2025-10-08T15:43:58Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:85:78 10.100.0.7
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.997 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.998 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13233MB free_disk=111.31570816040039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.998 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:43:58 np0005476733 nova_compute[192580]: 2025-10-08 15:43:58.998 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.070 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7a9c56c2-9c61-4740-95cb-34aebd44fb1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.071 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.071 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.106 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.123 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.162 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:43:59 np0005476733 nova_compute[192580]: 2025-10-08 15:43:59.163 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:00 np0005476733 nova_compute[192580]: 2025-10-08 15:44:00.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:00 np0005476733 nova_compute[192580]: 2025-10-08 15:44:00.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.163 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.164 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.978 2 INFO nova.compute.manager [None req-a4179a54-be0c-4314-b27d-a0c3391bce7e ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.983 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:01 np0005476733 nova_compute[192580]: 2025-10-08 15:44:01.987 2 INFO nova.virt.libvirt.driver [None req-a4179a54-be0c-4314-b27d-a0c3391bce7e ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Truncated console log returned, 121 bytes ignored#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.753 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.754 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.754 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:44:02 np0005476733 nova_compute[192580]: 2025-10-08 15:44:02.754 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7a9c56c2-9c61-4740-95cb-34aebd44fb1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:44:03 np0005476733 podman[239598]: 2025-10-08 15:44:03.232769493 +0000 UTC m=+0.057046122 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:44:03 np0005476733 podman[239597]: 2025-10-08 15:44:03.251912164 +0000 UTC m=+0.082514085 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:44:04 np0005476733 nova_compute[192580]: 2025-10-08 15:44:04.516 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:44:04 np0005476733 nova_compute[192580]: 2025-10-08 15:44:04.537 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:44:04 np0005476733 nova_compute[192580]: 2025-10-08 15:44:04.537 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:44:04 np0005476733 nova_compute[192580]: 2025-10-08 15:44:04.538 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:05 np0005476733 nova_compute[192580]: 2025-10-08 15:44:05.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:05 np0005476733 nova_compute[192580]: 2025-10-08 15:44:05.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:05Z|00563|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 11:44:07 np0005476733 nova_compute[192580]: 2025-10-08 15:44:07.131 2 INFO nova.compute.manager [None req-0111a569-ee93-4f10-a0d2-82f29a8afa38 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:44:07 np0005476733 nova_compute[192580]: 2025-10-08 15:44:07.138 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:07 np0005476733 nova_compute[192580]: 2025-10-08 15:44:07.141 2 INFO nova.virt.libvirt.driver [None req-0111a569-ee93-4f10-a0d2-82f29a8afa38 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Truncated console log returned, 4128 bytes ignored#033[00m
Oct  8 11:44:10 np0005476733 nova_compute[192580]: 2025-10-08 15:44:10.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:10 np0005476733 nova_compute[192580]: 2025-10-08 15:44:10.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:11 np0005476733 podman[239658]: 2025-10-08 15:44:11.236707522 +0000 UTC m=+0.050067869 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:44:11 np0005476733 podman[239659]: 2025-10-08 15:44:11.254506201 +0000 UTC m=+0.063578871 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Oct  8 11:44:11 np0005476733 podman[239657]: 2025-10-08 15:44:11.265208433 +0000 UTC m=+0.083224349 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:44:12 np0005476733 nova_compute[192580]: 2025-10-08 15:44:12.328 2 INFO nova.compute.manager [None req-fa84a663-9197-4657-8113-29acb0486d49 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Get console output#033[00m
Oct  8 11:44:12 np0005476733 nova_compute[192580]: 2025-10-08 15:44:12.334 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:12 np0005476733 nova_compute[192580]: 2025-10-08 15:44:12.338 2 INFO nova.virt.libvirt.driver [None req-fa84a663-9197-4657-8113-29acb0486d49 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Truncated console log returned, 4348 bytes ignored#033[00m
Oct  8 11:44:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:13.255 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:44:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:13.256 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:44:13 np0005476733 nova_compute[192580]: 2025-10-08 15:44:13.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:14 np0005476733 nova_compute[192580]: 2025-10-08 15:44:14.530 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:14 np0005476733 nova_compute[192580]: 2025-10-08 15:44:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.117 2 DEBUG nova.compute.manager [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-changed-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.118 2 DEBUG nova.compute.manager [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing instance network info cache due to event network-changed-1ba30d61-83df-42ed-a559-81c0f7e89a5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.119 2 DEBUG oslo_concurrency.lockutils [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.119 2 DEBUG oslo_concurrency.lockutils [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.119 2 DEBUG nova.network.neutron [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Refreshing network info cache for port 1ba30d61-83df-42ed-a559-81c0f7e89a5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:15 np0005476733 nova_compute[192580]: 2025-10-08 15:44:15.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:16 np0005476733 nova_compute[192580]: 2025-10-08 15:44:16.457 2 DEBUG nova.network.neutron [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updated VIF entry in instance network info cache for port 1ba30d61-83df-42ed-a559-81c0f7e89a5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:44:16 np0005476733 nova_compute[192580]: 2025-10-08 15:44:16.458 2 DEBUG nova.network.neutron [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [{"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}, {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:44:16 np0005476733 nova_compute[192580]: 2025-10-08 15:44:16.482 2 DEBUG oslo_concurrency.lockutils [req-b8b3fd13-05a5-45cd-87fa-c99a16a19b6a req-d19e1609-e5c5-4ced-a79d-e7e41b1ed5d3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7a9c56c2-9c61-4740-95cb-34aebd44fb1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.883 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.884 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.884 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.885 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.885 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.886 2 INFO nova.compute.manager [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Terminating instance#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.887 2 DEBUG nova.compute.manager [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:44:18 np0005476733 kernel: tap1ba30d61-83 (unregistering): left promiscuous mode
Oct  8 11:44:18 np0005476733 NetworkManager[51699]: <info>  [1759938258.9241] device (tap1ba30d61-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00564|binding|INFO|Releasing lport 1ba30d61-83df-42ed-a559-81c0f7e89a5d from this chassis (sb_readonly=0)
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00565|binding|INFO|Setting lport 1ba30d61-83df-42ed-a559-81c0f7e89a5d down in Southbound
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00566|binding|INFO|Removing iface tap1ba30d61-83 ovn-installed in OVS
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:18.947 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:85:78 10.100.0.7'], port_security=['fa:16:3e:71:85:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'neutron:port_capabilities': '', 'neutron:port_name': 'first_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf5d6359-20d9-440f-a678-46a616c58f4d, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1ba30d61-83df-42ed-a559-81c0f7e89a5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:44:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:18.948 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1ba30d61-83df-42ed-a559-81c0f7e89a5d in datapath 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac unbound from our chassis#033[00m
Oct  8 11:44:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:18.950 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:44:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:18.951 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[60fc3c29-25f1-4ea8-b26c-1f26a72ab874]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:18.951 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac namespace which is not needed anymore#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:18 np0005476733 kernel: tape1daf344-5b (unregistering): left promiscuous mode
Oct  8 11:44:18 np0005476733 NetworkManager[51699]: <info>  [1759938258.9873] device (tape1daf344-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00567|binding|INFO|Releasing lport e1daf344-5b8d-4f3b-aebd-3abc590fa847 from this chassis (sb_readonly=0)
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00568|binding|INFO|Setting lport e1daf344-5b8d-4f3b-aebd-3abc590fa847 down in Southbound
Oct  8 11:44:18 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:18Z|00569|binding|INFO|Removing iface tape1daf344-5b ovn-installed in OVS
Oct  8 11:44:18 np0005476733 nova_compute[192580]: 2025-10-08 15:44:18.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.008 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:32:4f 192.168.100.140'], port_security=['fa:16:3e:dc:32:4f 192.168.100.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com', 'vlan-passthru': 'true'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:cidrs': '192.168.100.140/24', 'neutron:device_id': '7a9c56c2-9c61-4740-95cb-34aebd44fb1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a609c25-5d86-41c0-a88c-f18743a289f2', 'neutron:port_capabilities': '', 'neutron:port_name': 'second_port-tempest-MultiPortVlanTransparencyTest-2097740166-1', 'neutron:project_id': '357683d0efd54df8878ddcfaabe6d388', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93a341f3-21b5-4aa3-854e-5c20dcdd9b33', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bf5c9fd-68d2-4a47-9a4f-e4d603f08f21, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e1daf344-5b8d-4f3b-aebd-3abc590fa847) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000045.scope: Deactivated successfully.
Oct  8 11:44:19 np0005476733 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000045.scope: Consumed 49.182s CPU time.
Oct  8 11:44:19 np0005476733 systemd-machined[152624]: Machine qemu-39-instance-00000045 terminated.
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [NOTICE]   (239392) : haproxy version is 2.8.14-c23fe91
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [NOTICE]   (239392) : path to executable is /usr/sbin/haproxy
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [WARNING]  (239392) : Exiting Master process...
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [WARNING]  (239392) : Exiting Master process...
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [ALERT]    (239392) : Current worker (239394) exited with code 143 (Terminated)
Oct  8 11:44:19 np0005476733 neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac[239379]: [WARNING]  (239392) : All workers exited. Exiting... (0)
Oct  8 11:44:19 np0005476733 systemd[1]: libpod-7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d.scope: Deactivated successfully.
Oct  8 11:44:19 np0005476733 podman[239744]: 2025-10-08 15:44:19.09180632 +0000 UTC m=+0.047257001 container died 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 11:44:19 np0005476733 NetworkManager[51699]: <info>  [1759938259.1153] manager: (tape1daf344-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Oct  8 11:44:19 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d-userdata-shm.mount: Deactivated successfully.
Oct  8 11:44:19 np0005476733 systemd[1]: var-lib-containers-storage-overlay-a04b79dc57bd5b881aa3d6d73ac9980bfc845843488dc0f36d6f94b6869ab6cb-merged.mount: Deactivated successfully.
Oct  8 11:44:19 np0005476733 podman[239744]: 2025-10-08 15:44:19.133941224 +0000 UTC m=+0.089391905 container cleanup 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:44:19 np0005476733 systemd[1]: libpod-conmon-7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d.scope: Deactivated successfully.
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.154 2 DEBUG nova.compute.manager [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-unplugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.155 2 DEBUG oslo_concurrency.lockutils [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.155 2 DEBUG oslo_concurrency.lockutils [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.155 2 DEBUG oslo_concurrency.lockutils [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.156 2 DEBUG nova.compute.manager [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No waiting events found dispatching network-vif-unplugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.156 2 DEBUG nova.compute.manager [req-23aa04d1-5871-4c04-bfd4-1084aa063928 req-fbfeb26c-c413-4fe6-bafa-ecf1ffefc2e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-unplugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.160 2 INFO nova.virt.libvirt.driver [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Instance destroyed successfully.#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.161 2 DEBUG nova.objects.instance [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lazy-loading 'resources' on Instance uuid 7a9c56c2-9c61-4740-95cb-34aebd44fb1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.177 2 DEBUG nova.virt.libvirt.vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:43:29Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": 
"tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.178 2 DEBUG nova.network.os_vif_util [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "address": "fa:16:3e:71:85:78", "network": {"id": "2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac", "bridge": "br-int", "label": "tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ba30d61-83", "ovs_interfaceid": "1ba30d61-83df-42ed-a559-81c0f7e89a5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.178 2 DEBUG nova.network.os_vif_util [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.179 2 DEBUG os_vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ba30d61-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.189 2 INFO os_vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:85:78,bridge_name='br-int',has_traffic_filtering=True,id=1ba30d61-83df-42ed-a559-81c0f7e89a5d,network=Network(2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1ba30d61-83')#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.190 2 DEBUG nova.virt.libvirt.vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',display_name='server-tempest-MultiPortVlanTransparencyTest-2097740166-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='server-tempest-multiportvlantransparencytest-2097740166-1',id=69,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFO6EtKf086AtDcSKUhQT3A92xQMgobyVurrJBZ/a3hiqHTiY5Yo0zaLWibmNBoQ54lPdiEia0lEWiGuyPEo3V1Xkv/BTywiIW8/QXBzK9pxBAvfcXXyqWXEVNgqfaVfhA==',key_name='tempest-MultiPortVlanTransparencyTest-2097740166',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='357683d0efd54df8878ddcfaabe6d388',ramdisk_id='',reservation_id='r-vlvvk14k',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MultiPortVlanTransparencyTest-198310335',owner_user_name='tempest-MultiPortVlanTransparencyTest-198310335-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:43:29Z,user_data=None,user_id='ec8fd4ab84244ebb88e5af7fcd3ce92b',uuid=7a9c56c2-9c61-4740-95cb-34aebd44fb1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": 
"second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.190 2 DEBUG nova.network.os_vif_util [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converting VIF {"id": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "address": "fa:16:3e:dc:32:4f", "network": {"id": "6a609c25-5d86-41c0-a88c-f18743a289f2", "bridge": "br-int", "label": "second_tempest-MultiPortVlanTransparencyTest-2097740166", "subnets": [{"cidr": "192.168.100.0/24", "dns": [], "gateway": {"address": "192.168.100.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.100.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "357683d0efd54df8878ddcfaabe6d388", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1daf344-5b", "ovs_interfaceid": "e1daf344-5b8d-4f3b-aebd-3abc590fa847", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.191 2 DEBUG nova.network.os_vif_util [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.191 2 DEBUG os_vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1daf344-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.198 2 INFO os_vif [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:32:4f,bridge_name='br-int',has_traffic_filtering=True,id=e1daf344-5b8d-4f3b-aebd-3abc590fa847,network=Network(6a609c25-5d86-41c0-a88c-f18743a289f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape1daf344-5b')#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.199 2 INFO nova.virt.libvirt.driver [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Deleting instance files /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a_del#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.200 2 INFO nova.virt.libvirt.driver [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Deletion of /var/lib/nova/instances/7a9c56c2-9c61-4740-95cb-34aebd44fb1a_del complete#033[00m
Oct  8 11:44:19 np0005476733 podman[239796]: 2025-10-08 15:44:19.200581953 +0000 UTC m=+0.043941865 container remove 7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.207 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[16978645-41de-4410-b6f5-b67419f2556c]: (4, ('Wed Oct  8 03:44:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d)\n7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d\nWed Oct  8 03:44:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac (7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d)\n7e1acecea2ccfc663abdabd41e5ba8d63eebae8b846b6edede579bf68ef0a28d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.208 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb3a079-c4c1-47ea-9f03-adfeb89bb1fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.209 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2bf87bc3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:19 np0005476733 kernel: tap2bf87bc3-30: left promiscuous mode
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.216 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f8288c27-adfc-45d0-aaca-53e602cd5fc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.241 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8983d179-eef6-4346-833d-81d611fc963b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.242 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[13d8857a-3824-4546-b550-5fd996038fa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.258 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a097c459-f800-4d28-a47c-48bd7a386e5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511087, 'reachable_time': 21262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239814, 'error': None, 'target': 'ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.260 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2bf87bc3-3d0a-4d8a-b41e-00010e6b47ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.261 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[29863019-4cfd-46e4-92d0-85948735347e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.262 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e1daf344-5b8d-4f3b-aebd-3abc590fa847 in datapath 6a609c25-5d86-41c0-a88c-f18743a289f2 unbound from our chassis#033[00m
Oct  8 11:44:19 np0005476733 systemd[1]: run-netns-ovnmeta\x2d2bf87bc3\x2d3d0a\x2d4d8a\x2db41e\x2d00010e6b47ac.mount: Deactivated successfully.
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.263 103739 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a609c25-5d86-41c0-a88c-f18743a289f2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.265 2 INFO nova.compute.manager [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:44:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:19.265 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55eb7659-5c0e-4aaf-804e-51b94a50db47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.267 2 DEBUG oslo.service.loopingcall [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.267 2 DEBUG nova.compute.manager [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:44:19 np0005476733 nova_compute[192580]: 2025-10-08 15:44:19.267 2 DEBUG nova.network.neutron [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:44:19 np0005476733 podman[239815]: 2025-10-08 15:44:19.350552212 +0000 UTC m=+0.056722103 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:44:19 np0005476733 podman[239816]: 2025-10-08 15:44:19.353462555 +0000 UTC m=+0.059386908 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible)
Oct  8 11:44:20 np0005476733 nova_compute[192580]: 2025-10-08 15:44:20.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:21Z|00570|pinctrl|WARN|Dropped 2783 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:44:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:21Z|00571|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.229 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.229 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.230 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.230 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.230 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No waiting events found dispatching network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.231 2 WARNING nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received unexpected event network-vif-plugged-1ba30d61-83df-42ed-a559-81c0f7e89a5d for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.231 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-unplugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.231 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.232 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.232 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.232 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No waiting events found dispatching network-vif-unplugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.232 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-unplugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.233 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.233 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.234 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.234 2 DEBUG oslo_concurrency.lockutils [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.234 2 DEBUG nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] No waiting events found dispatching network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:44:21 np0005476733 nova_compute[192580]: 2025-10-08 15:44:21.235 2 WARNING nova.compute.manager [req-8002c357-78ee-440e-85a3-e13df09435b7 req-97fbcaac-f805-464a-a9b0-5dbaaac5f3f2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Received unexpected event network-vif-plugged-e1daf344-5b8d-4f3b-aebd-3abc590fa847 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:44:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:22.259 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.460 2 DEBUG nova.network.neutron [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.487 2 INFO nova.compute.manager [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Took 4.22 seconds to deallocate network for instance.#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.524 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.525 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.602 2 DEBUG nova.compute.provider_tree [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.618 2 DEBUG nova.scheduler.client.report [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.639 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.665 2 INFO nova.scheduler.client.report [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Deleted allocations for instance 7a9c56c2-9c61-4740-95cb-34aebd44fb1a#033[00m
Oct  8 11:44:23 np0005476733 nova_compute[192580]: 2025-10-08 15:44:23.768 2 DEBUG oslo_concurrency.lockutils [None req-1790131c-e6aa-4146-95c4-5c4338eb3cb7 ec8fd4ab84244ebb88e5af7fcd3ce92b 357683d0efd54df8878ddcfaabe6d388 - - default default] Lock "7a9c56c2-9c61-4740-95cb-34aebd44fb1a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:24 np0005476733 nova_compute[192580]: 2025-10-08 15:44:24.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:25 np0005476733 nova_compute[192580]: 2025-10-08 15:44:25.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:26.331 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:26.332 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:26.332 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:28 np0005476733 nova_compute[192580]: 2025-10-08 15:44:28.885 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:28 np0005476733 nova_compute[192580]: 2025-10-08 15:44:28.885 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:28 np0005476733 nova_compute[192580]: 2025-10-08 15:44:28.908 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:44:28 np0005476733 nova_compute[192580]: 2025-10-08 15:44:28.993 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:28 np0005476733 nova_compute[192580]: 2025-10-08 15:44:28.994 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.002 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.003 2 INFO nova.compute.claims [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.114 2 DEBUG nova.compute.provider_tree [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.134 2 DEBUG nova.scheduler.client.report [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.168 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.169 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.212 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.213 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.235 2 INFO nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:44:29 np0005476733 podman[239859]: 2025-10-08 15:44:29.243349059 +0000 UTC m=+0.072543111 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.255 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.352 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.354 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.355 2 INFO nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Creating image(s)#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.356 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.356 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.357 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.383 2 DEBUG nova.policy [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.388 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.441 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.442 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.442 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.453 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.511 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.512 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.546 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk 10737418240" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.547 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.548 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.598 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.599 2 DEBUG nova.objects.instance [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.613 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.613 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Ensure instance console log exists: /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.614 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.614 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:29 np0005476733 nova_compute[192580]: 2025-10-08 15:44:29.614 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.111 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Successfully created port: 8a48fdf3-2293-49fb-81c8-b558651c0274 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.969 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Successfully updated port: 8a48fdf3-2293-49fb-81c8-b558651c0274 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.984 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.985 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:44:30 np0005476733 nova_compute[192580]: 2025-10-08 15:44:30.985 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.052 2 DEBUG nova.compute.manager [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.053 2 DEBUG nova.compute.manager [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.053 2 DEBUG oslo_concurrency.lockutils [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.209 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.937 2 DEBUG nova.network.neutron [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.959 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.960 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance network_info: |[{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.960 2 DEBUG oslo_concurrency.lockutils [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.960 2 DEBUG nova.network.neutron [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.965 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Start _get_guest_xml network_info=[{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.971 2 WARNING nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.975 2 DEBUG nova.virt.libvirt.host [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.976 2 DEBUG nova.virt.libvirt.host [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.984 2 DEBUG nova.virt.libvirt.host [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.985 2 DEBUG nova.virt.libvirt.host [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.986 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.986 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.986 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.987 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.987 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.987 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.987 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.988 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.988 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.988 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.988 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.989 2 DEBUG nova.virt.hardware [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.993 2 DEBUG nova.virt.libvirt.vif [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:44:29Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.993 2 DEBUG nova.network.os_vif_util [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.994 2 DEBUG nova.network.os_vif_util [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:44:31 np0005476733 nova_compute[192580]: 2025-10-08 15:44:31.995 2 DEBUG nova.objects.instance [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.018 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <uuid>df287684-9151-42eb-8ff2-01e29a07e1e1</uuid>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <name>instance-00000047</name>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_qos_after_cold_migration-1645830914</nova:name>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:44:31</nova:creationTime>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        <nova:port uuid="8a48fdf3-2293-49fb-81c8-b558651c0274">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.7.38" ipVersion="4"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="serial">df287684-9151-42eb-8ff2-01e29a07e1e1</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="uuid">df287684-9151-42eb-8ff2-01e29a07e1e1</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:36:af:68"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <target dev="tap8a48fdf3-22"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/console.log" append="off"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:44:32 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:44:32 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:44:32 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:44:32 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.020 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Preparing to wait for external event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.021 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.021 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.022 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.022 2 DEBUG nova.virt.libvirt.vif [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:44:29Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.023 2 DEBUG nova.network.os_vif_util [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.024 2 DEBUG nova.network.os_vif_util [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.024 2 DEBUG os_vif [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.025 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.026 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a48fdf3-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.030 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a48fdf3-22, col_values=(('external_ids', {'iface-id': '8a48fdf3-2293-49fb-81c8-b558651c0274', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:af:68', 'vm-uuid': 'df287684-9151-42eb-8ff2-01e29a07e1e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.0339] manager: (tap8a48fdf3-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.040 2 INFO os_vif [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22')#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.112 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.113 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.113 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:36:af:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.114 2 INFO nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Using config drive#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.530 2 INFO nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Creating config drive at /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.537 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ihyz_xn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.668 2 DEBUG oslo_concurrency.processutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2ihyz_xn" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:44:32 np0005476733 kernel: tap8a48fdf3-22: entered promiscuous mode
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.7345] manager: (tap8a48fdf3-22): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Oct  8 11:44:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:32Z|00572|binding|INFO|Claiming lport 8a48fdf3-2293-49fb-81c8-b558651c0274 for this chassis.
Oct  8 11:44:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:32Z|00573|binding|INFO|8a48fdf3-2293-49fb-81c8-b558651c0274: Claiming fa:16:3e:36:af:68 192.168.7.38
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:32Z|00574|binding|INFO|Setting lport 8a48fdf3-2293-49fb-81c8-b558651c0274 ovn-installed in OVS
Oct  8 11:44:32 np0005476733 nova_compute[192580]: 2025-10-08 15:44:32.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:32Z|00575|binding|INFO|Setting lport 8a48fdf3-2293-49fb-81c8-b558651c0274 up in Southbound
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.757 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:af:68 192.168.7.38'], port_security=['fa:16:3e:36:af:68 192.168.7.38'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.38/24', 'neutron:device_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac5383ee-65ae-4340-bb19-495c4991fae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c049a133-6546-4ba7-90ec-ddcac9cb5060, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8a48fdf3-2293-49fb-81c8-b558651c0274) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.760 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8a48fdf3-2293-49fb-81c8-b558651c0274 in datapath ac5383ee-65ae-4340-bb19-495c4991fae8 bound to our chassis#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.763 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac5383ee-65ae-4340-bb19-495c4991fae8#033[00m
Oct  8 11:44:32 np0005476733 systemd-udevd[239909]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:44:32 np0005476733 systemd-machined[152624]: New machine qemu-40-instance-00000047.
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.779 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[36eded1e-454e-4642-ad72-4080978a4410]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.781 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac5383ee-61 in ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.784 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac5383ee-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.784 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a82d93c9-88d4-40df-8386-3825cd3f1769]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.785 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f310e670-7e30-40e9-aa80-00952926f1ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.7922] device (tap8a48fdf3-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:44:32 np0005476733 systemd[1]: Started Virtual Machine qemu-40-instance-00000047.
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.7947] device (tap8a48fdf3-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.798 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[b346f261-5aae-4f44-9381-1f5c9f86da20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.815 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3480af7c-cc40-497c-a2d7-eb7935843a6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.855 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd9bc97-49ba-45d4-8231-fedc833b0c5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.8627] manager: (tapac5383ee-60): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Oct  8 11:44:32 np0005476733 systemd-udevd[239914]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.861 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b824ac63-0b1c-4740-8d88-2f22560338ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.901 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1231fcd9-b172-40a7-8b96-f52b84953d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.906 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[65f60544-6c71-4f93-85af-ef919b54f8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 NetworkManager[51699]: <info>  [1759938272.9357] device (tapac5383ee-60): carrier: link connected
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.940 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[440ad1f8-a629-40ed-acc8-50c4491599c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.960 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3b507447-2df4-4a3a-b92b-6da5fa4bfecf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac5383ee-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d5:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517659, 'reachable_time': 16168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239943, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:32.979 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1257e532-b4f6-4b39-aabb-2630a3b5b438]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:d5cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517659, 'tstamp': 517659}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239944, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.003 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a2abc7-d177-4fca-b191-5ab4ed18b401]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac5383ee-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d5:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517659, 'reachable_time': 16168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239945, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.042 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df8cb316-1ae7-4b83-b92e-92963772c7fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.089 2 DEBUG nova.network.neutron [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.090 2 DEBUG nova.network.neutron [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.111 2 DEBUG oslo_concurrency.lockutils [req-92b8ce2b-7344-4167-a3fd-d79d4dac7a26 req-383694f9-9896-4a39-acc8-2aa6062f2a26 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.112 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[464dc1f6-ae48-41e0-922c-20710a6e3b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.114 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac5383ee-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.114 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.115 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac5383ee-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:33 np0005476733 NetworkManager[51699]: <info>  [1759938273.1184] manager: (tapac5383ee-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Oct  8 11:44:33 np0005476733 kernel: tapac5383ee-60: entered promiscuous mode
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.120 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac5383ee-60, col_values=(('external_ids', {'iface-id': '825d48bf-0cf9-4bc4-9da2-41ee106cd6a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:33 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:33Z|00576|binding|INFO|Releasing lport 825d48bf-0cf9-4bc4-9da2-41ee106cd6a9 from this chassis (sb_readonly=0)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.136 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac5383ee-65ae-4340-bb19-495c4991fae8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac5383ee-65ae-4340-bb19-495c4991fae8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.137 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f006d3ce-590b-40fe-9b3e-92d4043ece9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.138 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-ac5383ee-65ae-4340-bb19-495c4991fae8
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/ac5383ee-65ae-4340-bb19-495c4991fae8.pid.haproxy
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID ac5383ee-65ae-4340-bb19-495c4991fae8
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:44:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:44:33.139 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'env', 'PROCESS_TAG=haproxy-ac5383ee-65ae-4340-bb19-495c4991fae8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac5383ee-65ae-4340-bb19-495c4991fae8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.150 2 DEBUG nova.compute.manager [req-25f37ce6-9d6a-48ae-9d5d-3163f36614c0 req-75b5006b-d62b-4ec1-80b4-fff3b0f1b176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.150 2 DEBUG oslo_concurrency.lockutils [req-25f37ce6-9d6a-48ae-9d5d-3163f36614c0 req-75b5006b-d62b-4ec1-80b4-fff3b0f1b176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.150 2 DEBUG oslo_concurrency.lockutils [req-25f37ce6-9d6a-48ae-9d5d-3163f36614c0 req-75b5006b-d62b-4ec1-80b4-fff3b0f1b176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.151 2 DEBUG oslo_concurrency.lockutils [req-25f37ce6-9d6a-48ae-9d5d-3163f36614c0 req-75b5006b-d62b-4ec1-80b4-fff3b0f1b176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.151 2 DEBUG nova.compute.manager [req-25f37ce6-9d6a-48ae-9d5d-3163f36614c0 req-75b5006b-d62b-4ec1-80b4-fff3b0f1b176 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Processing event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.558 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.561 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938273.557384, df287684-9151-42eb-8ff2-01e29a07e1e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.561 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] VM Started (Lifecycle Event)#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.564 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:44:33 np0005476733 podman[239984]: 2025-10-08 15:44:33.568004261 +0000 UTC m=+0.061309012 container create 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.568 2 INFO nova.virt.libvirt.driver [-] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance spawned successfully.#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.568 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.589 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.601 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:44:33 np0005476733 systemd[1]: Started libpod-conmon-35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009.scope.
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.611 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.612 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.612 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.613 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.613 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.614 2 DEBUG nova.virt.libvirt.driver [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.621 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.621 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938273.5587087, df287684-9151-42eb-8ff2-01e29a07e1e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.622 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:44:33 np0005476733 podman[239984]: 2025-10-08 15:44:33.532184394 +0000 UTC m=+0.025489145 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:44:33 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:44:33 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b45531ab907f45b2507671188323b74c524d8b8ac2f49804444785998cec1d00/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.659 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:44:33 np0005476733 podman[239984]: 2025-10-08 15:44:33.664557658 +0000 UTC m=+0.157862419 container init 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.665 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938273.563376, df287684-9151-42eb-8ff2-01e29a07e1e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.665 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:44:33 np0005476733 podman[239984]: 2025-10-08 15:44:33.673252546 +0000 UTC m=+0.166557277 container start 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:44:33 np0005476733 podman[240000]: 2025-10-08 15:44:33.683314588 +0000 UTC m=+0.071480207 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.691 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:44:33 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [NOTICE]   (240042) : New worker (240048) forked
Oct  8 11:44:33 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [NOTICE]   (240042) : Loading success.
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.698 2 INFO nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Took 4.35 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.698 2 DEBUG nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.702 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:44:33 np0005476733 podman[239997]: 2025-10-08 15:44:33.717917995 +0000 UTC m=+0.109180943 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.744 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.783 2 INFO nova.compute.manager [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Took 4.82 seconds to build instance.#033[00m
Oct  8 11:44:33 np0005476733 nova_compute[192580]: 2025-10-08 15:44:33.806 2 DEBUG oslo_concurrency.lockutils [None req-165db8e8-f7c2-40b4-942a-f88c128c7ef4 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:34 np0005476733 nova_compute[192580]: 2025-10-08 15:44:34.158 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938259.1578689, 7a9c56c2-9c61-4740-95cb-34aebd44fb1a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:44:34 np0005476733 nova_compute[192580]: 2025-10-08 15:44:34.159 2 INFO nova.compute.manager [-] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:44:34 np0005476733 nova_compute[192580]: 2025-10-08 15:44:34.244 2 DEBUG nova.compute.manager [None req-8c9d576a-32e7-41be-a176-b95688a36b57 - - - - - -] [instance: 7a9c56c2-9c61-4740-95cb-34aebd44fb1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:44:34 np0005476733 nova_compute[192580]: 2025-10-08 15:44:34.743 2 INFO nova.compute.manager [None req-dbe2b8b1-c594-4c13-ba42-8157b4e9487d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:44:34 np0005476733 nova_compute[192580]: 2025-10-08 15:44:34.751 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.229 2 DEBUG nova.compute.manager [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.230 2 DEBUG oslo_concurrency.lockutils [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.230 2 DEBUG oslo_concurrency.lockutils [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.231 2 DEBUG oslo_concurrency.lockutils [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.231 2 DEBUG nova.compute.manager [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.231 2 WARNING nova.compute.manager [req-1553136b-721a-42e5-935e-d14ccd06e230 req-c3cb5268-e62c-423b-855e-5e357e8ac98b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:44:35 np0005476733 nova_compute[192580]: 2025-10-08 15:44:35.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.015 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'name': 'tempest-test_qos_after_cold_migration-1645830914', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.020 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for df287684-9151-42eb-8ff2-01e29a07e1e1 / tap8a48fdf3-22 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.020 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61aa88ae-a0e3-4fd0-8e8e-ea6fe08a44b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.016588', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b1317be4-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '542f691995e34704e493b2666389b537be8364e9fd6633260e2d6e32fb08a9b3'}]}, 'timestamp': '2025-10-08 15:44:36.021211', '_unique_id': 'f8032349678b4285885d61a37b49ce71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.022 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.024 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.024 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>]
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.046 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.047 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1caccbd8-2d5c-4d4a-933f-54a03cf605e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.024898', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b135783e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '296cb611a7dd8b6d0e3060190ab05e674fecd23e993c75af3fd28412c1fec427'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.024898', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b13586a8-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': 'ef80e04c028b794243d6df5a53f81cb2dd3240c0286916fdffaf5162b464656b'}]}, 'timestamp': '2025-10-08 15:44:36.047599', '_unique_id': '579e7dc4a560453ba65755b353ffd32b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.048 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.063 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.063 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b3cf795-2746-4ef7-a728-fdbd235b204b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.049948', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b1380202-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'f1c3e519335f92039d52dd25e777f1c5990e6477e8e37f572b6801f6d24901f2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.049948', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b1380e82-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'f9412abe91f9328c26e9195b486d1294d9ce821352d22cb95acd3796db9cdc3d'}]}, 'timestamp': '2025-10-08 15:44:36.064187', '_unique_id': 'cdcd603125d247ec9a1e0f0b87309d64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.066 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be8a4f10-c773-4470-b742-6b631193761e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.066805', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13882fe-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': 'd7a9afb0e6fc421c8eb670fbc39a45f487f503b9842fc8d456eafd86c7ee1c2b'}]}, 'timestamp': '2025-10-08 15:44:36.067202', '_unique_id': 'eb630b729f9e48099557494f54e1000d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.067 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.068 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.068 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.latency volume: 15787136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.068 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d5b3aba-3d53-437f-816b-e2238f2dc7fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15787136, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.068621', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b138c872-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '3871778dbb67255f8032ac5e33fea76b299d6ff6637f7372d39dd20e541032d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.068621', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b138d1f0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '8efdd8404b31b92617fd49ea5b8dd7a3016e992a488614269e2871cf4770351b'}]}, 'timestamp': '2025-10-08 15:44:36.069164', '_unique_id': '4abc63c842114a0cafb7f7d2bd4ffdda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.069 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.070 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95c35342-b73f-47c5-b083-5e3b5273fcc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.070581', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13915ac-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': 'abedeed59ede583d4fa3783e770559d9c834b258f018153eaa169cf4d4cd48f2'}]}, 'timestamp': '2025-10-08 15:44:36.070902', '_unique_id': 'e1f369a12b734db5a41077105fabf2dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.072 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.requests volume: 850 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.072 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30be48bb-a322-4d12-a047-083342e8c9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 850, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.072185', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b13953aa-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '85a9988696199bc1536e23b130963a364350d4170830ce1115a3c4955ff10163'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.072185', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b1395e36-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '5f01bf54f80a6574603fa812304fc99a5f8a56d2f4b86dd4207a136dd41fec72'}]}, 'timestamp': '2025-10-08 15:44:36.072711', '_unique_id': 'c6e35677426748fd97e20c38e81d3f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>]
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.074 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f185692-17ed-4f6e-9f8c-7e185ea227ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.074579', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b139b084-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': 'caddcfb9477823a2bc22ad9a41f91bb2d7969bd4abcd4922ddd2dc6b3c1a7df3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.074579', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b139b8d6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '29be2cd97fc7fee373471b1f2159ea01b3cf0d0da53a3763b591d139a82078fc'}]}, 'timestamp': '2025-10-08 15:44:36.075059', '_unique_id': '5e0a237c18bf41b6b4749906991f80c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>]
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.076 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.bytes volume: 13875712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80e608da-0632-4957-810c-9e840d98a758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13875712, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.076835', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b13a091c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': 'cf024e6aad1e7c565b5b4696fa394420cce20526e8a04aecd4dcb95b66f667fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.076835', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b13a124a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '63ecf89e7a72f4a88aa623f5b24c13931a327d5e59c571eae32ab1d574481e0d'}]}, 'timestamp': '2025-10-08 15:44:36.077311', '_unique_id': 'f5d3bab4d6a443cea86d63b2be96c0b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.078 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9e18433-a201-451b-8135-1558dc811172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.078657', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13a4fda-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': 'fa155af79aa52732a05264d24ea53480614f107c64b7109ece34b9db8374e813'}]}, 'timestamp': '2025-10-08 15:44:36.078910', '_unique_id': '12f62ed376e3402d948d7fa61daecd44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.079 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.080 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce6d049b-9bdb-48a9-b1e9-603147784bb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.080432', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13a96f2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '66aa86f6dfddd411427ffbecb17e1859805bdc0c000b58dcc7862ae73e1e78b4'}]}, 'timestamp': '2025-10-08 15:44:36.080769', '_unique_id': 'e42605fa855a475783d5057439f21784'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.082 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '683fe758-dbe9-4e0c-8661-01bc564a98eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.082709', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13af034-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '6c2aef66b481f54fabda22b5b14dcd8bdd3682fef6e11c08358ec92d7e1738be'}]}, 'timestamp': '2025-10-08 15:44:36.083067', '_unique_id': '56302ca09e7740e09b6e2ff34c283c6a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.084 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61f9a153-aefb-4a14-8314-cc6f6427bf36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.084325', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13b2df6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '2ba5b107b99bcc63f5633e29d208f09397211b213d5cac56340f05eff9e07ac8'}]}, 'timestamp': '2025-10-08 15:44:36.084626', '_unique_id': 'c1cc525e26ef4ba192ee1c95bc1dc919'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.086 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.latency volume: 477639626 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.086 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.latency volume: 3994567 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be546007-fe99-4b6f-a39f-c1bcaf656743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 477639626, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.086112', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b13b7432-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': 'a53026f19c036bc8c208c0ab39438f646458858c64d3fef2ac81c2df442b6a3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3994567, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.086112', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b13b7e6e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.747917807, 'message_signature': '6511b586e933e3dc205cd7fd42a57a10160793253f69076ffad4e8cabd4d9f62'}]}, 'timestamp': '2025-10-08 15:44:36.086655', '_unique_id': '3d64f12279ae402f907345fea10586c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.088 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.088 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dbd10a2-c53d-4f70-8f07-10d4f308bcb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.088149', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b13bc374-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'e371c1ed77922373d661ff06349ce3380616c803726e32343c4bf01ef082456c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.088149', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b13bcda6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'f510551a7b18e92b9e3a8f69d894b1d4416f6240f277640e22bc0b000a54c041'}]}, 'timestamp': '2025-10-08 15:44:36.088711', '_unique_id': '29ba55a9101148d094faed61787dcd68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.090 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.090 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c5a9d64-38b3-4584-b481-91609a211d3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:44:36.090212', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b13c13ce-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'b36af54ccd947a544c88aa888973b9bf3e0fb1a251b465fece33ad4f3462b424'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:44:36.090212', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b13c1e14-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.772956467, 'message_signature': 'b5a0197408dc0a7e0fad81653f4af07e04c609964092987063473f723306dea3'}]}, 'timestamp': '2025-10-08 15:44:36.090726', '_unique_id': 'ec91fde04174443490b6531d9bd286d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.091 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f464fcd6-92a8-4a85-90ab-e6561b2f0548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.091863', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13c54a6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '2f650db578cee8543bff3c170c78980447dcb246ba1c2419814d1bcf4eafe7ec'}]}, 'timestamp': '2025-10-08 15:44:36.092185', '_unique_id': 'da0d5d2dd9fe4ac39e01ed472e13b149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.093 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.110 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/cpu volume: 2450000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e694d15-4ac4-4587-b789-e2123ace45b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2450000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'timestamp': '2025-10-08T15:44:36.093327', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'b13f4ab2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.833775972, 'message_signature': 'dba2a29339ecd8f170cd1bf3f7d987d0127d0c3219d7aa3f63dc2f76da1f08df'}]}, 'timestamp': '2025-10-08 15:44:36.111690', '_unique_id': '2e16dbbcdb3f4c43bada44c0e8b8c493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.114 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.114 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1645830914>]
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.114 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa60048b-daa5-48db-90f3-775ad1af0678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.114791', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b13fd57c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': '590d088a052177e4df3a0c0379d889778f55ffbe34df480cc88af80e2687e55c'}]}, 'timestamp': '2025-10-08 15:44:36.115183', '_unique_id': '5ef0ce50f0b14ec4894ec66f009a24f4'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.123 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.123 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance df287684-9151-42eb-8ff2-01e29a07e1e1: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.124 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8d256b0-c923-4211-b2a9-c443f4feb47e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:44:36.124255', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'b1414b28-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5179.73958734, 'message_signature': 'd6d61d2daa3ce99ba2676c594daf9e46eb4c3805ae8dc035b85ffa3a8b81a3e5'}]}, 'timestamp': '2025-10-08 15:44:36.124779', '_unique_id': '88002b6940a6417f89b98bebd304806f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:44:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:44:37 np0005476733 nova_compute[192580]: 2025-10-08 15:44:37.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:38 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:38Z|00577|binding|INFO|Releasing lport 825d48bf-0cf9-4bc4-9da2-41ee106cd6a9 from this chassis (sb_readonly=0)
Oct  8 11:44:38 np0005476733 nova_compute[192580]: 2025-10-08 15:44:38.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:39 np0005476733 nova_compute[192580]: 2025-10-08 15:44:39.916 2 INFO nova.compute.manager [None req-f804cfb5-c0d8-4b91-ad2c-d09a7fbbd8f6 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:44:39 np0005476733 nova_compute[192580]: 2025-10-08 15:44:39.920 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:40 np0005476733 nova_compute[192580]: 2025-10-08 15:44:40.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:42 np0005476733 nova_compute[192580]: 2025-10-08 15:44:42.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:42 np0005476733 podman[240063]: 2025-10-08 15:44:42.243884777 +0000 UTC m=+0.063368309 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:44:42 np0005476733 podman[240064]: 2025-10-08 15:44:42.250019982 +0000 UTC m=+0.065164144 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:44:42 np0005476733 podman[240062]: 2025-10-08 15:44:42.259161805 +0000 UTC m=+0.080680451 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 11:44:45 np0005476733 nova_compute[192580]: 2025-10-08 15:44:45.065 2 INFO nova.compute.manager [None req-594c03d0-0b7f-460f-aba9-482b3892bafc d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:44:45 np0005476733 nova_compute[192580]: 2025-10-08 15:44:45.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:47 np0005476733 nova_compute[192580]: 2025-10-08 15:44:47.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:50 np0005476733 nova_compute[192580]: 2025-10-08 15:44:50.208 2 INFO nova.compute.manager [None req-e9020627-7944-4b2c-b580-0527c9f6cebf d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:44:50 np0005476733 nova_compute[192580]: 2025-10-08 15:44:50.216 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:50 np0005476733 podman[240135]: 2025-10-08 15:44:50.235437735 +0000 UTC m=+0.062306164 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:44:50 np0005476733 podman[240136]: 2025-10-08 15:44:50.245303601 +0000 UTC m=+0.066539650 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:44:50 np0005476733 nova_compute[192580]: 2025-10-08 15:44:50.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:52 np0005476733 nova_compute[192580]: 2025-10-08 15:44:52.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:52 np0005476733 nova_compute[192580]: 2025-10-08 15:44:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:54 np0005476733 nova_compute[192580]: 2025-10-08 15:44:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:44:55 np0005476733 nova_compute[192580]: 2025-10-08 15:44:55.351 2 INFO nova.compute.manager [None req-b9158869-0965-4974-a3b8-a1ae3024c740 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:44:55 np0005476733 nova_compute[192580]: 2025-10-08 15:44:55.359 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:44:55 np0005476733 nova_compute[192580]: 2025-10-08 15:44:55.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:57 np0005476733 nova_compute[192580]: 2025-10-08 15:44:57.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:44:57 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:57Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:af:68 192.168.7.38
Oct  8 11:44:57 np0005476733 ovn_controller[94857]: 2025-10-08T15:44:57Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:af:68 192.168.7.38
Oct  8 11:44:58 np0005476733 nova_compute[192580]: 2025-10-08 15:44:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:00 np0005476733 podman[240177]: 2025-10-08 15:45:00.248399218 +0000 UTC m=+0.074013208 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.626 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.747 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.811 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.812 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.883 2 INFO nova.compute.manager [None req-9529ee27-3930-4086-8c96-01be9367d39d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.884 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:45:00 np0005476733 nova_compute[192580]: 2025-10-08 15:45:00.892 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.040 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.041 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13250MB free_disk=111.31459045410156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.042 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.042 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.308 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance df287684-9151-42eb-8ff2-01e29a07e1e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.309 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.309 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.473 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.512 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.575 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:45:01 np0005476733 nova_compute[192580]: 2025-10-08 15:45:01.576 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:02 np0005476733 nova_compute[192580]: 2025-10-08 15:45:02.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:03 np0005476733 nova_compute[192580]: 2025-10-08 15:45:03.568 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:03 np0005476733 nova_compute[192580]: 2025-10-08 15:45:03.570 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:03 np0005476733 nova_compute[192580]: 2025-10-08 15:45:03.570 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:45:03 np0005476733 nova_compute[192580]: 2025-10-08 15:45:03.570 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:45:04 np0005476733 nova_compute[192580]: 2025-10-08 15:45:04.001 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:45:04 np0005476733 nova_compute[192580]: 2025-10-08 15:45:04.002 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:45:04 np0005476733 nova_compute[192580]: 2025-10-08 15:45:04.002 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:45:04 np0005476733 nova_compute[192580]: 2025-10-08 15:45:04.002 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:45:04 np0005476733 podman[240204]: 2025-10-08 15:45:04.268764909 +0000 UTC m=+0.081370054 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:45:04 np0005476733 podman[240203]: 2025-10-08 15:45:04.298127578 +0000 UTC m=+0.111596640 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.790 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.827 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.828 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.828 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.828 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:05 np0005476733 nova_compute[192580]: 2025-10-08 15:45:05.829 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:45:06 np0005476733 nova_compute[192580]: 2025-10-08 15:45:06.042 2 INFO nova.compute.manager [None req-deb4c245-c461-44c9-89ac-2b1515fa8d66 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:45:06 np0005476733 nova_compute[192580]: 2025-10-08 15:45:06.046 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:45:06 np0005476733 nova_compute[192580]: 2025-10-08 15:45:06.051 2 INFO nova.virt.libvirt.driver [None req-deb4c245-c461-44c9-89ac-2b1515fa8d66 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Truncated console log returned, 3052 bytes ignored#033[00m
Oct  8 11:45:07 np0005476733 nova_compute[192580]: 2025-10-08 15:45:07.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:10 np0005476733 nova_compute[192580]: 2025-10-08 15:45:10.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:11 np0005476733 nova_compute[192580]: 2025-10-08 15:45:11.230 2 INFO nova.compute.manager [None req-99fb31f1-eacc-427c-8fb2-e878a9e474fb d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Get console output#033[00m
Oct  8 11:45:11 np0005476733 nova_compute[192580]: 2025-10-08 15:45:11.235 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:45:11 np0005476733 nova_compute[192580]: 2025-10-08 15:45:11.238 2 INFO nova.virt.libvirt.driver [None req-99fb31f1-eacc-427c-8fb2-e878a9e474fb d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Truncated console log returned, 3263 bytes ignored#033[00m
Oct  8 11:45:12 np0005476733 nova_compute[192580]: 2025-10-08 15:45:12.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:13 np0005476733 podman[240265]: 2025-10-08 15:45:13.259988921 +0000 UTC m=+0.060068283 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 11:45:13 np0005476733 podman[240266]: 2025-10-08 15:45:13.274127223 +0000 UTC m=+0.058427790 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:45:13 np0005476733 podman[240270]: 2025-10-08 15:45:13.286107375 +0000 UTC m=+0.072507139 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal)
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.208 2 DEBUG nova.compute.manager [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.209 2 DEBUG nova.compute.manager [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.209 2 DEBUG oslo_concurrency.lockutils [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.209 2 DEBUG oslo_concurrency.lockutils [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.210 2 DEBUG nova.network.neutron [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:45:15 np0005476733 nova_compute[192580]: 2025-10-08 15:45:15.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:16 np0005476733 nova_compute[192580]: 2025-10-08 15:45:16.560 2 DEBUG nova.network.neutron [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:45:16 np0005476733 nova_compute[192580]: 2025-10-08 15:45:16.562 2 DEBUG nova.network.neutron [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:45:16 np0005476733 nova_compute[192580]: 2025-10-08 15:45:16.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:16 np0005476733 nova_compute[192580]: 2025-10-08 15:45:16.590 2 DEBUG oslo_concurrency.lockutils [req-e22d712a-fa03-4d31-8710-ee3ee3aa0c94 req-9a7be98b-89d9-48e5-9ef6-7dd055cda537 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:45:17 np0005476733 nova_compute[192580]: 2025-10-08 15:45:17.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:18 np0005476733 nova_compute[192580]: 2025-10-08 15:45:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:18.849 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:45:18 np0005476733 nova_compute[192580]: 2025-10-08 15:45:18.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:18.853 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:45:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:19.857 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:20 np0005476733 nova_compute[192580]: 2025-10-08 15:45:20.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:21 np0005476733 podman[240327]: 2025-10-08 15:45:21.220919812 +0000 UTC m=+0.048910825 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:45:21 np0005476733 podman[240326]: 2025-10-08 15:45:21.221466239 +0000 UTC m=+0.050786435 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 11:45:22 np0005476733 nova_compute[192580]: 2025-10-08 15:45:22.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:22 np0005476733 nova_compute[192580]: 2025-10-08 15:45:22.618 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:22 np0005476733 nova_compute[192580]: 2025-10-08 15:45:22.618 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:45:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:24Z|00578|pinctrl|WARN|Dropped 2863 log messages in last 64 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 11:45:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:24Z|00579|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:45:25 np0005476733 nova_compute[192580]: 2025-10-08 15:45:25.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:26.333 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:26.334 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:26.339 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:27 np0005476733 nova_compute[192580]: 2025-10-08 15:45:27.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:30 np0005476733 nova_compute[192580]: 2025-10-08 15:45:30.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:31 np0005476733 nova_compute[192580]: 2025-10-08 15:45:31.078 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "interface-df287684-9151-42eb-8ff2-01e29a07e1e1-d02e7451-3ef3-44f1-b34a-5c7cbee26989" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:31 np0005476733 nova_compute[192580]: 2025-10-08 15:45:31.079 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-df287684-9151-42eb-8ff2-01e29a07e1e1-d02e7451-3ef3-44f1-b34a-5c7cbee26989" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:31 np0005476733 nova_compute[192580]: 2025-10-08 15:45:31.079 2 DEBUG nova.objects.instance [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'flavor' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:45:31 np0005476733 podman[240393]: 2025-10-08 15:45:31.234478075 +0000 UTC m=+0.057293583 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:45:31 np0005476733 nova_compute[192580]: 2025-10-08 15:45:31.686 2 DEBUG nova.objects.instance [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:45:31 np0005476733 nova_compute[192580]: 2025-10-08 15:45:31.710 2 DEBUG nova.network.neutron [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:45:32 np0005476733 nova_compute[192580]: 2025-10-08 15:45:32.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:32 np0005476733 nova_compute[192580]: 2025-10-08 15:45:32.190 2 DEBUG nova.policy [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.184 2 DEBUG nova.network.neutron [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Successfully updated port: d02e7451-3ef3-44f1-b34a-5c7cbee26989 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.209 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.210 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.210 2 DEBUG nova.network.neutron [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.288 2 DEBUG nova.compute.manager [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.288 2 DEBUG nova.compute.manager [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:45:33 np0005476733 nova_compute[192580]: 2025-10-08 15:45:33.289 2 DEBUG oslo_concurrency.lockutils [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:45:34 np0005476733 nova_compute[192580]: 2025-10-08 15:45:34.916 2 DEBUG nova.network.neutron [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:45:35 np0005476733 podman[240413]: 2025-10-08 15:45:35.246310462 +0000 UTC m=+0.064357779 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 11:45:35 np0005476733 podman[240412]: 2025-10-08 15:45:35.262440398 +0000 UTC m=+0.087057235 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.614 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.616 2 DEBUG oslo_concurrency.lockutils [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.616 2 DEBUG nova.network.neutron [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.620 2 DEBUG nova.virt.libvirt.vif [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:44:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:44:33Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.621 2 DEBUG nova.network.os_vif_util [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.622 2 DEBUG nova.network.os_vif_util [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.622 2 DEBUG os_vif [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.623 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.624 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd02e7451-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd02e7451-3e, col_values=(('external_ids', {'iface-id': 'd02e7451-3ef3-44f1-b34a-5c7cbee26989', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:81:a1', 'vm-uuid': 'df287684-9151-42eb-8ff2-01e29a07e1e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.6311] manager: (tapd02e7451-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.640 2 INFO os_vif [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e')#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.641 2 DEBUG nova.virt.libvirt.vif [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:44:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:44:33Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.642 2 DEBUG nova.network.os_vif_util [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.642 2 DEBUG nova.network.os_vif_util [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.646 2 DEBUG nova.virt.libvirt.guest [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] attach device xml: <interface type="ethernet">
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:f7:81:a1"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <target dev="tapd02e7451-3e"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:45:35 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 11:45:35 np0005476733 kernel: tapd02e7451-3e: entered promiscuous mode
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.6623] manager: (tapd02e7451-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00580|binding|INFO|Claiming lport d02e7451-3ef3-44f1-b34a-5c7cbee26989 for this chassis.
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00581|binding|INFO|d02e7451-3ef3-44f1-b34a-5c7cbee26989: Claiming fa:16:3e:f7:81:a1 10.100.0.10
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00582|binding|INFO|Setting lport d02e7451-3ef3-44f1-b34a-5c7cbee26989 ovn-installed in OVS
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00583|binding|INFO|Setting lport d02e7451-3ef3-44f1-b34a-5c7cbee26989 up in Southbound
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.691 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:81:a1 10.100.0.10'], port_security=['fa:16:3e:f7:81:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=d02e7451-3ef3-44f1-b34a-5c7cbee26989) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:45:35 np0005476733 systemd-udevd[240464]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.692 103739 INFO neutron.agent.ovn.metadata.agent [-] Port d02e7451-3ef3-44f1-b34a-5c7cbee26989 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 bound to our chassis#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.695 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.7095] device (tapd02e7451-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.7104] device (tapd02e7451-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.711 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85d9e61b-62a4-4722-98c3-f3c0b74bcf41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.713 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.715 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.715 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eece15e3-1bad-4273-a211-15d3f2fd4d28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.717 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[baacc172-2ace-4896-8424-db7dd55cd1ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.734 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c063c28e-9347-4a5b-9a46-4cc5cd0d299f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.761 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[172173fe-89bc-401a-9db0-ceb66b83dd29]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.792 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f9efc497-261f-48e0-a6fd-f9dfc9fb5a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.799 2 DEBUG nova.virt.libvirt.driver [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.800 2 DEBUG nova.virt.libvirt.driver [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.800 2 DEBUG nova.virt.libvirt.driver [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:36:af:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.800 2 DEBUG nova.virt.libvirt.driver [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:f7:81:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.8022] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/194)
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.801 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bc392c57-341c-4f3c-bcff-b79c60401a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 systemd-udevd[240468]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:81:a1 10.100.0.10
Oct  8 11:45:35 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:35Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:81:a1 10.100.0.10
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.848 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b95adc70-72d4-4d14-8a18-0685274e6b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.853 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a61220-b909-434e-a56d-d3696eea3ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 NetworkManager[51699]: <info>  [1759938335.8850] device (tap58a69152-b0): carrier: link connected
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.892 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ad46db1d-1737-4f9d-b0c7-fac6a5252077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 nova_compute[192580]: 2025-10-08 15:45:35.897 2 DEBUG nova.virt.libvirt.guest [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:name>tempest-test_qos_after_cold_migration-1645830914</nova:name>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 15:45:35</nova:creationTime>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:port uuid="8a48fdf3-2293-49fb-81c8-b558651c0274">
Oct  8 11:45:35 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="192.168.7.38" ipVersion="4"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    <nova:port uuid="d02e7451-3ef3-44f1-b34a-5c7cbee26989">
Oct  8 11:45:35 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:45:35 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 11:45:35 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 11:45:35 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.913 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a79c1921-174b-4e21-bf0a-0d3e019f1e88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523954, 'reachable_time': 38821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240492, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.933 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86377b77-2870-41ea-89dc-9e1b2046e59f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523954, 'tstamp': 523954}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240493, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.960 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0117e1a4-7a7d-480a-9cca-48048fda016d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523954, 'reachable_time': 38821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240494, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:35.994 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee32cee-9905-46e3-a208-5faf080db35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.031 2 DEBUG oslo_concurrency.lockutils [None req-afecaf7b-2752-4585-b608-3947730b191d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-df287684-9151-42eb-8ff2-01e29a07e1e1-d02e7451-3ef3-44f1-b34a-5c7cbee26989" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.065 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6a57e209-e0d6-4bbd-b567-7651f3e33715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.066 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.066 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.067 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:36 np0005476733 NetworkManager[51699]: <info>  [1759938336.0696] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Oct  8 11:45:36 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.072 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:45:36 np0005476733 ovn_controller[94857]: 2025-10-08T15:45:36Z|00584|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.074 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.075 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[506df00b-7dc4-4527-bc36-1678a06c6c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.077 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:45:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:45:36.077 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.249 2 DEBUG nova.compute.manager [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.249 2 DEBUG oslo_concurrency.lockutils [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.250 2 DEBUG oslo_concurrency.lockutils [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.250 2 DEBUG oslo_concurrency.lockutils [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.250 2 DEBUG nova.compute.manager [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:45:36 np0005476733 nova_compute[192580]: 2025-10-08 15:45:36.250 2 WARNING nova.compute.manager [req-682b8492-9fbe-4cc4-ac22-d1ffd7d80787 req-69e1dd22-4a22-444d-99db-24ae3d3cb00c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:45:36 np0005476733 podman[240526]: 2025-10-08 15:45:36.476920577 +0000 UTC m=+0.059348459 container create 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:45:36 np0005476733 systemd[1]: Started libpod-conmon-6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090.scope.
Oct  8 11:45:36 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:45:36 np0005476733 podman[240526]: 2025-10-08 15:45:36.447990862 +0000 UTC m=+0.030418774 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:45:36 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c508b8d4fcbc9d211d0dc467a961eb4c7929129ec4ff4fe7c0a5648e5b26655e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:45:36 np0005476733 podman[240526]: 2025-10-08 15:45:36.564464526 +0000 UTC m=+0.146892428 container init 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:45:36 np0005476733 podman[240526]: 2025-10-08 15:45:36.569781137 +0000 UTC m=+0.152209019 container start 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:45:36 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [NOTICE]   (240546) : New worker (240548) forked
Oct  8 11:45:36 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [NOTICE]   (240546) : Loading success.
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.202 2 DEBUG nova.network.neutron [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.203 2 DEBUG nova.network.neutron [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.226 2 DEBUG oslo_concurrency.lockutils [req-f8de4849-2027-418b-aafe-41250386febd req-d22dde8b-b0e7-493f-ae3e-c83f56f98754 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.654 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.656 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:45:37 np0005476733 nova_compute[192580]: 2025-10-08 15:45:37.683 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.370 2 DEBUG nova.compute.manager [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.370 2 DEBUG oslo_concurrency.lockutils [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.371 2 DEBUG oslo_concurrency.lockutils [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.371 2 DEBUG oslo_concurrency.lockutils [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.371 2 DEBUG nova.compute.manager [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.371 2 WARNING nova.compute.manager [req-90be314d-60fe-40ee-a321-3375d8a166f8 req-d5efa19a-cd7c-405f-ba98-1f502c3ff8bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.524 2 DEBUG nova.compute.manager [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.525 2 DEBUG nova.compute.manager [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.526 2 DEBUG oslo_concurrency.lockutils [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.527 2 DEBUG oslo_concurrency.lockutils [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:45:38 np0005476733 nova_compute[192580]: 2025-10-08 15:45:38.527 2 DEBUG nova.network.neutron [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:45:40 np0005476733 nova_compute[192580]: 2025-10-08 15:45:40.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:40 np0005476733 nova_compute[192580]: 2025-10-08 15:45:40.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:41 np0005476733 nova_compute[192580]: 2025-10-08 15:45:41.326 2 DEBUG nova.network.neutron [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:45:41 np0005476733 nova_compute[192580]: 2025-10-08 15:45:41.327 2 DEBUG nova.network.neutron [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:45:41 np0005476733 nova_compute[192580]: 2025-10-08 15:45:41.350 2 DEBUG oslo_concurrency.lockutils [req-34b6a450-bbbb-47ad-a138-a44d31c13080 req-f29f4a0c-3a84-4c09-9575-67dc3e949953 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:45:44 np0005476733 podman[240557]: 2025-10-08 15:45:44.23062829 +0000 UTC m=+0.057472978 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:45:44 np0005476733 podman[240558]: 2025-10-08 15:45:44.233051788 +0000 UTC m=+0.054697280 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:45:44 np0005476733 podman[240559]: 2025-10-08 15:45:44.237199491 +0000 UTC m=+0.059184154 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:45 np0005476733 nova_compute[192580]: 2025-10-08 15:45:45.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:45:50 np0005476733 nova_compute[192580]: 2025-10-08 15:45:50.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:52 np0005476733 podman[240622]: 2025-10-08 15:45:52.232792069 +0000 UTC m=+0.053749960 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:45:52 np0005476733 podman[240621]: 2025-10-08 15:45:52.242301733 +0000 UTC m=+0.068296065 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:45:52 np0005476733 nova_compute[192580]: 2025-10-08 15:45:52.617 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:55 np0005476733 nova_compute[192580]: 2025-10-08 15:45:55.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:45:56 np0005476733 nova_compute[192580]: 2025-10-08 15:45:56.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:45:59 np0005476733 nova_compute[192580]: 2025-10-08 15:45:59.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:00 np0005476733 nova_compute[192580]: 2025-10-08 15:46:00.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:02 np0005476733 podman[240666]: 2025-10-08 15:46:02.242494679 +0000 UTC m=+0.061788488 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.615 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.797 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.891 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.893 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:46:02 np0005476733 nova_compute[192580]: 2025-10-08 15:46:02.994 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.212 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.214 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12975MB free_disk=111.18974685668945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.214 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.214 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.297 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance df287684-9151-42eb-8ff2-01e29a07e1e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.298 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.298 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.316 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.336 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.337 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.351 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.376 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.426 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.444 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.446 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:46:03 np0005476733 nova_compute[192580]: 2025-10-08 15:46:03.446 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.447 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.448 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.448 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.635 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:46:04 np0005476733 nova_compute[192580]: 2025-10-08 15:46:04.635 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:46:05 np0005476733 nova_compute[192580]: 2025-10-08 15:46:05.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:06 np0005476733 podman[240693]: 2025-10-08 15:46:06.256576499 +0000 UTC m=+0.068889204 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:46:06 np0005476733 podman[240692]: 2025-10-08 15:46:06.278563592 +0000 UTC m=+0.097619763 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.114 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.139 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.140 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.141 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.141 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.141 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:46:08 np0005476733 nova_compute[192580]: 2025-10-08 15:46:08.274 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:10 np0005476733 nova_compute[192580]: 2025-10-08 15:46:10.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:15 np0005476733 podman[240739]: 2025-10-08 15:46:15.243522871 +0000 UTC m=+0.065349791 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:46:15 np0005476733 podman[240740]: 2025-10-08 15:46:15.248050656 +0000 UTC m=+0.066219589 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:46:15 np0005476733 podman[240738]: 2025-10-08 15:46:15.267209669 +0000 UTC m=+0.093486072 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 11:46:15 np0005476733 nova_compute[192580]: 2025-10-08 15:46:15.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:16 np0005476733 nova_compute[192580]: 2025-10-08 15:46:16.579 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:18 np0005476733 nova_compute[192580]: 2025-10-08 15:46:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:20 np0005476733 nova_compute[192580]: 2025-10-08 15:46:20.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:23 np0005476733 podman[240803]: 2025-10-08 15:46:23.23538077 +0000 UTC m=+0.060790516 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:46:23 np0005476733 podman[240802]: 2025-10-08 15:46:23.263335654 +0000 UTC m=+0.088630756 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:46:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:46:25Z|00585|pinctrl|WARN|Dropped 779 log messages in last 61 seconds (most recently, 17 seconds ago) due to excessive rate
Oct  8 11:46:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:46:25Z|00586|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:46:25 np0005476733 nova_compute[192580]: 2025-10-08 15:46:25.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:25 np0005476733 nova_compute[192580]: 2025-10-08 15:46:25.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:46:26.333 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:46:26.334 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:46:26.335 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:30 np0005476733 nova_compute[192580]: 2025-10-08 15:46:30.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:46:33 np0005476733 podman[240846]: 2025-10-08 15:46:33.263653874 +0000 UTC m=+0.074409371 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:46:35 np0005476733 nova_compute[192580]: 2025-10-08 15:46:35.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.015 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'name': 'tempest-test_qos_after_cold_migration-1645830914', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.020 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for df287684-9151-42eb-8ff2-01e29a07e1e1 / tapd02e7451-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.021 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.021 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eb5d4c1-18dc-4853-b1a7-c7f584389318', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.016167', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8b81e8c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '4d7c0a646845cbd90b557e4fbeb5ef90f93fba3281540aaf604c7a5e00bb2874'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.016167', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8b82e5e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '521bceb8e9daa2b09dc8b6e3769c241c5044e0ae38f9950bd9b2fffa653824d6'}]}, 'timestamp': '2025-10-08 15:46:36.022112', '_unique_id': 'cd18ae7127fb411098a4dda0ad05a45b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.023 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.039 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.usage volume: 152567808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.040 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '731e9db3-f136-45ce-8307-150548f539be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152567808, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.024863', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8baf8c8-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': '26d3cf44e4288e30d74dad90b61d681bbe08e2c22a459ddc463592b8b6685a1b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.024863', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8bb0926-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': '6bfb6eb79d3b8a4cf5f91e40158d3ee39cd12e6daa7e240945eb6ce7867592fb'}]}, 'timestamp': '2025-10-08 15:46:36.040798', '_unique_id': 'ac8809577d88415ea8d5c77590376028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.042 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.043 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.043 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af95e128-0352-4aeb-92d2-df6512e73cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.043393', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8bb7ca8-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '05666713ff92b2d65f9e952989eccdfed31cf93c20061f1b722eebcb5b711979'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.043393', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8bb8950-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'a19962fe7da5398977cefcb2d8e8d2cbec09273e4c52ab6eaff704b4d6699c14'}]}, 'timestamp': '2025-10-08 15:46:36.044064', '_unique_id': '510a4950b83e4026a03e37540aaa4247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.044 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.045 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.063 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/cpu volume: 43570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '729cba17-0cfe-4d08-94ac-294aaca340de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43570000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'timestamp': '2025-10-08T15:46:36.045835', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'f8be9f96-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.786435664, 'message_signature': '6c7ffeff480d3bd7f3b8b9f7a605145cc122aff18d5a9fe552618a953bbd4c4f'}]}, 'timestamp': '2025-10-08 15:46:36.064414', '_unique_id': '0d5ad4683600429c830da1b99223c39b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.088 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.latency volume: 7731326325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.089 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.latency volume: 48363422 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bffe15b6-217b-45e2-829b-792af7d7261d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7731326325, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.067334', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c2799a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '98adc635d0b43932d97d4c93c6cdcb88762b701426e4c351c05757220365e2aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48363422, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.067334', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c288fe-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': 'e9fa88d879405e8122a4712f9b0d58605c80d5eb903520e61b666841e34817d4'}]}, 'timestamp': '2025-10-08 15:46:36.089936', '_unique_id': '1475a49fefdb4097989c2f2a8bc9c48e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.092 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.requests volume: 11702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.092 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9d7e8b9-4ba8-46f0-b242-acbd3df5f6ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11702, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.092597', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c2fda2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '7df9248f3a644753a9cf48066a4c8bc92062f6f8b60223780894c780ef0fa101'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.092597', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c3075c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '876ee06035b6257e05bb5a022c16c95bd79184ebefe5d2ca3a70dedb409eda54'}]}, 'timestamp': '2025-10-08 15:46:36.093137', '_unique_id': 'd7551fa6c7f3496097d8bc33ddccc937'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.094 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.requests volume: 858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.094 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba9e70c8-5285-4bdb-9bff-5675f4ad3d88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 858, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.094415', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c342c6-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '548b12552d5633288f36eb1c9cd0f0b76ae9791016cd8c47275609be863514cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.094415', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c34aa0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': 'b6ab098ffde535dffeafe8435d35fffebf26c1fe5316187889d8e4af7a2f60b4'}]}, 'timestamp': '2025-10-08 15:46:36.094829', '_unique_id': '881be71da2aa4ce7aef9b25fee3c088b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.096 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.096 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dfad20a-c22d-4c93-acae-775a601a7bee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.096150', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c386f0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '9d5c3b2b3743e5341be101c00f72c8b4b38c5a4c24692d0d16a81706a649f7da'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.096150', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c38f4c-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '0fb24b56fab6e16955812dd3462c10396e05946320842fd5dee015e06f5205f8'}]}, 'timestamp': '2025-10-08 15:46:36.096626', '_unique_id': '0a8658ac2a034fedb0e6dd4139961a76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.097 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes volume: 107958 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes volume: 45629 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0929bb02-7f58-4dcf-9639-2023cccdf757', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 107958, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.097942', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c3ce58-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '293e4529c0523ce90ec8284e57627560483b76d04faad3ca56c79e745e1360f9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 45629, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.097942', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c3d74a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '470842e0c50a1516816a801d56fd9183f811b162985e16da98616b67f8774505'}]}, 'timestamp': '2025-10-08 15:46:36.098438', '_unique_id': 'a7017961e8a74cf997d3fa1ab709b6f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.099 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets volume: 1481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.099 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.packets volume: 168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50a9afee-87e3-4619-bc4e-a20249afcf86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1481, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.099651', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c40f58-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '9165fb1d11c692265ff454c365aad435bf214b84c7c42b8b6d488df708920458'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 168, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.099651', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c4176e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '41a35b684bdc8cb6a502b65ed7a8f3104406d54cce9740d3ef6b88bdd9c1536f'}]}, 'timestamp': '2025-10-08 15:46:36.100079', '_unique_id': 'b946d04bc53c41db99e5b542f8abdc2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.101 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.bytes volume: 331109888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.101 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd21810a9-7105-4ef6-a6a4-96bd3795b488', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331109888, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.101187', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c44b26-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': 'f5ae7af4e3032ad7e1377e3658d971101f49959f4d754702804e6158fa13c7c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.101187', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c452c4-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': 'dbbab95121c21e952d627d5724758ea07ba50f2e624cae204730a86fb90c4cba'}]}, 'timestamp': '2025-10-08 15:46:36.101606', '_unique_id': '47482efa30b0462a99287ccd705a35f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.102 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '582abc8f-e64e-4c88-8f1c-bbc93172ed23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.102729', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c487f8-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '71cc1fe4700cb937c3061621a5b7092ebd3776f65b4c45ec4da63df638c7471a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.102729', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c49158-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'dc8f8545bbe902e5c720a255a2332383d144f136ec51c728572dd484ca98f6cb'}]}, 'timestamp': '2025-10-08 15:46:36.103224', '_unique_id': 'f90c7789c76045c791f051632c7d7e8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.104 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.bytes volume: 137610752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.104 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e6b41e1-f729-475d-ad15-53462fc19f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 137610752, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.104541', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c4ce2a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '8bd5b0bbe49ca479d6c451f4202de936d3a7e476b658b4535dc4bed9891c8f6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.104541', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c4d60e-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '1c3e67581e0521fd4e7c93b05e8b909fc87b3562430c2926dd00971e43cef447'}]}, 'timestamp': '2025-10-08 15:46:36.104952', '_unique_id': '479136ff4f3f4b9c8cfd409db7b1dbec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.106 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes.delta volume: 107958 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.106 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e36b016-90a8-4a6a-9681-eaff475758df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 107958, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.106133', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c50ca0-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '5b0968d3557c9e2e1951701f165cc6c652a4b6d7922eea5d781facc4da8f2d01'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.106133', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c514a2-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'df83d4ebb27a806af696a18b0034ecfed73bb6afe240191f4c21010164442369'}]}, 'timestamp': '2025-10-08 15:46:36.106560', '_unique_id': 'dcaa9b8d0bd240c79bc9231d1e0823c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.107 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.latency volume: 12653202442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fb095ed-3fd5-4a60-a50f-8f8d5c15a467', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12653202442, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.107748', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c54d0a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': 'e77fd9a1457ffec3c537e541221df503dc6295bc167b01f87d4c14322cc4328b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.107748', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c555ca-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.790323727, 'message_signature': '674538d02f91a33a9d684c02b989c60253d26241a3e8b283d14207a9c9770856'}]}, 'timestamp': '2025-10-08 15:46:36.108221', '_unique_id': '91bec649b7f74bbdac82c81aa8bb8032'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.109 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets volume: 996 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.109 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.outgoing.packets volume: 164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '539a9996-ca8b-42d0-bd0e-ab128256bf64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 996, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.109525', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c5912a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'e6abb00cf5ef0d411c77906501bb34b0b6e34eb88cacb43e7ead298a82435f20'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 164, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.109525', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c599ae-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '201c0bcf99634f0b1e37d771249bcc9c37805a64c691bd116d9676138f483c0d'}]}, 'timestamp': '2025-10-08 15:46:36.109965', '_unique_id': '673bd0c3c6b749a5a7800aeab2aa894d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.111 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.111 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e951d0e-31a5-42f5-9fab-cfb29cf2705f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.111049', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c5ce42-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': 'fd6569356dc79c71daa581b754e63fc36b5c0b9fb63f363ac0a3ecde15a270bd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.111049', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c5d626-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': 'b63b7f594cabc582bc27fa4897442aa5beec9ebf33ddd0f20a5ebc7047f0644d'}]}, 'timestamp': '2025-10-08 15:46:36.111509', '_unique_id': '977199e686014810a5183a5694f21da0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.112 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes volume: 16009337 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes volume: 29591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '225631b3-6835-44f9-bb09-3e773d0db119', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16009337, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.112838', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c61276-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'a549956317c3338f069e7b60c0364a4d7696bbbe1e5ece57d8daa78b267e43c3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 29591, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.112838', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c61b4a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '711e978986ed1a5757c3fc2ea5a712084b2ed48d95c42c2c25a3c26edcd8cee0'}]}, 'timestamp': '2025-10-08 15:46:36.113285', '_unique_id': 'dc40e3c2e3e542cb8e48e225d8556364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.114 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes.delta volume: 16009227 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.114 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '814e34e4-aa1e-4715-8d87-66aaf2014ef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 16009227, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tap8a48fdf3-22', 'timestamp': '2025-10-08T15:46:36.114382', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tap8a48fdf3-22', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:36:af:68', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a48fdf3-22'}, 'message_id': 'f8c64eda-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': '266f025d06f7f08cdbf18e9f4facf59d8fe6ad70e5e977d6f70763de49c00fa7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000047-df287684-9151-42eb-8ff2-01e29a07e1e1-tapd02e7451-3e', 'timestamp': '2025-10-08T15:46:36.114382', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'tapd02e7451-3e', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:81:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd02e7451-3e'}, 'message_id': 'f8c65704-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.739153131, 'message_signature': 'a209740428406b5504a6079d00c38dd22a50ffeb7e720da0e7866ba1ee22ef2e'}]}, 'timestamp': '2025-10-08 15:46:36.114814', '_unique_id': '88bcc1be655b49ab9d21e3e0a179a395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4873dfc1-e0b0-427f-92f8-f42c643521be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-vda', 'timestamp': '2025-10-08T15:46:36.116022', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f8c68f62-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': 'beb2d1c2c121e0e215d24dea2af2ad21c668d89d621a2fcfdbe5d7b1013d6b4d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1-sda', 'timestamp': '2025-10-08T15:46:36.116022', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f8c697aa-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.74787129, 'message_signature': 'd0ebc4555336b9c9e967b020f0813f9ec583ae3c65ce30b30a4230c858e4935b'}]}, 'timestamp': '2025-10-08 15:46:36.116489', '_unique_id': 'b6f93243235b479ebba563f049957f1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.117 12 DEBUG ceilometer.compute.pollsters [-] df287684-9151-42eb-8ff2-01e29a07e1e1/memory.usage volume: 270.2109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea49bb7c-cee4-4d03-9244-a1ede38266bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 270.2109375, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'timestamp': '2025-10-08T15:46:36.117596', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1645830914', 'name': 'instance-00000047', 'instance_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'f8c6cc2a-a45d-11f0-9274-fa163ef67048', 'monotonic_time': 5299.786435664, 'message_signature': '6ffaa340ceac81eb1e56925f33f06d722c7bf2cb0074a335630da1da44d4008b'}]}, 'timestamp': '2025-10-08 15:46:36.117812', '_unique_id': 'de950f9a219b41db86a46ee12e43360c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:46:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:46:37 np0005476733 podman[240865]: 2025-10-08 15:46:37.243279332 +0000 UTC m=+0.070020691 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:46:37 np0005476733 podman[240864]: 2025-10-08 15:46:37.304182589 +0000 UTC m=+0.125967329 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:46:38 np0005476733 ovn_controller[94857]: 2025-10-08T15:46:38Z|00587|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 11:46:40 np0005476733 nova_compute[192580]: 2025-10-08 15:46:40.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:42 np0005476733 nova_compute[192580]: 2025-10-08 15:46:42.227 2 DEBUG nova.compute.manager [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Oct  8 11:46:42 np0005476733 nova_compute[192580]: 2025-10-08 15:46:42.646 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:42 np0005476733 nova_compute[192580]: 2025-10-08 15:46:42.647 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:42 np0005476733 nova_compute[192580]: 2025-10-08 15:46:42.920 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'pci_requests' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:46:43 np0005476733 nova_compute[192580]: 2025-10-08 15:46:43.149 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:46:43 np0005476733 nova_compute[192580]: 2025-10-08 15:46:43.150 2 INFO nova.compute.claims [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:46:43 np0005476733 nova_compute[192580]: 2025-10-08 15:46:43.151 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'resources' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:46:43 np0005476733 nova_compute[192580]: 2025-10-08 15:46:43.327 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'numa_topology' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:46:43 np0005476733 nova_compute[192580]: 2025-10-08 15:46:43.567 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'pci_devices' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:46:44 np0005476733 nova_compute[192580]: 2025-10-08 15:46:44.135 2 INFO nova.compute.resource_tracker [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating resource usage from migration 1be60ce2-4e16-4e22-83ab-7f33b64c6f62#033[00m
Oct  8 11:46:44 np0005476733 nova_compute[192580]: 2025-10-08 15:46:44.135 2 DEBUG nova.compute.resource_tracker [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Starting to track incoming migration 1be60ce2-4e16-4e22-83ab-7f33b64c6f62 with flavor 22222222-2222-2222-2222-222222222222 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  8 11:46:44 np0005476733 nova_compute[192580]: 2025-10-08 15:46:44.458 2 DEBUG nova.compute.provider_tree [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:46:44 np0005476733 nova_compute[192580]: 2025-10-08 15:46:44.628 2 DEBUG nova.scheduler.client.report [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.003 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.004 2 INFO nova.compute.manager [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Migrating#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.004 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.005 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.052 2 INFO nova.compute.rpcapi [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.054 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:46:45 np0005476733 nova_compute[192580]: 2025-10-08 15:46:45.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:46 np0005476733 podman[240913]: 2025-10-08 15:46:46.229174358 +0000 UTC m=+0.054883936 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:46:46 np0005476733 podman[240914]: 2025-10-08 15:46:46.248258939 +0000 UTC m=+0.070325060 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Oct  8 11:46:46 np0005476733 podman[240912]: 2025-10-08 15:46:46.260004494 +0000 UTC m=+0.079521684 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:46:49 np0005476733 systemd[1]: Created slice User Slice of UID 42436.
Oct  8 11:46:49 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  8 11:46:49 np0005476733 systemd-logind[827]: New session 41 of user nova.
Oct  8 11:46:49 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  8 11:46:49 np0005476733 systemd[1]: Starting User Manager for UID 42436...
Oct  8 11:46:49 np0005476733 systemd[240979]: Queued start job for default target Main User Target.
Oct  8 11:46:49 np0005476733 systemd[240979]: Created slice User Application Slice.
Oct  8 11:46:49 np0005476733 systemd[240979]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 11:46:49 np0005476733 systemd[240979]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 11:46:49 np0005476733 systemd[240979]: Reached target Paths.
Oct  8 11:46:49 np0005476733 systemd[240979]: Reached target Timers.
Oct  8 11:46:49 np0005476733 systemd[240979]: Starting D-Bus User Message Bus Socket...
Oct  8 11:46:49 np0005476733 systemd[240979]: Starting Create User's Volatile Files and Directories...
Oct  8 11:46:49 np0005476733 systemd[240979]: Finished Create User's Volatile Files and Directories.
Oct  8 11:46:49 np0005476733 systemd[240979]: Listening on D-Bus User Message Bus Socket.
Oct  8 11:46:49 np0005476733 systemd[240979]: Reached target Sockets.
Oct  8 11:46:49 np0005476733 systemd[240979]: Reached target Basic System.
Oct  8 11:46:49 np0005476733 systemd[240979]: Reached target Main User Target.
Oct  8 11:46:49 np0005476733 systemd[240979]: Startup finished in 165ms.
Oct  8 11:46:49 np0005476733 systemd[1]: Started User Manager for UID 42436.
Oct  8 11:46:49 np0005476733 systemd[1]: Started Session 41 of User nova.
Oct  8 11:46:49 np0005476733 systemd-logind[827]: Session 41 logged out. Waiting for processes to exit.
Oct  8 11:46:49 np0005476733 systemd[1]: session-41.scope: Deactivated successfully.
Oct  8 11:46:49 np0005476733 systemd-logind[827]: Removed session 41.
Oct  8 11:46:50 np0005476733 systemd-logind[827]: New session 43 of user nova.
Oct  8 11:46:50 np0005476733 systemd[1]: Started Session 43 of User nova.
Oct  8 11:46:50 np0005476733 systemd[1]: session-43.scope: Deactivated successfully.
Oct  8 11:46:50 np0005476733 systemd-logind[827]: Session 43 logged out. Waiting for processes to exit.
Oct  8 11:46:50 np0005476733 systemd-logind[827]: Removed session 43.
Oct  8 11:46:50 np0005476733 nova_compute[192580]: 2025-10-08 15:46:50.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:46:51 np0005476733 systemd-logind[827]: New session 44 of user nova.
Oct  8 11:46:51 np0005476733 systemd[1]: Started Session 44 of User nova.
Oct  8 11:46:53 np0005476733 systemd[1]: session-44.scope: Deactivated successfully.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Session 44 logged out. Waiting for processes to exit.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Removed session 44.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: New session 45 of user nova.
Oct  8 11:46:53 np0005476733 systemd[1]: Started Session 45 of User nova.
Oct  8 11:46:53 np0005476733 systemd[1]: session-45.scope: Deactivated successfully.
Oct  8 11:46:53 np0005476733 podman[241010]: 2025-10-08 15:46:53.374580778 +0000 UTC m=+0.085148054 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Session 45 logged out. Waiting for processes to exit.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Removed session 45.
Oct  8 11:46:53 np0005476733 podman[241011]: 2025-10-08 15:46:53.378274606 +0000 UTC m=+0.087051335 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  8 11:46:53 np0005476733 systemd-logind[827]: New session 46 of user nova.
Oct  8 11:46:53 np0005476733 systemd[1]: Started Session 46 of User nova.
Oct  8 11:46:53 np0005476733 nova_compute[192580]: 2025-10-08 15:46:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:53 np0005476733 systemd[1]: session-46.scope: Deactivated successfully.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Session 46 logged out. Waiting for processes to exit.
Oct  8 11:46:53 np0005476733 systemd-logind[827]: Removed session 46.
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.614 2 DEBUG nova.compute.manager [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.614 2 DEBUG oslo_concurrency.lockutils [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.615 2 DEBUG oslo_concurrency.lockutils [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.615 2 DEBUG oslo_concurrency.lockutils [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.615 2 DEBUG nova.compute.manager [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.615 2 WARNING nova.compute.manager [req-b088587d-a79e-4c23-8b9f-7ac0a22e1aac req-668cb64b-e6c3-4467-a434-305d30a560bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  8 11:46:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:46:54.862 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:46:54.864 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:46:54 np0005476733 nova_compute[192580]: 2025-10-08 15:46:54.915 2 INFO nova.network.neutron [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating port 4e3afd85-f9b7-45ee-b86c-5db61eaec58c with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  8 11:46:55 np0005476733 nova_compute[192580]: 2025-10-08 15:46:55.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.740 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.740 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.740 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.741 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.741 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.741 2 WARNING nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.741 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.742 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.742 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.742 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.742 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.742 2 WARNING nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.743 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.743 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.743 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.743 2 DEBUG oslo_concurrency.lockutils [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.743 2 DEBUG nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:46:56 np0005476733 nova_compute[192580]: 2025-10-08 15:46:56.744 2 WARNING nova.compute.manager [req-68111b31-ffac-4662-b7cf-d37e7b8b8422 req-4216bd92-94d0-4b6d-8ef4-362bdb3f2fc1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  8 11:46:57 np0005476733 nova_compute[192580]: 2025-10-08 15:46:57.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.093 2 INFO nova.network.neutron [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating port 6771bc83-98b8-4b06-8442-9bb11777cdc6 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.505 2 DEBUG nova.compute.manager [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-changed-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.506 2 DEBUG nova.compute.manager [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Refreshing instance network info cache due to event network-changed-4e3afd85-f9b7-45ee-b86c-5db61eaec58c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.507 2 DEBUG oslo_concurrency.lockutils [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.507 2 DEBUG oslo_concurrency.lockutils [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:46:58 np0005476733 nova_compute[192580]: 2025-10-08 15:46:58.507 2 DEBUG nova.network.neutron [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Refreshing network info cache for port 4e3afd85-f9b7-45ee-b86c-5db61eaec58c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.085 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.383 2 DEBUG nova.network.neutron [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updated VIF entry in instance network info cache for port 4e3afd85-f9b7-45ee-b86c-5db61eaec58c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.384 2 DEBUG nova.network.neutron [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.577 2 DEBUG oslo_concurrency.lockutils [req-234fbc6a-bd29-4c13-b51e-dd564580f502 req-6b7f7fd5-4a6a-410f-971e-d181254b461f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.578 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.578 2 DEBUG nova.network.neutron [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.628 2 DEBUG nova.compute.manager [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-changed-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.628 2 DEBUG nova.compute.manager [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Refreshing instance network info cache due to event network-changed-6771bc83-98b8-4b06-8442-9bb11777cdc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.629 2 DEBUG oslo_concurrency.lockutils [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 11:47:00 np0005476733 nova_compute[192580]: 2025-10-08 15:47:00.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:01 np0005476733 nova_compute[192580]: 2025-10-08 15:47:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:47:03 np0005476733 systemd[1]: Stopping User Manager for UID 42436...
Oct  8 11:47:03 np0005476733 systemd[240979]: Activating special unit Exit the Session...
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped target Main User Target.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped target Basic System.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped target Paths.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped target Sockets.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped target Timers.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 11:47:03 np0005476733 systemd[240979]: Closed D-Bus User Message Bus Socket.
Oct  8 11:47:03 np0005476733 systemd[240979]: Stopped Create User's Volatile Files and Directories.
Oct  8 11:47:03 np0005476733 systemd[240979]: Removed slice User Application Slice.
Oct  8 11:47:03 np0005476733 systemd[240979]: Reached target Shutdown.
Oct  8 11:47:03 np0005476733 systemd[240979]: Finished Exit the Session.
Oct  8 11:47:03 np0005476733 systemd[240979]: Reached target Exit the Session.
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.685 2 DEBUG nova.network.neutron [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:47:03 np0005476733 systemd[1]: user@42436.service: Deactivated successfully.
Oct  8 11:47:03 np0005476733 systemd[1]: Stopped User Manager for UID 42436.
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.701 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:47:03 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.705 2 DEBUG oslo_concurrency.lockutils [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.706 2 DEBUG nova.network.neutron [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Refreshing network info cache for port 6771bc83-98b8-4b06-8442-9bb11777cdc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:47:03 np0005476733 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  8 11:47:03 np0005476733 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  8 11:47:03 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  8 11:47:03 np0005476733 systemd[1]: Removed slice User Slice of UID 42436.
Oct  8 11:47:03 np0005476733 systemd[1]: user-42436.slice: Consumed 1.078s CPU time.
Oct  8 11:47:03 np0005476733 podman[241064]: 2025-10-08 15:47:03.762229715 +0000 UTC m=+0.075049652 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.854 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.857 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.857 2 INFO nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Creating image(s)#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.858 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:47:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:03.866 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.870 2 DEBUG oslo_concurrency.processutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.930 2 DEBUG oslo_concurrency.processutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.947 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.947 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Ensure instance console log exists: /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.948 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.948 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.948 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.951 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Start _get_guest_xml network_info=[{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:e8:4d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", 
"version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:b8:d2:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.956 2 WARNING nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.963 2 DEBUG nova.virt.libvirt.host [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.963 2 DEBUG nova.virt.libvirt.host [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.967 2 DEBUG nova.virt.libvirt.host [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.967 2 DEBUG nova.virt.libvirt.host [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.968 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.968 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.969 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.970 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.970 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.970 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.970 2 DEBUG nova.virt.hardware [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.970 2 DEBUG nova.objects.instance [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:47:03 np0005476733 nova_compute[192580]: 2025-10-08 15:47:03.993 2 DEBUG oslo_concurrency.processutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.052 2 DEBUG oslo_concurrency.processutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.053 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.053 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.054 2 DEBUG oslo_concurrency.lockutils [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.055 2 DEBUG nova.virt.libvirt.vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:46:54Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": 
"ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:e8:4d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.055 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:e8:4d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.056 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.057 2 DEBUG nova.virt.libvirt.vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:46:54Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": 
"58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:b8:d2:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.057 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:b8:d2:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.058 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.060 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <uuid>13616378-6f23-42e4-8c8c-5182a5056326</uuid>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <name>instance-00000046</name>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_qos_after_cold_migration-1644920499</nova:name>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:47:03</nova:creationTime>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:port uuid="4e3afd85-f9b7-45ee-b86c-5db61eaec58c">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.7.32" ipVersion="4"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        <nova:port uuid="6771bc83-98b8-4b06-8442-9bb11777cdc6">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="serial">13616378-6f23-42e4-8c8c-5182a5056326</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="uuid">13616378-6f23-42e4-8c8c-5182a5056326</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk.config"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:e8:4d:2e"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <target dev="tap4e3afd85-f9"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:b8:d2:73"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <target dev="tap6771bc83-98"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/console.log" append="off"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:47:04 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:47:04 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:47:04 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:47:04 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.061 2 DEBUG nova.virt.libvirt.vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:46:54Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": 
"ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:e8:4d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.061 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:e8:4d:2e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.062 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.062 2 DEBUG os_vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4e3afd85-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4e3afd85-f9, col_values=(('external_ids', {'iface-id': '4e3afd85-f9b7-45ee-b86c-5db61eaec58c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:4d:2e', 'vm-uuid': '13616378-6f23-42e4-8c8c-5182a5056326'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.0697] manager: (tap4e3afd85-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.076 2 INFO os_vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9')#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.077 2 DEBUG nova.virt.libvirt.vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:43:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:46:54Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": 
"58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:b8:d2:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.077 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:b8:d2:73"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.078 2 DEBUG nova.network.os_vif_util [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.078 2 DEBUG os_vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.079 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6771bc83-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6771bc83-98, col_values=(('external_ids', {'iface-id': '6771bc83-98b8-4b06-8442-9bb11777cdc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:d2:73', 'vm-uuid': '13616378-6f23-42e4-8c8c-5182a5056326'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.0828] manager: (tap6771bc83-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.094 2 INFO os_vif [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98')#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.160 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.161 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.161 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] No VIF found with MAC fa:16:3e:e8:4d:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.161 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] No VIF found with MAC fa:16:3e:b8:d2:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.161 2 INFO nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Using config drive#033[00m
Oct  8 11:47:04 np0005476733 kernel: tap4e3afd85-f9: entered promiscuous mode
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.2416] manager: (tap4e3afd85-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00588|binding|INFO|Claiming lport 4e3afd85-f9b7-45ee-b86c-5db61eaec58c for this chassis.
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00589|binding|INFO|4e3afd85-f9b7-45ee-b86c-5db61eaec58c: Claiming fa:16:3e:e8:4d:2e 192.168.7.32
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.255 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:4d:2e 192.168.7.32'], port_security=['fa:16:3e:e8:4d:2e 192.168.7.32'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.32/24', 'neutron:device_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac5383ee-65ae-4340-bb19-495c4991fae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c049a133-6546-4ba7-90ec-ddcac9cb5060, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4e3afd85-f9b7-45ee-b86c-5db61eaec58c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.257 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4e3afd85-f9b7-45ee-b86c-5db61eaec58c in datapath ac5383ee-65ae-4340-bb19-495c4991fae8 bound to our chassis#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.259 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac5383ee-65ae-4340-bb19-495c4991fae8#033[00m
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00590|binding|INFO|Setting lport 4e3afd85-f9b7-45ee-b86c-5db61eaec58c ovn-installed in OVS
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00591|binding|INFO|Setting lport 4e3afd85-f9b7-45ee-b86c-5db61eaec58c up in Southbound
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.2736] manager: (tap6771bc83-98): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 kernel: tap6771bc83-98: entered promiscuous mode
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00592|binding|INFO|Claiming lport 6771bc83-98b8-4b06-8442-9bb11777cdc6 for this chassis.
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00593|binding|INFO|6771bc83-98b8-4b06-8442-9bb11777cdc6: Claiming fa:16:3e:b8:d2:73 10.100.0.5
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.278 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[310a6693-617c-414c-b151-7144474a7da1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 systemd-udevd[241113]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:47:04 np0005476733 systemd-udevd[241112]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.284 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:d2:73 10.100.0.5'], port_security=['fa:16:3e:b8:d2:73 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6771bc83-98b8-4b06-8442-9bb11777cdc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00594|binding|INFO|Setting lport 6771bc83-98b8-4b06-8442-9bb11777cdc6 ovn-installed in OVS
Oct  8 11:47:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:04Z|00595|binding|INFO|Setting lport 6771bc83-98b8-4b06-8442-9bb11777cdc6 up in Southbound
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.3030] device (tap6771bc83-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.3047] device (tap4e3afd85-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.3058] device (tap6771bc83-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:47:04 np0005476733 NetworkManager[51699]: <info>  [1759938424.3065] device (tap4e3afd85-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.318 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf3461c-66f9-4b98-bfe6-2df1228c2e1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.323 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce25563-6f34-4d33-b012-2d7e3c78a7b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 systemd-machined[152624]: New machine qemu-41-instance-00000046.
Oct  8 11:47:04 np0005476733 systemd[1]: Started Virtual Machine qemu-41-instance-00000046.
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.355 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[34bad466-358d-4a78-8b1b-399d43c4b841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.377 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[682494fa-3880-4e64-99b9-0e14e9a95be3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac5383ee-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d5:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 5, 'rx_bytes': 958, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517659, 'reachable_time': 35557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241125, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.395 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9af16871-a07d-4f73-9452-275218233da1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac5383ee-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517673, 'tstamp': 517673}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241129, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.7.2'], ['IFA_LOCAL', '192.168.7.2'], ['IFA_BROADCAST', '192.168.7.255'], ['IFA_LABEL', 'tapac5383ee-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517676, 'tstamp': 517676}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241129, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.398 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac5383ee-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.402 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac5383ee-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.402 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.403 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac5383ee-60, col_values=(('external_ids', {'iface-id': '825d48bf-0cf9-4bc4-9da2-41ee106cd6a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.403 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.405 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6771bc83-98b8-4b06-8442-9bb11777cdc6 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.408 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.426 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[010d125c-31df-4da7-b558-d93bdad73d35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.462 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[335cc09b-f15a-440e-a2fe-057117edc1c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.466 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[566d34f5-a2c4-4ceb-aa5f-2cdd025fd2d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.500 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8176c73d-cb4f-46cf-9133-5b40ee0544eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.519 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfb91b8-3bcd-43ee-9197-bcea3b2a1441]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523954, 'reachable_time': 38821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241144, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.542 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[77b91dbd-ada7-4721-b2cf-879ab1681cc3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58a69152-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523968, 'tstamp': 523968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241145, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58a69152-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523972, 'tstamp': 523972}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241145, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.544 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.548 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.548 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.549 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:04.549 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.662 2 DEBUG nova.compute.manager [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.664 2 DEBUG oslo_concurrency.lockutils [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.665 2 DEBUG oslo_concurrency.lockutils [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.666 2 DEBUG oslo_concurrency.lockutils [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.666 2 DEBUG nova.compute.manager [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:04 np0005476733 nova_compute[192580]: 2025-10-08 15:47:04.666 2 WARNING nova.compute.manager [req-89dadd47-92a1-4736-a324-2b2c46d5664a req-fa5a92ee-2627-4f16-96b4-2eec4dfbe72f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with vm_state active and task_state resize_finish.#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.015 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938425.0153348, 13616378-6f23-42e4-8c8c-5182a5056326 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.016 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.019 2 DEBUG nova.compute.manager [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.022 2 INFO nova.virt.libvirt.driver [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Instance running successfully.#033[00m
Oct  8 11:47:05 np0005476733 virtqemud[192152]: argument unsupported: QEMU guest agent is not configured
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.026 2 DEBUG nova.virt.libvirt.guest [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.026 2 DEBUG nova.virt.libvirt.driver [None req-26187459-03c7-460a-ba9a-a52e6a8a38e5 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.051 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.055 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.103 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.103 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938425.0159495, 13616378-6f23-42e4-8c8c-5182a5056326 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.104 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] VM Started (Lifecycle Event)#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.265 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.269 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.663 2 DEBUG nova.network.neutron [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updated VIF entry in instance network info cache for port 6771bc83-98b8-4b06-8442-9bb11777cdc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.665 2 DEBUG nova.network.neutron [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.705 2 DEBUG oslo_concurrency.lockutils [req-989b610c-7894-4764-ba82-4ba52ec3e40b req-6ba473ff-e0d6-445c-a8d5-e3da65d8269a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.707 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.707 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.708 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:47:05 np0005476733 nova_compute[192580]: 2025-10-08 15:47:05.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.755 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.756 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.756 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.757 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.757 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.758 2 WARNING nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.758 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.759 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.759 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.759 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.760 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.760 2 WARNING nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.761 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.761 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.762 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.762 2 DEBUG oslo_concurrency.lockutils [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.763 2 DEBUG nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:06 np0005476733 nova_compute[192580]: 2025-10-08 15:47:06.763 2 WARNING nova.compute.manager [req-f6a7e604-950b-4906-8900-a16dad54ca94 req-e3edddec-7e21-49ca-8c54-afc1d11baea9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.008 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.030 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.031 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.032 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.032 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.032 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.033 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.059 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.060 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.061 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.061 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.138 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:08 np0005476733 podman[241149]: 2025-10-08 15:47:08.224787317 +0000 UTC m=+0.100169245 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.226 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.229 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:08 np0005476733 podman[241148]: 2025-10-08 15:47:08.244883419 +0000 UTC m=+0.120819744 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.286 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.293 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.355 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.357 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.421 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.607 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.608 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12767MB free_disk=111.04684448242188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.674 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Applying migration context for instance 13616378-6f23-42e4-8c8c-5182a5056326 as it has an incoming, in-progress migration 1be60ce2-4e16-4e22-83ab-7f33b64c6f62. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.676 2 INFO nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating resource usage from migration 1be60ce2-4e16-4e22-83ab-7f33b64c6f62#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.701 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance df287684-9151-42eb-8ff2-01e29a07e1e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.702 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 13616378-6f23-42e4-8c8c-5182a5056326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.702 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.702 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.766 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.783 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.814 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 11:47:08 np0005476733 nova_compute[192580]: 2025-10-08 15:47:08.815 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:47:09 np0005476733 nova_compute[192580]: 2025-10-08 15:47:09.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:10 np0005476733 nova_compute[192580]: 2025-10-08 15:47:10.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:10 np0005476733 nova_compute[192580]: 2025-10-08 15:47:10.807 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:47:14 np0005476733 nova_compute[192580]: 2025-10-08 15:47:14.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:15 np0005476733 nova_compute[192580]: 2025-10-08 15:47:15.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:17 np0005476733 podman[241206]: 2025-10-08 15:47:17.265144616 +0000 UTC m=+0.078823881 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:47:17 np0005476733 podman[241205]: 2025-10-08 15:47:17.282646146 +0000 UTC m=+0.098184911 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:47:17 np0005476733 podman[241207]: 2025-10-08 15:47:17.297676287 +0000 UTC m=+0.099205134 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:47:19 np0005476733 nova_compute[192580]: 2025-10-08 15:47:19.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:19 np0005476733 nova_compute[192580]: 2025-10-08 15:47:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:47:20 np0005476733 nova_compute[192580]: 2025-10-08 15:47:20.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:24 np0005476733 nova_compute[192580]: 2025-10-08 15:47:24.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:24 np0005476733 podman[241279]: 2025-10-08 15:47:24.245235009 +0000 UTC m=+0.061569490 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:47:24 np0005476733 podman[241278]: 2025-10-08 15:47:24.257299235 +0000 UTC m=+0.078038087 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 11:47:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:25Z|00596|pinctrl|WARN|Dropped 859 log messages in last 60 seconds (most recently, 18 seconds ago) due to excessive rate
Oct  8 11:47:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:25Z|00597|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:47:25 np0005476733 nova_compute[192580]: 2025-10-08 15:47:25.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:26.334 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:26.335 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:26.337 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:47:29 np0005476733 nova_compute[192580]: 2025-10-08 15:47:29.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:29Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:4d:2e 192.168.7.32
Oct  8 11:47:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:29Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:d2:73 10.100.0.5
Oct  8 11:47:30 np0005476733 nova_compute[192580]: 2025-10-08 15:47:30.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:34 np0005476733 nova_compute[192580]: 2025-10-08 15:47:34.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:34 np0005476733 podman[241322]: 2025-10-08 15:47:34.233328087 +0000 UTC m=+0.063823082 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:47:34 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:34Z|00598|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Oct  8 11:47:35 np0005476733 nova_compute[192580]: 2025-10-08 15:47:35.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:39 np0005476733 nova_compute[192580]: 2025-10-08 15:47:39.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:39 np0005476733 podman[241343]: 2025-10-08 15:47:39.25255768 +0000 UTC m=+0.078961236 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 11:47:39 np0005476733 podman[241342]: 2025-10-08 15:47:39.26695494 +0000 UTC m=+0.097034824 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:47:40 np0005476733 nova_compute[192580]: 2025-10-08 15:47:40.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:44 np0005476733 nova_compute[192580]: 2025-10-08 15:47:44.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:45 np0005476733 nova_compute[192580]: 2025-10-08 15:47:45.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:47 np0005476733 nova_compute[192580]: 2025-10-08 15:47:47.016 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 11:47:47 np0005476733 nova_compute[192580]: 2025-10-08 15:47:47.017 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 11:47:47 np0005476733 nova_compute[192580]: 2025-10-08 15:47:47.017 2 DEBUG nova.network.neutron [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 11:47:48 np0005476733 podman[241389]: 2025-10-08 15:47:48.248260652 +0000 UTC m=+0.071109884 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:47:48 np0005476733 podman[241390]: 2025-10-08 15:47:48.259358367 +0000 UTC m=+0.073566393 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 11:47:48 np0005476733 podman[241388]: 2025-10-08 15:47:48.259945936 +0000 UTC m=+0.090136164 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:47:49 np0005476733 nova_compute[192580]: 2025-10-08 15:47:49.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.232 2 DEBUG nova.network.neutron [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.291 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.553 2 DEBUG nova.virt.libvirt.driver [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.554 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Creating file /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/d1f2a951952e4a5c8cb1082cc6a3128b.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.554 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/d1f2a951952e4a5c8cb1082cc6a3128b.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.970 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/d1f2a951952e4a5c8cb1082cc6a3128b.tmp" returned: 1 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.971 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/d1f2a951952e4a5c8cb1082cc6a3128b.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.971 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Creating directory /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  8 11:47:50 np0005476733 nova_compute[192580]: 2025-10-08 15:47:50.972 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:51 np0005476733 nova_compute[192580]: 2025-10-08 15:47:51.183 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:51 np0005476733 nova_compute[192580]: 2025-10-08 15:47:51.190 2 DEBUG nova.virt.libvirt.driver [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.201 2 INFO nova.virt.libvirt.driver [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance shutdown successfully after 1 seconds.#033[00m
Oct  8 11:47:52 np0005476733 kernel: tap8a48fdf3-22 (unregistering): left promiscuous mode
Oct  8 11:47:52 np0005476733 NetworkManager[51699]: <info>  [1759938472.2105] device (tap8a48fdf3-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00599|binding|INFO|Releasing lport 8a48fdf3-2293-49fb-81c8-b558651c0274 from this chassis (sb_readonly=0)
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00600|binding|INFO|Setting lport 8a48fdf3-2293-49fb-81c8-b558651c0274 down in Southbound
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00601|binding|INFO|Removing iface tap8a48fdf3-22 ovn-installed in OVS
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.234 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:af:68 192.168.7.38'], port_security=['fa:16:3e:36:af:68 192.168.7.38'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.38/24', 'neutron:device_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac5383ee-65ae-4340-bb19-495c4991fae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c049a133-6546-4ba7-90ec-ddcac9cb5060, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8a48fdf3-2293-49fb-81c8-b558651c0274) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.235 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8a48fdf3-2293-49fb-81c8-b558651c0274 in datapath ac5383ee-65ae-4340-bb19-495c4991fae8 unbound from our chassis#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.237 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac5383ee-65ae-4340-bb19-495c4991fae8#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 kernel: tapd02e7451-3e (unregistering): left promiscuous mode
Oct  8 11:47:52 np0005476733 NetworkManager[51699]: <info>  [1759938472.2545] device (tapd02e7451-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.254 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c29ec22f-e7b7-4b60-af1e-7f85bd05a0c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00602|binding|INFO|Releasing lport d02e7451-3ef3-44f1-b34a-5c7cbee26989 from this chassis (sb_readonly=0)
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00603|binding|INFO|Setting lport d02e7451-3ef3-44f1-b34a-5c7cbee26989 down in Southbound
Oct  8 11:47:52 np0005476733 ovn_controller[94857]: 2025-10-08T15:47:52Z|00604|binding|INFO|Removing iface tapd02e7451-3e ovn-installed in OVS
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.281 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:81:a1 10.100.0.10'], port_security=['fa:16:3e:f7:81:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'df287684-9151-42eb-8ff2-01e29a07e1e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=d02e7451-3ef3-44f1-b34a-5c7cbee26989) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.287 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[cf883cfe-feab-4c5d-8329-384bf74d1a68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.291 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fc236098-c98e-428e-98f0-42d9e6ff61d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.319 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[69ab1bf4-98cd-4a02-8d85-7494b3d2ab0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000047.scope: Deactivated successfully.
Oct  8 11:47:52 np0005476733 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000047.scope: Consumed 50.190s CPU time.
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.338 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8676da56-2068-42f0-b982-d33f7a46ccab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac5383ee-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:d5:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 7, 'rx_bytes': 1084, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517659, 'reachable_time': 35557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241471, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 systemd-machined[152624]: Machine qemu-40-instance-00000047 terminated.
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.359 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0916be-995c-49d3-8676-ffa5d8b39d99]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac5383ee-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517673, 'tstamp': 517673}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241472, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.7.2'], ['IFA_LOCAL', '192.168.7.2'], ['IFA_BROADCAST', '192.168.7.255'], ['IFA_LABEL', 'tapac5383ee-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517676, 'tstamp': 517676}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241472, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.361 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac5383ee-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.371 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac5383ee-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.372 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.372 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac5383ee-60, col_values=(('external_ids', {'iface-id': '825d48bf-0cf9-4bc4-9da2-41ee106cd6a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.373 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.374 103739 INFO neutron.agent.ovn.metadata.agent [-] Port d02e7451-3ef3-44f1-b34a-5c7cbee26989 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.376 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.394 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8aaf2d23-40ef-4bb6-87cc-f6428fec5474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.424 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[af7f3127-7652-4355-a139-5fcd52440b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.427 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2920ae13-13e5-4a18-99ca-2689248423e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 NetworkManager[51699]: <info>  [1759938472.4374] manager: (tapd02e7451-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.466 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9c555aab-ca61-433c-823c-7c85948b5fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.485 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b11d4c57-9897-496b-bdf2-4e8d9e624c15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 7, 'rx_bytes': 1042, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 7, 'rx_bytes': 1042, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523954, 'reachable_time': 38821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241503, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.500 2 INFO nova.virt.libvirt.driver [-] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Instance destroyed successfully.#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.502 2 DEBUG nova.virt.libvirt.vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:44:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:47:46Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": 
"ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:36:af:68"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.502 2 DEBUG nova.network.os_vif_util [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-test-network--238655981", "vif_mac": "fa:16:3e:36:af:68"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.503 2 DEBUG nova.network.os_vif_util [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.504 2 DEBUG os_vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.503 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[878d0bd5-b39c-4c67-bb5d-319b5251b239]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58a69152-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523968, 'tstamp': 523968}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241508, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58a69152-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 523972, 'tstamp': 523972}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241508, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.505 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.507 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a48fdf3-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.514 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.514 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.514 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:52.515 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.520 2 INFO os_vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22')#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.521 2 DEBUG nova.virt.libvirt.vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:44:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:47:46Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": 
"58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:f7:81:a1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.521 2 DEBUG nova.network.os_vif_util [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-tenant-ctl-network-648960884", "vif_mac": "fa:16:3e:f7:81:a1"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.522 2 DEBUG nova.network.os_vif_util [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.522 2 DEBUG os_vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd02e7451-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.529 2 INFO os_vif [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e')#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.534 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.586 2 DEBUG nova.compute.manager [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-unplugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.587 2 DEBUG oslo_concurrency.lockutils [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.587 2 DEBUG oslo_concurrency.lockutils [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.587 2 DEBUG oslo_concurrency.lockutils [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.588 2 DEBUG nova.compute.manager [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-unplugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.588 2 WARNING nova.compute.manager [req-ea0106c6-b8cc-4a7b-be0c-bc73e265d149 req-b7908932-c716-485d-80a8-745e4d6deef7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-unplugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.589 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.589 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.620 2 DEBUG nova.compute.manager [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-unplugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.621 2 DEBUG oslo_concurrency.lockutils [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.621 2 DEBUG oslo_concurrency.lockutils [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.621 2 DEBUG oslo_concurrency.lockutils [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.621 2 DEBUG nova.compute.manager [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-unplugged-8a48fdf3-2293-49fb-81c8-b558651c0274 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.621 2 WARNING nova.compute.manager [req-d065c8dd-1e11-40c8-bf3f-b761de821890 req-f4ea279b-6bc8-4b8f-9e32-f21c06b6a6b7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-unplugged-8a48fdf3-2293-49fb-81c8-b558651c0274 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.666 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.667 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Copying file /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk to 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 11:47:52 np0005476733 nova_compute[192580]: 2025-10-08 15:47:52.668 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.433 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "scp -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk" returned: 0 in 1.766s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.434 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Copying file /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.435 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.722 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "scp -C -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.config" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.724 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Copying file /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.724 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.766 2 DEBUG nova.compute.manager [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.768 2 DEBUG oslo_concurrency.lockutils [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.768 2 DEBUG oslo_concurrency.lockutils [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.769 2 DEBUG oslo_concurrency.lockutils [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.769 2 DEBUG nova.compute.manager [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.770 2 WARNING nova.compute.manager [req-f8a17773-adb8-4767-a287-8477a275c48e req-68c1ba6e-a0f5-4c29-8f30-6da5d1085d1a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.772 2 DEBUG nova.compute.manager [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.773 2 DEBUG oslo_concurrency.lockutils [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.773 2 DEBUG oslo_concurrency.lockutils [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.774 2 DEBUG oslo_concurrency.lockutils [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.774 2 DEBUG nova.compute.manager [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.775 2 WARNING nova.compute.manager [req-76f24cdb-4eea-44b9-bd50-b39237a9cca0 req-8b66bf16-4bdd-46eb-971e-bbd602ec50fe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 for instance with vm_state active and task_state resize_migrating.#033[00m
Oct  8 11:47:54 np0005476733 nova_compute[192580]: 2025-10-08 15:47:54.972 2 DEBUG oslo_concurrency.processutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "scp -C -r /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk.info" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:47:55 np0005476733 nova_compute[192580]: 2025-10-08 15:47:55.194 2 DEBUG neutronclient.v2_0.client [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 8a48fdf3-2293-49fb-81c8-b558651c0274 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  8 11:47:55 np0005476733 podman[241524]: 2025-10-08 15:47:55.280341889 +0000 UTC m=+0.088838512 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:47:55 np0005476733 podman[241525]: 2025-10-08 15:47:55.289934636 +0000 UTC m=+0.096470356 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:47:55 np0005476733 nova_compute[192580]: 2025-10-08 15:47:55.339 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:47:55 np0005476733 nova_compute[192580]: 2025-10-08 15:47:55.340 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:47:55 np0005476733 nova_compute[192580]: 2025-10-08 15:47:55.341 2 DEBUG oslo_concurrency.lockutils [None req-3e538a06-3782-4502-9f69-4c16d7e1954a 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:47:55 np0005476733 nova_compute[192580]: 2025-10-08 15:47:55.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:56.300 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:56.302 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.954 2 DEBUG nova.compute.manager [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.955 2 DEBUG nova.compute.manager [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-8a48fdf3-2293-49fb-81c8-b558651c0274. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.955 2 DEBUG oslo_concurrency.lockutils [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.955 2 DEBUG oslo_concurrency.lockutils [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:47:56 np0005476733 nova_compute[192580]: 2025-10-08 15:47:56.956 2 DEBUG nova.network.neutron [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:47:57 np0005476733 nova_compute[192580]: 2025-10-08 15:47:57.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:47:58 np0005476733 nova_compute[192580]: 2025-10-08 15:47:58.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:47:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:47:59.304 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:48:00 np0005476733 nova_compute[192580]: 2025-10-08 15:48:00.498 2 DEBUG nova.compute.manager [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:48:00 np0005476733 nova_compute[192580]: 2025-10-08 15:48:00.499 2 DEBUG nova.compute.manager [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing instance network info cache due to event network-changed-d02e7451-3ef3-44f1-b34a-5c7cbee26989. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:48:00 np0005476733 nova_compute[192580]: 2025-10-08 15:48:00.499 2 DEBUG oslo_concurrency.lockutils [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:48:00 np0005476733 nova_compute[192580]: 2025-10-08 15:48:00.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.464 2 DEBUG nova.network.neutron [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port 8a48fdf3-2293-49fb-81c8-b558651c0274. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.465 2 DEBUG nova.network.neutron [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.659 2 DEBUG oslo_concurrency.lockutils [req-8a9ea32b-3bf7-41b2-9880-08d7fc970b5b req-9f2fbd7c-0e4c-479a-b768-f158cade7755 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.661 2 DEBUG oslo_concurrency.lockutils [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:48:02 np0005476733 nova_compute[192580]: 2025-10-08 15:48:02.661 2 DEBUG nova.network.neutron [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Refreshing network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:48:03 np0005476733 nova_compute[192580]: 2025-10-08 15:48:03.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:03 np0005476733 nova_compute[192580]: 2025-10-08 15:48:03.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:48:03 np0005476733 nova_compute[192580]: 2025-10-08 15:48:03.690 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Oct  8 11:48:03 np0005476733 nova_compute[192580]: 2025-10-08 15:48:03.690 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:48:05 np0005476733 podman[241575]: 2025-10-08 15:48:05.242985663 +0000 UTC m=+0.067729827 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.628 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.629 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.884 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.976 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:48:05 np0005476733 nova_compute[192580]: 2025-10-08 15:48:05.978 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.038 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.047 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000047, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/df287684-9151-42eb-8ff2-01e29a07e1e1/disk#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.254 2 DEBUG nova.network.neutron [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updated VIF entry in instance network info cache for port d02e7451-3ef3-44f1-b34a-5c7cbee26989. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.255 2 DEBUG nova.network.neutron [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.275 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.276 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12965MB free_disk=111.04489135742188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.277 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.277 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.433 2 DEBUG oslo_concurrency.lockutils [req-14fd4827-ef86-40d8-a2bf-e16396ee08de req-92cc4519-9a75-4d44-8854-56602f3e8f8e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.444 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Migration for instance df287684-9151-42eb-8ff2-01e29a07e1e1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.512 2 INFO nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating resource usage from migration d3745443-cfe6-4122-9352-901845131915#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.513 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Starting to track outgoing migration d3745443-cfe6-4122-9352-901845131915 with flavor 22222222-2222-2222-2222-222222222222 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.550 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 13616378-6f23-42e4-8c8c-5182a5056326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.550 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Migration d3745443-cfe6-4122-9352-901845131915 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1024, 'DISK_GB': 10}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.551 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.551 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.622 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.707 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.815 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:48:06 np0005476733 nova_compute[192580]: 2025-10-08 15:48:06.815 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.499 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938472.4971488, df287684-9151-42eb-8ff2-01e29a07e1e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.500 2 INFO nova.compute.manager [-] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.628 2 DEBUG nova.compute.manager [None req-7413c7a6-fc47-4a55-bfad-043c1d781f43 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.633 2 DEBUG nova.compute.manager [None req-7413c7a6-fc47-4a55-bfad-043c1d781f43 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_migrated, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.816 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.816 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:07 np0005476733 nova_compute[192580]: 2025-10-08 15:48:07.899 2 INFO nova.compute.manager [None req-7413c7a6-fc47-4a55-bfad-043c1d781f43 - - - - - -] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  8 11:48:10 np0005476733 podman[241600]: 2025-10-08 15:48:10.236797454 +0000 UTC m=+0.058097558 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:48:10 np0005476733 podman[241599]: 2025-10-08 15:48:10.260830083 +0000 UTC m=+0.087247701 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.970 2 DEBUG nova.compute.manager [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.971 2 DEBUG oslo_concurrency.lockutils [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.971 2 DEBUG oslo_concurrency.lockutils [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.971 2 DEBUG oslo_concurrency.lockutils [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.971 2 DEBUG nova.compute.manager [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:48:10 np0005476733 nova_compute[192580]: 2025-10-08 15:48:10.972 2 WARNING nova.compute.manager [req-26452110-d4c3-487c-9296-c56f53724f16 req-396ea85c-25c3-4b6f-b433-1682679eaf13 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.397 2 DEBUG nova.compute.manager [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.398 2 DEBUG oslo_concurrency.lockutils [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.398 2 DEBUG oslo_concurrency.lockutils [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.398 2 DEBUG oslo_concurrency.lockutils [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.398 2 DEBUG nova.compute.manager [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.399 2 WARNING nova.compute.manager [req-e88e8718-d260-4f8c-b2bd-70610db0dbdf req-d995a6fe-e588-48f2-a9e9-dcb17cf2b3bf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.982 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.984 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:11 np0005476733 nova_compute[192580]: 2025-10-08 15:48:11.984 2 DEBUG nova.compute.manager [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.830 2 DEBUG neutronclient.v2_0.client [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 8a48fdf3-2293-49fb-81c8-b558651c0274 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.994 2 DEBUG neutronclient.v2_0.client [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d02e7451-3ef3-44f1-b34a-5c7cbee26989 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.995 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.995 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.995 2 DEBUG nova.network.neutron [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:48:12 np0005476733 nova_compute[192580]: 2025-10-08 15:48:12.995 2 DEBUG nova.objects.instance [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'info_cache' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.162 2 DEBUG nova.compute.manager [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.163 2 DEBUG oslo_concurrency.lockutils [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.163 2 DEBUG oslo_concurrency.lockutils [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.163 2 DEBUG oslo_concurrency.lockutils [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.163 2 DEBUG nova.compute.manager [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.163 2 WARNING nova.compute.manager [req-b4f37557-ca30-4441-a237-868d21f0cf92 req-912b866b-c939-4448-8c82-7d9ade489341 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-d02e7451-3ef3-44f1-b34a-5c7cbee26989 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.554 2 DEBUG nova.compute.manager [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.555 2 DEBUG oslo_concurrency.lockutils [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.556 2 DEBUG oslo_concurrency.lockutils [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.556 2 DEBUG oslo_concurrency.lockutils [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.557 2 DEBUG nova.compute.manager [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] No waiting events found dispatching network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:48:13 np0005476733 nova_compute[192580]: 2025-10-08 15:48:13.558 2 WARNING nova.compute.manager [req-a68dc5e1-a9c8-4b41-87f4-cd53930b80fe req-0455f9d0-6e8e-4726-9841-de364aabf79b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Received unexpected event network-vif-plugged-8a48fdf3-2293-49fb-81c8-b558651c0274 for instance with vm_state resized and task_state None.#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.168 2 DEBUG nova.network.neutron [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: df287684-9151-42eb-8ff2-01e29a07e1e1] Updating instance_info_cache with network_info: [{"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.566 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-df287684-9151-42eb-8ff2-01e29a07e1e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.566 2 DEBUG nova.objects.instance [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'migration_context' on Instance uuid df287684-9151-42eb-8ff2-01e29a07e1e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.631 2 DEBUG nova.virt.libvirt.vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:48:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:48:10Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": 
"tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.632 2 DEBUG nova.network.os_vif_util [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "8a48fdf3-2293-49fb-81c8-b558651c0274", "address": "fa:16:3e:36:af:68", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a48fdf3-22", "ovs_interfaceid": "8a48fdf3-2293-49fb-81c8-b558651c0274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.633 2 DEBUG nova.network.os_vif_util [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.633 2 DEBUG os_vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.635 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a48fdf3-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.636 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.638 2 INFO os_vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:af:68,bridge_name='br-int',has_traffic_filtering=True,id=8a48fdf3-2293-49fb-81c8-b558651c0274,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a48fdf3-22')#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.639 2 DEBUG nova.virt.libvirt.vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1645830914',display_name='tempest-test_qos_after_cold_migration-1645830914',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1645830914',id=71,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:48:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-riz678gr',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',old_vm_state='active',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:48:10Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=df287684-9151-42eb-8ff2-01e29a07e1e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.639 2 DEBUG nova.network.os_vif_util [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "address": "fa:16:3e:f7:81:a1", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e7451-3e", "ovs_interfaceid": "d02e7451-3ef3-44f1-b34a-5c7cbee26989", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.640 2 DEBUG nova.network.os_vif_util [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.640 2 DEBUG os_vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd02e7451-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.643 2 INFO os_vif [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:81:a1,bridge_name='br-int',has_traffic_filtering=True,id=d02e7451-3ef3-44f1-b34a-5c7cbee26989,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd02e7451-3e')#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.644 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.645 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.884 2 DEBUG nova.compute.provider_tree [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:48:15 np0005476733 nova_compute[192580]: 2025-10-08 15:48:15.916 2 DEBUG nova.scheduler.client.report [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:48:16 np0005476733 nova_compute[192580]: 2025-10-08 15:48:16.128 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:16 np0005476733 nova_compute[192580]: 2025-10-08 15:48:16.680 2 INFO nova.scheduler.client.report [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Deleted allocation for migration d3745443-cfe6-4122-9352-901845131915#033[00m
Oct  8 11:48:16 np0005476733 nova_compute[192580]: 2025-10-08 15:48:16.964 2 DEBUG oslo_concurrency.lockutils [None req-3dd8c8e3-42e4-4849-b0a2-42d671ade854 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "df287684-9151-42eb-8ff2-01e29a07e1e1" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:17 np0005476733 nova_compute[192580]: 2025-10-08 15:48:17.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:18 np0005476733 nova_compute[192580]: 2025-10-08 15:48:18.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:19 np0005476733 podman[241647]: 2025-10-08 15:48:19.234967325 +0000 UTC m=+0.061380834 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:48:19 np0005476733 podman[241648]: 2025-10-08 15:48:19.237197906 +0000 UTC m=+0.062697516 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64)
Oct  8 11:48:19 np0005476733 podman[241646]: 2025-10-08 15:48:19.266994449 +0000 UTC m=+0.092168559 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:48:20 np0005476733 nova_compute[192580]: 2025-10-08 15:48:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:20 np0005476733 nova_compute[192580]: 2025-10-08 15:48:20.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:22 np0005476733 nova_compute[192580]: 2025-10-08 15:48:22.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:48:25Z|00605|pinctrl|WARN|Dropped 759 log messages in last 60 seconds (most recently, 14 seconds ago) due to excessive rate
Oct  8 11:48:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:48:25Z|00606|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:48:25 np0005476733 nova_compute[192580]: 2025-10-08 15:48:25.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:26 np0005476733 podman[241725]: 2025-10-08 15:48:26.236396999 +0000 UTC m=+0.052280792 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:48:26 np0005476733 podman[241724]: 2025-10-08 15:48:26.243422305 +0000 UTC m=+0.063477382 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:48:26.336 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:48:26.336 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:48:26.337 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:48:27 np0005476733 nova_compute[192580]: 2025-10-08 15:48:27.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:30 np0005476733 nova_compute[192580]: 2025-10-08 15:48:30.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:32 np0005476733 nova_compute[192580]: 2025-10-08 15:48:32.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:35 np0005476733 nova_compute[192580]: 2025-10-08 15:48:35.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.014 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '13616378-6f23-42e4-8c8c-5182a5056326', 'name': 'tempest-test_qos_after_cold_migration-1644920499', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.039 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.requests volume: 11470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.040 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.requests volume: 50 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '362ad9f2-35e6-45af-a697-4007e5caba46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11470, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.015006', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '40418ad6-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '062c75de2bb786291765ee0a30070e17b3ff22c1732739f04af6cd907b6a9aa4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 50, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.015006', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4041967a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': 'f1b6396dd112690c41416755ed78436a8d3c57cbc529721ad3e25d2b4c182497'}]}, 'timestamp': '2025-10-08 15:48:36.040796', '_unique_id': '7ec39e1cedac4101b378bddf4f21126e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.041 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.048 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 13616378-6f23-42e4-8c8c-5182a5056326 / tap4e3afd85-f9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.048 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 13616378-6f23-42e4-8c8c-5182a5056326 / tap6771bc83-98 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.049 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.049 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e869482-273a-4ba1-b46b-0fd6bec5aadc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.042823', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '4042e958-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '6160a89ed58a426f234c842bcbabc41fb0589911e9a1053d72a94b344b458925'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 15, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.042823', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '4042f466-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'fcdae42283560692cc1f3e0de056dc1e48fbaca354b95e7a282ff9688e6cf982'}]}, 'timestamp': '2025-10-08 15:48:36.049742', '_unique_id': '450391dec8b54a88bea289a3e140ca72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.050 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.051 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.070 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/cpu volume: 28310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '857f4c51-2821-46ab-bf42-9878edd78279', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28310000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'timestamp': '2025-10-08T15:48:36.051489', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4046200a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.792869936, 'message_signature': '46a0160e9dc6d362b892e215b8c4094d4fcd3d73a018c47e2538f0547640fa3b'}]}, 'timestamp': '2025-10-08 15:48:36.070637', '_unique_id': '1796ef837c1f4a5da6c3a17fe3aa8c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.085 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.allocation volume: 154931200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.085 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c936a47d-4992-4c36-803c-9fb446700fa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 154931200, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.072424', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '40487148-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '1c9e3b5f1e4060d577db75d078d2f3dfb3828157af9bedd8e4f8cd750ed9e768'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.072424', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '40488016-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '24900f48cf6c4b41c569e6188fe6a57edad88cb0f4864577aab6597484aab588'}]}, 'timestamp': '2025-10-08 15:48:36.086144', '_unique_id': 'd112483f7e9a4bcb84197273c46bf0bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.088 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.bytes volume: 5733888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.088 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8904d410-45dc-4163-8456-8eff1f78f346', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5733888, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.088605', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4048edee-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '9664842938383dff98cde1039b5cc3b7d00abd127440b8f8fc7d81fea0a20bf9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.088605', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4048f8de-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': 'cbc0d912b974c5d869b98ae869892b0deb4bd11cd12e08a89b23a898f66840d2'}]}, 'timestamp': '2025-10-08 15:48:36.089221', '_unique_id': 'efdbc42f80084c448e6d131a77195422'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.090 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.091 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9e5e9a3-457e-4081-ae3a-1c81902ba04b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.090783', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '40494302-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '23fd1f1d3e8061d307f30e6b61bd9e4646f094ca24ef946e430d0d03a8276c76'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.090783', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '40494fdc-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'eb67e05c7116ccdb1e79b896c0aff25a054bd90c8964b4b2af9fada0519e4c25'}]}, 'timestamp': '2025-10-08 15:48:36.091441', '_unique_id': 'b65b6c49e2024df58a34b59039932285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.092 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.093 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbdaf944-0108-460f-be93-1bec670ee175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.092964', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404998ca-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '001d3113dcd7d14e482bccd07a7f3ccb3603e1cbbf103c0e73d84072e6c7ea27'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.092964', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '4049a446-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '708cc70475f0ec762d838513bf52242ea7a93e769959d3a1025dcdb290ba8f0c'}]}, 'timestamp': '2025-10-08 15:48:36.093610', '_unique_id': '3516556ea64a42b886637629e65f62a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.095 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.bytes volume: 253569024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.095 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.bytes volume: 162020 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc4c2d93-ce29-4f58-80e6-cefe985c8f0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 253569024, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.095228', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4049f0a4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '1ed7d303fdad63a556446e7b87472f382376f87d3ddf75574d80dd3c4fc4018c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 162020, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.095228', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4049fc66-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '3a239ab7f389912eac3a1f62104081ca4a3dc6d0cb4b384f1db3e13c96296f0e'}]}, 'timestamp': '2025-10-08 15:48:36.095840', '_unique_id': 'c5bd302ec53f4db098ff2f82e1db2e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.096 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.097 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.097 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a5def85-9b55-4f0e-b8c2-7d8fdbf9d831', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.097473', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404a4838-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'a061bcc45b0bee0e39ae49e18ebc3b9bc39bd0ca0a42c52065c34ed187f02214'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.097473', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404a54ae-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '2bd23db04dd373968dbf67a642d2fb1daf375de6b64dfb3eff5d566de0292ed8'}]}, 'timestamp': '2025-10-08 15:48:36.098144', '_unique_id': 'dd243ef96e5c43798903deea494c7cc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.099 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.bytes volume: 8327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.099 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.bytes volume: 1256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8c95fed-dfd2-4963-91df-700a13486d2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8327, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.099504', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404a96c6-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '46fed3d3d9b32671bd101072e606504d724aacf4b514ca7cb8e03cb4c8864375'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1256, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.099504', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404aa1c0-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'f42dc80ceb52e64a7f55c596008a2a30e5086422bd955723cc7d99347efedf70'}]}, 'timestamp': '2025-10-08 15:48:36.100056', '_unique_id': '23a25b644122400d929d1790e408e63f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.101 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.bytes volume: 5736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.101 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.bytes volume: 1449 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b4252df-9c5b-44c3-a927-d365cfc53cd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5736, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.101307', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404add52-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'afefda667d27ebc0c1aa30f2e85fde5d29b88dd15fbb10bda0ccb2e29286890e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1449, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.101307', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404ae7fc-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '26519b3a04fcd2a56aacef385742f66822da14d61f6eed91ff9b33925d369b66'}]}, 'timestamp': '2025-10-08 15:48:36.101844', '_unique_id': 'bde2bcde10ff470996eac3f46b8019f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.103 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.103 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>]
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.103 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.103 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7458798-56b0-4010-b9c9-8c7f26a63cff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.103514', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404b33c4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '1bd50108ba2ff2988a0f4635a7c7a1e0d602aae4092b0e75b09a2579aa2bac11'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.103514', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404b3cde-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'ce98f84dac2c94c76541458c001956af47c4e9bae43a429a9ccddec9aac3c534'}]}, 'timestamp': '2025-10-08 15:48:36.104036', '_unique_id': '824aca98854c440c933270ea3f5e571f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>]
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>]
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.106 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.requests volume: 136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.106 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6ef1f83-b5da-498f-9a73-a3c5bdad5a4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 136, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.106052', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '404b97a6-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '143b68a09b207f7fc36ec79eec0adcb5857f1cf1a0438a5e15f03b0e7522d354'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.106052', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '404ba0d4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': 'cfdc7ef0e75191334e86028c4041d587b7d14a1acb4a570c000a5c103bc832b1'}]}, 'timestamp': '2025-10-08 15:48:36.106564', '_unique_id': '9ef78d2d368e401dbc4658700adca151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.107 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.latency volume: 6288971154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.read.latency volume: 20649112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2ee5e22-a12f-4031-9ee8-6ea901a62973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6288971154, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.107728', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '404bd7de-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '90498b40ea98434bc38d9f7760b88af9973a0b3855734b0dcd8347af04843fb3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20649112, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.107728', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '404be31e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': 'aa0ab30a145822335fe5c39877f6c52bbdcd57ed94ba505507a239b98effd05e'}]}, 'timestamp': '2025-10-08 15:48:36.108273', '_unique_id': 'e0ddf5787747474ea28289fd6fc439a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.109 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.latency volume: 248067786 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.109 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab5f4ac2-594f-42d9-aa39-b70961b8ae0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 248067786, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.109555', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '404c1f28-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '42e5970a1bf21d75cebaa11d5d88ab5a20f762f7520c97d275c7c801f650c0a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.109555', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '404c295a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.737965249, 'message_signature': '99f86e1850437d7503014caa2cb9ac7f83f8668da6a3da8aec02b50fe3b7a5a0'}]}, 'timestamp': '2025-10-08 15:48:36.110061', '_unique_id': '2ebb4ad3469a4061a7c6c61171fef1ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_cold_migration-1644920499>]
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.usage volume: 154927104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.111 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '891a610a-fc33-4d1b-88ca-3a9f3121a8f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 154927104, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.111734', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '404c74b4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '46ef2ac605d48f09fafc906d4aea1fed8cb27f123cd859c04404da0cc6459aa2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.111734', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '404c7e1e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '253a0e069be46c2f115dfdcc0c8b7e3dd1776920d74104cef45da0801ae729b5'}]}, 'timestamp': '2025-10-08 15:48:36.112258', '_unique_id': 'd7701f056732457bb2e8777e7cabb69a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.113 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.113 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '440724b0-9d2a-48fd-86cb-2fa7e85a62a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.113425', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404cb69a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '153bcb7790f49975a85a75eeb467a490781407407ba34609fde020b22a48d29e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 14, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.113425', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404cc126-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': 'a64e52750c37de3573d353153d8312e40a4937ed74ed7d475da3a1e43be73426'}]}, 'timestamp': '2025-10-08 15:48:36.113955', '_unique_id': 'c709184b8b2c4542a01d98d518346265'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/memory.usage volume: 225.1796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecd5998f-23b8-49cc-a1cb-548680df04fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 225.1796875, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'timestamp': '2025-10-08T15:48:36.115159', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '404cfa88-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.792869936, 'message_signature': '1b2ee85bd2a33b4c49339d3dd94557bd0b701b3388269b95c42bcb790460e86f'}]}, 'timestamp': '2025-10-08 15:48:36.115450', '_unique_id': '3f86ff85bf9a43479294aa84db3cf4be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.116 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.116 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45769f80-0b20-46ca-bc3c-fea617fd4752', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.116562', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404d3110-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '7d2764b1f94c07c0220dfc325d255c9e20d32393451a11c6b67a10d5be9e2ae0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.116562', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404d3b88-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '7ce0a33f68839a8d89a80b8a17dbdac30c52bfe855b6e32fe96f5d1b23c1f6c8'}]}, 'timestamp': '2025-10-08 15:48:36.117106', '_unique_id': '9b08b7c682014ee6b9dc8bd0a171d7da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.118 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.118 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b86ecad7-1b53-4fb5-b9e5-8d974cb37043', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap4e3afd85-f9', 'timestamp': '2025-10-08T15:48:36.118234', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap4e3afd85-f9', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e8:4d:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4e3afd85-f9'}, 'message_id': '404d7260-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '4a3c12bbcf73c8ad0ce8285c1a58b91d8a34757deec8ba3300dfe91cc20d1c7e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000046-13616378-6f23-42e4-8c8c-5182a5056326-tap6771bc83-98', 'timestamp': '2025-10-08T15:48:36.118234', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'tap6771bc83-98', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b8:d2:73', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6771bc83-98'}, 'message_id': '404d7da0-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.76581407, 'message_signature': '6097b46b5b09420790ec21737a399adc52fdb124e668ced9476aadf97c8b9d0d'}]}, 'timestamp': '2025-10-08 15:48:36.118794', '_unique_id': '529322b404f44f9c84c7094a420e1d68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.119 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 DEBUG ceilometer.compute.pollsters [-] 13616378-6f23-42e4-8c8c-5182a5056326/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '849ff1ed-d6c3-4609-8a26-2ed99ecafa12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-vda', 'timestamp': '2025-10-08T15:48:36.119918', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '404db40a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '113c788985dea584c6a542848a55c6dd72598a1d144f584cd157d51a7d376e25'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '13616378-6f23-42e4-8c8c-5182a5056326-sda', 'timestamp': '2025-10-08T15:48:36.119918', 'resource_metadata': {'display_name': 'tempest-test_qos_after_cold_migration-1644920499', 'name': 'instance-00000046', 'instance_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '404dbf2c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5419.795400437, 'message_signature': '2355b3f7ef84a97628103de68f15165f5a3e8fbc52b5ac0da6f0bb2f5499f308'}]}, 'timestamp': '2025-10-08 15:48:36.120449', '_unique_id': '03d50ba241584790ad55acb3c06e7a40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:48:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:48:36 np0005476733 podman[241767]: 2025-10-08 15:48:36.223453575 +0000 UTC m=+0.054005009 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  8 11:48:37 np0005476733 nova_compute[192580]: 2025-10-08 15:48:37.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:48:40 np0005476733 nova_compute[192580]: 2025-10-08 15:48:40.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:48:41 np0005476733 podman[241788]: 2025-10-08 15:48:41.246256143 +0000 UTC m=+0.069644928 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:48:41 np0005476733 podman[241787]: 2025-10-08 15:48:41.279220937 +0000 UTC m=+0.104923927 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:48:42 np0005476733 nova_compute[192580]: 2025-10-08 15:48:42.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:48:45 np0005476733 nova_compute[192580]: 2025-10-08 15:48:45.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:48:47 np0005476733 nova_compute[192580]: 2025-10-08 15:48:47.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:48:50 np0005476733 podman[241833]: 2025-10-08 15:48:50.232530502 +0000 UTC m=+0.054814414 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:48:50 np0005476733 podman[241834]: 2025-10-08 15:48:50.242924864 +0000 UTC m=+0.062522740 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:48:50 np0005476733 podman[241832]: 2025-10-08 15:48:50.253232514 +0000 UTC m=+0.079241245 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, managed_by=edpm_ansible)
Oct  8 11:48:50 np0005476733 nova_compute[192580]: 2025-10-08 15:48:50.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:52 np0005476733 nova_compute[192580]: 2025-10-08 15:48:52.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:55 np0005476733 nova_compute[192580]: 2025-10-08 15:48:55.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:56 np0005476733 nova_compute[192580]: 2025-10-08 15:48:56.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:48:57 np0005476733 podman[241901]: 2025-10-08 15:48:57.241444358 +0000 UTC m=+0.065206036 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:48:57 np0005476733 podman[241900]: 2025-10-08 15:48:57.249583868 +0000 UTC m=+0.071158626 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:48:57 np0005476733 nova_compute[192580]: 2025-10-08 15:48:57.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:48:58 np0005476733 nova_compute[192580]: 2025-10-08 15:48:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:00 np0005476733 nova_compute[192580]: 2025-10-08 15:49:00.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:02 np0005476733 nova_compute[192580]: 2025-10-08 15:49:02.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:03 np0005476733 nova_compute[192580]: 2025-10-08 15:49:03.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.777 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.778 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.778 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:49:04 np0005476733 nova_compute[192580]: 2025-10-08 15:49:04.778 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:49:05 np0005476733 nova_compute[192580]: 2025-10-08 15:49:05.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:07 np0005476733 podman[241941]: 2025-10-08 15:49:07.289152173 +0000 UTC m=+0.107410225 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.481 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [{"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.579 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-13616378-6f23-42e4-8c8c-5182a5056326" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.580 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.653 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.654 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.654 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.654 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.813 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.902 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.903 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:07 np0005476733 nova_compute[192580]: 2025-10-08 15:49:07.969 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.177 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.179 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12970MB free_disk=111.18806076049805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.179 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.180 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.385 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 13616378-6f23-42e4-8c8c-5182a5056326 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.386 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.387 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.440 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.461 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.462 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:49:08 np0005476733 nova_compute[192580]: 2025-10-08 15:49:08.463 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:09 np0005476733 nova_compute[192580]: 2025-10-08 15:49:09.462 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:10 np0005476733 nova_compute[192580]: 2025-10-08 15:49:10.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:12 np0005476733 podman[241969]: 2025-10-08 15:49:12.248447611 +0000 UTC m=+0.055887968 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  8 11:49:12 np0005476733 podman[241968]: 2025-10-08 15:49:12.277640864 +0000 UTC m=+0.098877342 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct  8 11:49:12 np0005476733 nova_compute[192580]: 2025-10-08 15:49:12.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:13.783 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:49:13 np0005476733 nova_compute[192580]: 2025-10-08 15:49:13.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:13.784 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:49:15 np0005476733 nova_compute[192580]: 2025-10-08 15:49:15.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:16.787 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:17 np0005476733 nova_compute[192580]: 2025-10-08 15:49:17.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:20Z|00607|pinctrl|WARN|Dropped 499 log messages in last 55 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:49:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:20Z|00608|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:49:20 np0005476733 nova_compute[192580]: 2025-10-08 15:49:20.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:21 np0005476733 podman[242016]: 2025-10-08 15:49:21.230558338 +0000 UTC m=+0.054831114 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:49:21 np0005476733 podman[242015]: 2025-10-08 15:49:21.23687137 +0000 UTC m=+0.065945080 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:49:21 np0005476733 podman[242017]: 2025-10-08 15:49:21.265277138 +0000 UTC m=+0.085546896 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=)
Oct  8 11:49:22 np0005476733 nova_compute[192580]: 2025-10-08 15:49:22.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:22 np0005476733 nova_compute[192580]: 2025-10-08 15:49:22.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.810 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.811 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.811 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.812 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.812 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.814 2 INFO nova.compute.manager [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Terminating instance#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.816 2 DEBUG nova.compute.manager [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:49:25 np0005476733 kernel: tap4e3afd85-f9 (unregistering): left promiscuous mode
Oct  8 11:49:25 np0005476733 NetworkManager[51699]: <info>  [1759938565.8536] device (tap4e3afd85-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00609|binding|INFO|Releasing lport 4e3afd85-f9b7-45ee-b86c-5db61eaec58c from this chassis (sb_readonly=0)
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00610|binding|INFO|Setting lport 4e3afd85-f9b7-45ee-b86c-5db61eaec58c down in Southbound
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00611|binding|INFO|Removing iface tap4e3afd85-f9 ovn-installed in OVS
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.888 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:4d:2e 192.168.7.32'], port_security=['fa:16:3e:e8:4d:2e 192.168.7.32'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.7.32/24', 'neutron:device_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac5383ee-65ae-4340-bb19-495c4991fae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c049a133-6546-4ba7-90ec-ddcac9cb5060, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4e3afd85-f9b7-45ee-b86c-5db61eaec58c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.889 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4e3afd85-f9b7-45ee-b86c-5db61eaec58c in datapath ac5383ee-65ae-4340-bb19-495c4991fae8 unbound from our chassis#033[00m
Oct  8 11:49:25 np0005476733 kernel: tap6771bc83-98 (unregistering): left promiscuous mode
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.891 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac5383ee-65ae-4340-bb19-495c4991fae8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.893 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[47911532-06c3-4ded-9c20-ce3d31035242]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.893 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8 namespace which is not needed anymore#033[00m
Oct  8 11:49:25 np0005476733 NetworkManager[51699]: <info>  [1759938565.8952] device (tap6771bc83-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00612|binding|INFO|Releasing lport 6771bc83-98b8-4b06-8442-9bb11777cdc6 from this chassis (sb_readonly=0)
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00613|binding|INFO|Setting lport 6771bc83-98b8-4b06-8442-9bb11777cdc6 down in Southbound
Oct  8 11:49:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:25Z|00614|binding|INFO|Removing iface tap6771bc83-98 ovn-installed in OVS
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 nova_compute[192580]: 2025-10-08 15:49:25.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:25.931 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:d2:73 10.100.0.5'], port_security=['fa:16:3e:b8:d2:73 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '13616378-6f23-42e4-8c8c-5182a5056326', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6771bc83-98b8-4b06-8442-9bb11777cdc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:49:25 np0005476733 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000046.scope: Deactivated successfully.
Oct  8 11:49:25 np0005476733 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000046.scope: Consumed 34.356s CPU time.
Oct  8 11:49:25 np0005476733 systemd-machined[152624]: Machine qemu-41-instance-00000046 terminated.
Oct  8 11:49:26 np0005476733 NetworkManager[51699]: <info>  [1759938566.0418] manager: (tap4e3afd85-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [NOTICE]   (240042) : haproxy version is 2.8.14-c23fe91
Oct  8 11:49:26 np0005476733 NetworkManager[51699]: <info>  [1759938566.0592] manager: (tap6771bc83-98): new Tun device (/org/freedesktop/NetworkManager/Devices/202)
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [NOTICE]   (240042) : path to executable is /usr/sbin/haproxy
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [WARNING]  (240042) : Exiting Master process...
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [WARNING]  (240042) : Exiting Master process...
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [ALERT]    (240042) : Current worker (240048) exited with code 143 (Terminated)
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8[240006]: [WARNING]  (240042) : All workers exited. Exiting... (0)
Oct  8 11:49:26 np0005476733 systemd[1]: libpod-35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009.scope: Deactivated successfully.
Oct  8 11:49:26 np0005476733 podman[242107]: 2025-10-08 15:49:26.069604921 +0000 UTC m=+0.069230265 container died 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:49:26 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009-userdata-shm.mount: Deactivated successfully.
Oct  8 11:49:26 np0005476733 systemd[1]: var-lib-containers-storage-overlay-b45531ab907f45b2507671188323b74c524d8b8ac2f49804444785998cec1d00-merged.mount: Deactivated successfully.
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.124 2 INFO nova.virt.libvirt.driver [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Instance destroyed successfully.#033[00m
Oct  8 11:49:26 np0005476733 podman[242107]: 2025-10-08 15:49:26.126625543 +0000 UTC m=+0.126250867 container cleanup 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.126 2 DEBUG nova.objects.instance [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid 13616378-6f23-42e4-8c8c-5182a5056326 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:49:26 np0005476733 systemd[1]: libpod-conmon-35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009.scope: Deactivated successfully.
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.152 2 DEBUG nova.virt.libvirt.vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:47:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:47:09Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": 
[{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.154 2 DEBUG nova.network.os_vif_util [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "address": "fa:16:3e:e8:4d:2e", "network": {"id": "ac5383ee-65ae-4340-bb19-495c4991fae8", "bridge": "br-int", "label": "tempest-test-network--238655981", "subnets": [{"cidr": "192.168.7.0/24", "dns": [], "gateway": {"address": "192.168.7.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.7.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4e3afd85-f9", "ovs_interfaceid": "4e3afd85-f9b7-45ee-b86c-5db61eaec58c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.155 2 DEBUG nova.network.os_vif_util [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.155 2 DEBUG os_vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4e3afd85-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.167 2 INFO os_vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:4d:2e,bridge_name='br-int',has_traffic_filtering=True,id=4e3afd85-f9b7-45ee-b86c-5db61eaec58c,network=Network(ac5383ee-65ae-4340-bb19-495c4991fae8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4e3afd85-f9')#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.168 2 DEBUG nova.virt.libvirt.vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:43:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_cold_migration-1644920499',display_name='tempest-test_qos_after_cold_migration-1644920499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-cold-migration-1644920499',id=70,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:47:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-sqgstxg9',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:47:09Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=13616378-6f23-42e4-8c8c-5182a5056326,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.169 2 DEBUG nova.network.os_vif_util [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "address": "fa:16:3e:b8:d2:73", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6771bc83-98", "ovs_interfaceid": "6771bc83-98b8-4b06-8442-9bb11777cdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.170 2 DEBUG nova.network.os_vif_util [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.170 2 DEBUG os_vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6771bc83-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.178 2 INFO os_vif [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d2:73,bridge_name='br-int',has_traffic_filtering=True,id=6771bc83-98b8-4b06-8442-9bb11777cdc6,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6771bc83-98')#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.179 2 INFO nova.virt.libvirt.driver [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Deleting instance files /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326_del#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.210 2 INFO nova.virt.libvirt.driver [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Deletion of /var/lib/nova/instances/13616378-6f23-42e4-8c8c-5182a5056326_del complete#033[00m
Oct  8 11:49:26 np0005476733 podman[242165]: 2025-10-08 15:49:26.218636937 +0000 UTC m=+0.056236090 container remove 35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.224 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b40e22dc-1c1b-4638-8a71-55575a428c26]: (4, ('Wed Oct  8 03:49:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8 (35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009)\n35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009\nWed Oct  8 03:49:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8 (35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009)\n35c8da4bb9d8d094ba0c24c795a07339716945b68fd7bb7d0f7f9c5e15826009\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.227 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6f96a7-c342-4176-9107-d5f76bbd0fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.228 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac5383ee-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 kernel: tapac5383ee-60: left promiscuous mode
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.245 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7e452020-0be1-41fe-b7b1-5ca2f67401d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.274 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[202071a9-f526-45a9-bf13-a135208d954f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.276 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fe800fae-3630-4b64-bedb-ad5ed392d5ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.292 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c308c1c4-7bd7-4d8d-89b0-80c0380f234e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517651, 'reachable_time': 29527, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242182, 'error': None, 'target': 'ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 systemd[1]: run-netns-ovnmeta\x2dac5383ee\x2d65ae\x2d4340\x2dbb19\x2d495c4991fae8.mount: Deactivated successfully.
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.295 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac5383ee-65ae-4340-bb19-495c4991fae8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.296 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[72dbbcb2-8bbc-4896-9df9-cfff025f4ca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.297 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6771bc83-98b8-4b06-8442-9bb11777cdc6 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.299 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.300 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86161a6c-c553-44e7-b768-1d315352e242]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.301 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.336 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.337 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.337 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.355 2 INFO nova.compute.manager [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.356 2 DEBUG oslo.service.loopingcall [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.356 2 DEBUG nova.compute.manager [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.357 2 DEBUG nova.network.neutron [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [NOTICE]   (240546) : haproxy version is 2.8.14-c23fe91
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [NOTICE]   (240546) : path to executable is /usr/sbin/haproxy
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [WARNING]  (240546) : Exiting Master process...
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [ALERT]    (240546) : Current worker (240548) exited with code 143 (Terminated)
Oct  8 11:49:26 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[240542]: [WARNING]  (240546) : All workers exited. Exiting... (0)
Oct  8 11:49:26 np0005476733 systemd[1]: libpod-6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090.scope: Deactivated successfully.
Oct  8 11:49:26 np0005476733 podman[242200]: 2025-10-08 15:49:26.44421731 +0000 UTC m=+0.047587662 container died 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:49:26 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090-userdata-shm.mount: Deactivated successfully.
Oct  8 11:49:26 np0005476733 systemd[1]: var-lib-containers-storage-overlay-c508b8d4fcbc9d211d0dc467a961eb4c7929129ec4ff4fe7c0a5648e5b26655e-merged.mount: Deactivated successfully.
Oct  8 11:49:26 np0005476733 podman[242200]: 2025-10-08 15:49:26.481280916 +0000 UTC m=+0.084651308 container cleanup 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 11:49:26 np0005476733 systemd[1]: libpod-conmon-6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090.scope: Deactivated successfully.
Oct  8 11:49:26 np0005476733 podman[242229]: 2025-10-08 15:49:26.556337156 +0000 UTC m=+0.050876788 container remove 6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.565 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9444e1eb-0781-4a83-a1d9-6abca1eaf3b6]: (4, ('Wed Oct  8 03:49:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090)\n6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090\nWed Oct  8 03:49:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090)\n6b81038db517f07e4fe8ca79110a7172454fa218a20be6303b3a036e4c4c2090\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.567 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bf7b8e-fc71-49b9-a6bb-b526ab6a9afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.568 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.592 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[79340635-352c-40c7-a9d2-7c36afb97493]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.625 2 DEBUG nova.compute.manager [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.624 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[15eb5175-323e-4165-a303-d0ae8dec7a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.627 2 DEBUG oslo_concurrency.lockutils [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.628 2 DEBUG oslo_concurrency.lockutils [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.628 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6a71c1-7ba0-4402-b716-2011b42052e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.628 2 DEBUG oslo_concurrency.lockutils [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.629 2 DEBUG nova.compute.manager [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:49:26 np0005476733 nova_compute[192580]: 2025-10-08 15:49:26.629 2 DEBUG nova.compute.manager [req-64b468ea-bc91-4722-88cb-b3687559d201 req-cb66e201-1e01-41b4-a1df-367f23ad1966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.650 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f17d9d-b138-414e-85f8-1098a789546d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 523945, 'reachable_time': 40225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242244, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.652 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:26.653 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4de955-5fe9-44c7-8ecb-2713d88f1eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:27 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:49:28 np0005476733 podman[242246]: 2025-10-08 15:49:28.237969025 +0000 UTC m=+0.061695414 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:49:28 np0005476733 podman[242245]: 2025-10-08 15:49:28.239083001 +0000 UTC m=+0.064296567 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.591 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.593 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.594 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.594 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.595 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.595 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.846 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.847 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.848 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.848 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.849 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.849 2 WARNING nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-4e3afd85-f9b7-45ee-b86c-5db61eaec58c for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.850 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.850 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.851 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.851 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.851 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.852 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-unplugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.852 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.853 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "13616378-6f23-42e4-8c8c-5182a5056326-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.853 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.854 2 DEBUG oslo_concurrency.lockutils [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.854 2 DEBUG nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] No waiting events found dispatching network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.854 2 WARNING nova.compute.manager [req-097b734e-0310-430b-bcf9-453f61385a39 req-52ae6eb7-2807-4b54-ab54-5c29ac2bb141 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received unexpected event network-vif-plugged-6771bc83-98b8-4b06-8442-9bb11777cdc6 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.926 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.927 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Image id 11111111-1111-1111-1111-111111111111 yields fingerprint bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.927 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] image 11111111-1111-1111-1111-111111111111 at (/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e): checking#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.928 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] image 11111111-1111-1111-1111-111111111111 at (/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.932 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.933 2 WARNING nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.933 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Active base files: /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.933 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.934 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.934 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.935 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Oct  8 11:49:28 np0005476733 nova_compute[192580]: 2025-10-08 15:49:28.935 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  8 11:49:29 np0005476733 nova_compute[192580]: 2025-10-08 15:49:29.969 2 DEBUG nova.network.neutron [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.001 2 INFO nova.compute.manager [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Took 3.64 seconds to deallocate network for instance.#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.034 2 DEBUG nova.compute.manager [req-207e28f6-e087-4584-bf82-f325e82da8e7 req-5938a8cc-4866-4418-a3cc-967aae96296b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Received event network-vif-deleted-4e3afd85-f9b7-45ee-b86c-5db61eaec58c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.079 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.080 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.136 2 DEBUG nova.compute.provider_tree [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.164 2 DEBUG nova.scheduler.client.report [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.210 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.241 2 INFO nova.scheduler.client.report [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance 13616378-6f23-42e4-8c8c-5182a5056326#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.364 2 DEBUG oslo_concurrency.lockutils [None req-186f08fd-278f-4a7c-a67b-733e7f657c1e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "13616378-6f23-42e4-8c8c-5182a5056326" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:30 np0005476733 nova_compute[192580]: 2025-10-08 15:49:30.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:31 np0005476733 nova_compute[192580]: 2025-10-08 15:49:31.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:35 np0005476733 nova_compute[192580]: 2025-10-08 15:49:35.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:36 np0005476733 nova_compute[192580]: 2025-10-08 15:49:36.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:38 np0005476733 podman[242286]: 2025-10-08 15:49:38.247132247 +0000 UTC m=+0.066245299 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 11:49:39 np0005476733 nova_compute[192580]: 2025-10-08 15:49:39.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:40 np0005476733 nova_compute[192580]: 2025-10-08 15:49:40.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:41 np0005476733 nova_compute[192580]: 2025-10-08 15:49:41.124 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938566.122062, 13616378-6f23-42e4-8c8c-5182a5056326 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:49:41 np0005476733 nova_compute[192580]: 2025-10-08 15:49:41.124 2 INFO nova.compute.manager [-] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:49:41 np0005476733 nova_compute[192580]: 2025-10-08 15:49:41.156 2 DEBUG nova.compute.manager [None req-aacceda3-3a6a-4146-8606-9ac36ae0ff70 - - - - - -] [instance: 13616378-6f23-42e4-8c8c-5182a5056326] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:49:41 np0005476733 nova_compute[192580]: 2025-10-08 15:49:41.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.247 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.248 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:43 np0005476733 podman[242306]: 2025-10-08 15:49:43.27538455 +0000 UTC m=+0.095501515 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.283 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:49:43 np0005476733 podman[242305]: 2025-10-08 15:49:43.323418447 +0000 UTC m=+0.142109456 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.373 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.373 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.382 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.382 2 INFO nova.compute.claims [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.548 2 DEBUG nova.compute.provider_tree [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.627 2 DEBUG nova.scheduler.client.report [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.698 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.699 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.786 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.787 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.828 2 INFO nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:49:43 np0005476733 nova_compute[192580]: 2025-10-08 15:49:43.904 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.029 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.031 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.032 2 INFO nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Creating image(s)#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.033 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.033 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.034 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.059 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.117 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.119 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.120 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.144 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.205 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.206 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.266 2 DEBUG nova.policy [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.900 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk 10737418240" returned: 0 in 0.694s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.901 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.781s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.902 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.984 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:44 np0005476733 nova_compute[192580]: 2025-10-08 15:49:44.985 2 DEBUG nova.objects.instance [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'migration_context' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.028 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.028 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Ensure instance console log exists: /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.029 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.030 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.030 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:45 np0005476733 nova_compute[192580]: 2025-10-08 15:49:45.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:46 np0005476733 nova_compute[192580]: 2025-10-08 15:49:46.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:46 np0005476733 nova_compute[192580]: 2025-10-08 15:49:46.439 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Successfully created port: b060d65a-9028-402c-8b84-594cba794144 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.208 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Successfully updated port: b060d65a-9028-402c-8b84-594cba794144 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.290 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.290 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.290 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.372 2 DEBUG nova.compute.manager [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.373 2 DEBUG nova.compute.manager [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-b060d65a-9028-402c-8b84-594cba794144. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.373 2 DEBUG oslo_concurrency.lockutils [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:49:47 np0005476733 nova_compute[192580]: 2025-10-08 15:49:47.455 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.457 2 DEBUG nova.network.neutron [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.592 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.592 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance network_info: |[{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.593 2 DEBUG oslo_concurrency.lockutils [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.593 2 DEBUG nova.network.neutron [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port b060d65a-9028-402c-8b84-594cba794144 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.596 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Start _get_guest_xml network_info=[{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.601 2 WARNING nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.607 2 DEBUG nova.virt.libvirt.host [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.608 2 DEBUG nova.virt.libvirt.host [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.613 2 DEBUG nova.virt.libvirt.host [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.616 2 DEBUG nova.virt.libvirt.host [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.617 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.617 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.618 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.618 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.618 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.619 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.619 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.619 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.619 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.620 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.620 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.620 2 DEBUG nova.virt.hardware [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.624 2 DEBUG nova.virt.libvirt.vif [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:49:43Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.625 2 DEBUG nova.network.os_vif_util [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.625 2 DEBUG nova.network.os_vif_util [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.627 2 DEBUG nova.objects.instance [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_devices' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.669 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <uuid>e899db9a-b18d-4036-a523-fe0907dba023</uuid>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <name>instance-00000048</name>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_qos_after_live_migration-234900889</nova:name>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:49:48</nova:creationTime>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        <nova:port uuid="b060d65a-9028-402c-8b84-594cba794144">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.8.197" ipVersion="4"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="serial">e899db9a-b18d-4036-a523-fe0907dba023</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="uuid">e899db9a-b18d-4036-a523-fe0907dba023</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.config"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:82:ce:fa"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <target dev="tapb060d65a-90"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/console.log" append="off"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:49:48 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:49:48 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:49:48 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:49:48 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.670 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Preparing to wait for external event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.671 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.671 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.671 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.672 2 DEBUG nova.virt.libvirt.vif [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:49:43Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.673 2 DEBUG nova.network.os_vif_util [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.673 2 DEBUG nova.network.os_vif_util [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.674 2 DEBUG os_vif [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.675 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.676 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb060d65a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb060d65a-90, col_values=(('external_ids', {'iface-id': 'b060d65a-9028-402c-8b84-594cba794144', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:ce:fa', 'vm-uuid': 'e899db9a-b18d-4036-a523-fe0907dba023'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:48 np0005476733 NetworkManager[51699]: <info>  [1759938588.7288] manager: (tapb060d65a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.735 2 INFO os_vif [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90')#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.815 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.815 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.815 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:82:ce:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:49:48 np0005476733 nova_compute[192580]: 2025-10-08 15:49:48.816 2 INFO nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Using config drive#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.404 2 INFO nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Creating config drive at /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.config#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.409 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf7ylgfjh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.535 2 DEBUG oslo_concurrency.processutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf7ylgfjh" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:49:49 np0005476733 kernel: tapb060d65a-90: entered promiscuous mode
Oct  8 11:49:49 np0005476733 NetworkManager[51699]: <info>  [1759938589.6084] manager: (tapb060d65a-90): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Oct  8 11:49:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:49Z|00615|binding|INFO|Claiming lport b060d65a-9028-402c-8b84-594cba794144 for this chassis.
Oct  8 11:49:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:49Z|00616|binding|INFO|b060d65a-9028-402c-8b84-594cba794144: Claiming fa:16:3e:82:ce:fa 192.168.8.197
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.617 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ce:fa 192.168.8.197'], port_security=['fa:16:3e:82:ce:fa 192.168.8.197'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.197/24', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b060d65a-9028-402c-8b84-594cba794144) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.620 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b060d65a-9028-402c-8b84-594cba794144 in datapath f2fece7d-de46-49dc-874d-3e87e96b491f bound to our chassis#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.623 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2fece7d-de46-49dc-874d-3e87e96b491f#033[00m
Oct  8 11:49:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:49Z|00617|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 ovn-installed in OVS
Oct  8 11:49:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:49Z|00618|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 up in Southbound
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.636 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8a8650-ee2e-4433-93ef-0cb06efca111]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.637 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2fece7d-d1 in ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.639 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2fece7d-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.639 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[638be5bf-ef8b-4747-86d8-868bd28e7cd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.640 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[321fb140-8bb3-418c-9218-618fa29c41f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 systemd-udevd[242382]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:49:49 np0005476733 systemd-machined[152624]: New machine qemu-42-instance-00000048.
Oct  8 11:49:49 np0005476733 NetworkManager[51699]: <info>  [1759938589.6616] device (tapb060d65a-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.660 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee5f94d-7a81-4331-b92e-a7abbcd95814]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 NetworkManager[51699]: <info>  [1759938589.6628] device (tapb060d65a-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:49:49 np0005476733 systemd[1]: Started Virtual Machine qemu-42-instance-00000048.
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.688 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9b90c6-d323-40cf-82cf-84f7b1f04d42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.720 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[050db6fb-062c-49da-ad31-9e5e5ea3435a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 systemd-udevd[242386]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.728 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[61826e57-1ca3-42d3-adff-dc639ccd0b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 NetworkManager[51699]: <info>  [1759938589.7296] manager: (tapf2fece7d-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.769 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ee52ec-9a0e-460f-9594-be2d7d8513db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.772 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[07f9b5df-0ce7-4c9e-8c6e-4bfc4c0c296c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 NetworkManager[51699]: <info>  [1759938589.8090] device (tapf2fece7d-d0): carrier: link connected
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.817 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7780c919-3104-4954-b2c2-48d9b740adb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.838 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[be82e4b5-5d06-4170-b9ed-d7fd679ff90c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549347, 'reachable_time': 20224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242415, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.858 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7f22bb-e364-4c06-ab31-b358f3a9de93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:b4af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549347, 'tstamp': 549347}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242416, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.878 2 DEBUG nova.compute.manager [req-564138a5-918d-4713-9a63-5bf9a335a625 req-bf57bc04-11aa-493c-9bdb-8d55c29819d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.878 2 DEBUG oslo_concurrency.lockutils [req-564138a5-918d-4713-9a63-5bf9a335a625 req-bf57bc04-11aa-493c-9bdb-8d55c29819d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.878 2 DEBUG oslo_concurrency.lockutils [req-564138a5-918d-4713-9a63-5bf9a335a625 req-bf57bc04-11aa-493c-9bdb-8d55c29819d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.879 2 DEBUG oslo_concurrency.lockutils [req-564138a5-918d-4713-9a63-5bf9a335a625 req-bf57bc04-11aa-493c-9bdb-8d55c29819d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:49 np0005476733 nova_compute[192580]: 2025-10-08 15:49:49.879 2 DEBUG nova.compute.manager [req-564138a5-918d-4713-9a63-5bf9a335a625 req-bf57bc04-11aa-493c-9bdb-8d55c29819d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Processing event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.886 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bf98b847-e3cb-4a77-896c-00cb9f0f05b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549347, 'reachable_time': 20224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242417, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:49.933 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6a077d71-2a9d-4634-8107-8330571a3622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.015 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f86237b3-34b3-4f89-a896-951dd3995196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.022 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.023 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.023 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2fece7d-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:50 np0005476733 NetworkManager[51699]: <info>  [1759938590.0311] manager: (tapf2fece7d-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Oct  8 11:49:50 np0005476733 kernel: tapf2fece7d-d0: entered promiscuous mode
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.036 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2fece7d-d0, col_values=(('external_ids', {'iface-id': 'c203ff41-0371-4edb-b491-721a1c14b7ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:49:50 np0005476733 ovn_controller[94857]: 2025-10-08T15:49:50Z|00619|binding|INFO|Releasing lport c203ff41-0371-4edb-b491-721a1c14b7ee from this chassis (sb_readonly=0)
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.039 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.051 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5f53f47d-39cb-4633-aa4e-d1d7082a225c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.054 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:49:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:49:50.055 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'env', 'PROCESS_TAG=haproxy-f2fece7d-de46-49dc-874d-3e87e96b491f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2fece7d-de46-49dc-874d-3e87e96b491f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.287 2 DEBUG nova.network.neutron [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port b060d65a-9028-402c-8b84-594cba794144. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.288 2 DEBUG nova.network.neutron [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.327 2 DEBUG oslo_concurrency.lockutils [req-27fe6ad2-8605-4d29-865d-5e15810ddcf4 req-d247c504-89c6-4555-b8a5-1c53a15517d4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:49:50 np0005476733 podman[242456]: 2025-10-08 15:49:50.48835205 +0000 UTC m=+0.058683798 container create 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  8 11:49:50 np0005476733 systemd[1]: Started libpod-conmon-9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02.scope.
Oct  8 11:49:50 np0005476733 podman[242456]: 2025-10-08 15:49:50.456335515 +0000 UTC m=+0.026667293 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:49:50 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:49:50 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b449656fb64ae5b2242049126f7cd4981689d6b316867b882933c446ece54a35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:49:50 np0005476733 podman[242456]: 2025-10-08 15:49:50.589006508 +0000 UTC m=+0.159338276 container init 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 11:49:50 np0005476733 podman[242456]: 2025-10-08 15:49:50.597626924 +0000 UTC m=+0.167958672 container start 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:49:50 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [NOTICE]   (242475) : New worker (242477) forked
Oct  8 11:49:50 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [NOTICE]   (242475) : Loading success.
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.651 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938590.6507912, e899db9a-b18d-4036-a523-fe0907dba023 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.652 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Started (Lifecycle Event)#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.655 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.660 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.664 2 INFO nova.virt.libvirt.driver [-] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance spawned successfully.#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.665 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.730 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.734 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.800 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.801 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.801 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.802 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.803 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.803 2 DEBUG nova.virt.libvirt.driver [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.869 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.869 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938590.6510022, e899db9a-b18d-4036-a523-fe0907dba023 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.870 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.924 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.929 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938590.659632, e899db9a-b18d-4036-a523-fe0907dba023 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.929 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.967 2 INFO nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Took 6.94 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:49:50 np0005476733 nova_compute[192580]: 2025-10-08 15:49:50.968 2 DEBUG nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.046 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.050 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.212 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.223 2 INFO nova.compute.manager [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Took 7.88 seconds to build instance.#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.303 2 DEBUG oslo_concurrency.lockutils [None req-35920590-7f71-4717-b0f6-0fa3fb49b5ee d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.960 2 DEBUG nova.compute.manager [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.960 2 DEBUG oslo_concurrency.lockutils [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.961 2 DEBUG oslo_concurrency.lockutils [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.961 2 DEBUG oslo_concurrency.lockutils [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.961 2 DEBUG nova.compute.manager [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:49:51 np0005476733 nova_compute[192580]: 2025-10-08 15:49:51.962 2 WARNING nova.compute.manager [req-19102bcb-83fc-4a20-baa7-b8eb62f1dd2b req-00b86c09-d2b4-4a01-a02d-cde43b39ddd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:49:52 np0005476733 podman[242487]: 2025-10-08 15:49:52.066163887 +0000 UTC m=+0.064794223 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:49:52 np0005476733 podman[242486]: 2025-10-08 15:49:52.066362424 +0000 UTC m=+0.068800022 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:49:52 np0005476733 podman[242488]: 2025-10-08 15:49:52.080099583 +0000 UTC m=+0.073386808 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, vcs-type=git)
Oct  8 11:49:52 np0005476733 nova_compute[192580]: 2025-10-08 15:49:52.577 2 INFO nova.compute.manager [None req-1c7d93b6-99a6-4469-bdb6-796edf6fec78 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:49:52 np0005476733 nova_compute[192580]: 2025-10-08 15:49:52.584 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:49:53 np0005476733 nova_compute[192580]: 2025-10-08 15:49:53.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:55 np0005476733 nova_compute[192580]: 2025-10-08 15:49:55.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:57 np0005476733 nova_compute[192580]: 2025-10-08 15:49:57.752 2 INFO nova.compute.manager [None req-0f22b449-85a4-416f-9478-61465362da89 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:49:57 np0005476733 nova_compute[192580]: 2025-10-08 15:49:57.934 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:49:58 np0005476733 nova_compute[192580]: 2025-10-08 15:49:58.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:49:59 np0005476733 podman[242550]: 2025-10-08 15:49:59.224945974 +0000 UTC m=+0.046271691 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:49:59 np0005476733 podman[242549]: 2025-10-08 15:49:59.251975028 +0000 UTC m=+0.076059383 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 11:50:00 np0005476733 nova_compute[192580]: 2025-10-08 15:50:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:00 np0005476733 nova_compute[192580]: 2025-10-08 15:50:00.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:02 np0005476733 nova_compute[192580]: 2025-10-08 15:50:02.917 2 INFO nova.compute.manager [None req-a4ba42ae-e622-4a86-bc02-6b13f12536a1 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:03 np0005476733 nova_compute[192580]: 2025-10-08 15:50:03.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:05 np0005476733 nova_compute[192580]: 2025-10-08 15:50:05.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:05 np0005476733 nova_compute[192580]: 2025-10-08 15:50:05.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:06 np0005476733 nova_compute[192580]: 2025-10-08 15:50:06.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:06 np0005476733 nova_compute[192580]: 2025-10-08 15:50:06.593 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:50:06 np0005476733 nova_compute[192580]: 2025-10-08 15:50:06.594 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:50:07 np0005476733 nova_compute[192580]: 2025-10-08 15:50:07.247 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:50:07 np0005476733 nova_compute[192580]: 2025-10-08 15:50:07.248 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:50:07 np0005476733 nova_compute[192580]: 2025-10-08 15:50:07.248 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:50:07 np0005476733 nova_compute[192580]: 2025-10-08 15:50:07.249 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.104 2 INFO nova.compute.manager [None req-41a2277a-7007-4db4-94e7-467339e5defa d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.744 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.804 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.804 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.805 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.805 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.805 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.877 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.878 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.878 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.878 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:50:08 np0005476733 nova_compute[192580]: 2025-10-08 15:50:08.970 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:50:08 np0005476733 podman[242605]: 2025-10-08 15:50:08.976422776 +0000 UTC m=+0.058394879 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.035 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.036 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.101 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.294 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.298 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13327MB free_disk=111.32923126220703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.299 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.299 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.444 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e899db9a-b18d-4036-a523-fe0907dba023 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.445 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.445 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.567 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.590 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.615 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:50:09 np0005476733 nova_compute[192580]: 2025-10-08 15:50:09.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:50:10 np0005476733 nova_compute[192580]: 2025-10-08 15:50:10.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:11 np0005476733 nova_compute[192580]: 2025-10-08 15:50:11.400 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:11 np0005476733 nova_compute[192580]: 2025-10-08 15:50:11.401 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:13 np0005476733 nova_compute[192580]: 2025-10-08 15:50:13.436 2 INFO nova.compute.manager [None req-9542a091-0776-4f10-b22f-5908b1b249a8 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:13 np0005476733 nova_compute[192580]: 2025-10-08 15:50:13.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:14 np0005476733 podman[242629]: 2025-10-08 15:50:14.250355175 +0000 UTC m=+0.066900310 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:50:14 np0005476733 podman[242628]: 2025-10-08 15:50:14.275676475 +0000 UTC m=+0.096986323 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:50:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:50:14Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:ce:fa 192.168.8.197
Oct  8 11:50:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:50:14Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:ce:fa 192.168.8.197
Oct  8 11:50:15 np0005476733 nova_compute[192580]: 2025-10-08 15:50:15.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:18 np0005476733 nova_compute[192580]: 2025-10-08 15:50:18.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:18 np0005476733 nova_compute[192580]: 2025-10-08 15:50:18.697 2 INFO nova.compute.manager [None req-634b0374-8751-432c-8fbc-7c7739bc517e d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:18 np0005476733 nova_compute[192580]: 2025-10-08 15:50:18.703 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:50:18 np0005476733 nova_compute[192580]: 2025-10-08 15:50:18.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:50:20Z|00620|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 11:50:20 np0005476733 nova_compute[192580]: 2025-10-08 15:50:20.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:22 np0005476733 podman[242674]: 2025-10-08 15:50:22.248754865 +0000 UTC m=+0.074524875 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 11:50:22 np0005476733 podman[242675]: 2025-10-08 15:50:22.255606013 +0000 UTC m=+0.074583835 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:50:22 np0005476733 podman[242676]: 2025-10-08 15:50:22.27832453 +0000 UTC m=+0.082341424 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, name=ubi9-minimal, vcs-type=git)
Oct  8 11:50:23 np0005476733 nova_compute[192580]: 2025-10-08 15:50:23.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:23 np0005476733 nova_compute[192580]: 2025-10-08 15:50:23.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:23 np0005476733 nova_compute[192580]: 2025-10-08 15:50:23.890 2 INFO nova.compute.manager [None req-dca5c484-7bf0-43a4-b7db-7a028e344614 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:23 np0005476733 nova_compute[192580]: 2025-10-08 15:50:23.896 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:50:23 np0005476733 nova_compute[192580]: 2025-10-08 15:50:23.899 2 INFO nova.virt.libvirt.driver [None req-dca5c484-7bf0-43a4-b7db-7a028e344614 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Truncated console log returned, 3394 bytes ignored#033[00m
Oct  8 11:50:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:50:25Z|00621|pinctrl|WARN|Dropped 1491 log messages in last 65 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 11:50:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:50:25Z|00622|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:50:25 np0005476733 nova_compute[192580]: 2025-10-08 15:50:25.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:25 np0005476733 nova_compute[192580]: 2025-10-08 15:50:25.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:50:25 np0005476733 nova_compute[192580]: 2025-10-08 15:50:25.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:26.342 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:26.343 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:26.344 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:50:28 np0005476733 nova_compute[192580]: 2025-10-08 15:50:28.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:28 np0005476733 nova_compute[192580]: 2025-10-08 15:50:28.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:29 np0005476733 nova_compute[192580]: 2025-10-08 15:50:29.092 2 INFO nova.compute.manager [None req-845657af-5d04-49c7-acc7-4133ee4d987d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Get console output#033[00m
Oct  8 11:50:29 np0005476733 nova_compute[192580]: 2025-10-08 15:50:29.100 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 11:50:29 np0005476733 nova_compute[192580]: 2025-10-08 15:50:29.103 2 INFO nova.virt.libvirt.driver [None req-845657af-5d04-49c7-acc7-4133ee4d987d d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Truncated console log returned, 3604 bytes ignored#033[00m
Oct  8 11:50:30 np0005476733 podman[242754]: 2025-10-08 15:50:30.227785065 +0000 UTC m=+0.054489214 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:50:30 np0005476733 podman[242755]: 2025-10-08 15:50:30.234019183 +0000 UTC m=+0.056920260 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:50:30 np0005476733 nova_compute[192580]: 2025-10-08 15:50:30.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:33.385 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:50:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:33.386 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:50:33 np0005476733 nova_compute[192580]: 2025-10-08 15:50:33.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:33 np0005476733 nova_compute[192580]: 2025-10-08 15:50:33.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:35 np0005476733 nova_compute[192580]: 2025-10-08 15:50:35.836 2 DEBUG nova.compute.manager [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:50:35 np0005476733 nova_compute[192580]: 2025-10-08 15:50:35.836 2 DEBUG nova.compute.manager [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-b060d65a-9028-402c-8b84-594cba794144. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:50:35 np0005476733 nova_compute[192580]: 2025-10-08 15:50:35.837 2 DEBUG oslo_concurrency.lockutils [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:50:35 np0005476733 nova_compute[192580]: 2025-10-08 15:50:35.837 2 DEBUG oslo_concurrency.lockutils [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:50:35 np0005476733 nova_compute[192580]: 2025-10-08 15:50:35.837 2 DEBUG nova.network.neutron [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port b060d65a-9028-402c-8b84-594cba794144 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:50:36 np0005476733 nova_compute[192580]: 2025-10-08 15:50:36.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.037 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'name': 'tempest-test_qos_after_live_migration-234900889', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.062 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.latency volume: 17901156746 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.063 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e37c9c7-a306-4f28-b0f2-f2b54cd91480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17901156746, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.038810', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87cb9482-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '12ea9c7fa00e9f3dedea82e12f877e72180ca1a4f83fc82198c75124f74715b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.038810', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87cba8c8-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '8d92794f9f5c48d3839ce60306d5046d8ccc4c31af98c505b6c11706b9d5cb9f'}]}, 'timestamp': '2025-10-08 15:50:36.063922', '_unique_id': 'd63ba343893b4f9796a5b09f4254dd53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.070 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e899db9a-b18d-4036-a523-fe0907dba023 / tapb060d65a-90 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.071 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a815579-63b4-4760-a32c-cd8f51ed28b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.066916', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87ccd996-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': 'b49b8e7452edb28a70988bfe201725aa5e971d5ef867114a04839493eaff4340'}]}, 'timestamp': '2025-10-08 15:50:36.071834', '_unique_id': '83d3dff317c94f68ac96f638a1ebe111'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.073 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.074 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.latency volume: 10289276837 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.074 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.latency volume: 79912291 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a037f347-67f7-4c02-8066-98b26c6c459d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10289276837, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.074257', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87cd4a20-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '5803c97043c0506ee0e16fde9d8aba05e2e14cbcfbe53db200a20790b41a6954'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 79912291, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.074257', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87cd5664-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '1bc95ba25da7c5a5d0ee30bbfd46e68ff4dbc064d939773416cdbb15706a3da5'}]}, 'timestamp': '2025-10-08 15:50:36.074901', '_unique_id': '3873e0ba028f415393e7ab24d970474f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.076 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '692ca285-e5f8-4853-a1d7-ab909d657d30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.076807', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87cdad62-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '7ef55503f15a538104c0cd7da1201aabafb817aad1ccebedce5152b5fd6802c4'}]}, 'timestamp': '2025-10-08 15:50:36.077176', '_unique_id': '3d67de7a004749fe9e570e503c0b55ec'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.077 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.079 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c83b89e-971e-44a5-99eb-25e1fbd45674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.079262', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87ce0f6e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '483e6729a39468567aee28dde20392b973ddc202da45c57976410cad41f0ef2d'}]}, 'timestamp': '2025-10-08 15:50:36.079800', '_unique_id': '80e428445d3d4a5c9adfe161e864a8ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.080 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>]
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.102 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/cpu volume: 40380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aa0ca98-337b-4c47-b5eb-615718e1f5e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40380000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'timestamp': '2025-10-08T15:50:36.082552', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '87d19bca-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.825126487, 'message_signature': 'b7a820ad869f4bfe781799950614a5f9c2644e393e2775f4d9df2fbfff1c7156'}]}, 'timestamp': '2025-10-08 15:50:36.102980', '_unique_id': '0bbf6380ca8545788b4bc40422450f4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.105 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes volume: 2234 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f802c80-448c-4772-97ff-ce34996cbeec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2234, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.105142', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d1fef8-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '6fac6fc140c66e968fbc4141ec42f89759b8af979ebdb068d99aa466951577c3'}]}, 'timestamp': '2025-10-08 15:50:36.105418', '_unique_id': 'a6abff6837de47fd80d452689dbe44ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.106 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.requests volume: 11481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.107 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dea29d0-ab62-4a50-adc4-e13eff1af0d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11481, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.106878', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d2435e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '8c0b4ba0b9ca08302c56f3648465f20c9b58d9abcf949956b145258b5f2512ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.106878', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d24eb2-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '4096229aba9a2d44cad148b2c68ff46563dc458ad5d939e3e9eeb48d4a6351f1'}]}, 'timestamp': '2025-10-08 15:50:36.107465', '_unique_id': '0cddff28318f47acae79401c6c32cebf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.108 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes volume: 3212 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8665c46d-94d6-4a78-9cbb-d1ee9a469566', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3212, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.108720', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d289b8-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '6bef85b1ed16ccde249252ee59e785813ca02659fb023d38a11a32c1997096d1'}]}, 'timestamp': '2025-10-08 15:50:36.108951', '_unique_id': 'ffdcbb1cc1794bf29f165a037412e91b'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '501b17c3-1837-492a-bcbe-492dd18d59c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.110020', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d2bd48-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': 'b7ba7648dc79e942d56a279d4a504a71dad344a47564217d1cf7ea26c1f7cec0'}]}, 'timestamp': '2025-10-08 15:50:36.110269', '_unique_id': 'c20d7336c32b4136bff958bdea1788e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.111 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.111 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>]
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.111 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.requests volume: 381 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f5cfea-ecfc-43b8-b585-6a13e5341e15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 381, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.111908', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d30758-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '00a308a0fe9cb73e7bb9880c5016cc4ab44318f130af9e9421a428d6a8764f2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.111908', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d310ea-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '4d0ed37bfea18a762d218401abdd3aa2f185e792b2065e19f0d79a19cb7c1773'}]}, 'timestamp': '2025-10-08 15:50:36.112400', '_unique_id': 'c494a62d22a146d6bb7467e5815c9334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.127 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.usage volume: 85524480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.128 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '487d14b2-1272-4bb4-bb30-39d98f551e30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 85524480, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.113577', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d5765a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': '6089da885c288bd258519ed6f505547c1a535e6c93b8b4f264569d5c53f29501'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.113577', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d58532-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': '83653a210818ffa116f9f4912d0f3a915970ba6252ad48ea1600aa7a011b38b6'}]}, 'timestamp': '2025-10-08 15:50:36.128517', '_unique_id': '6157e58a3ea14b5da40c638bdb69fcdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.130 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/memory.usage volume: 292.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe7d0410-df29-43f7-b4c6-7f49280c9231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 292.38671875, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'timestamp': '2025-10-08T15:50:36.130855', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '87d5eab8-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.825126487, 'message_signature': 'df4c323cba3727b930fe4c25b984fdaa75b6bde06ab14850d9195e9036624cc6'}]}, 'timestamp': '2025-10-08 15:50:36.131119', '_unique_id': '156061fd364a4e038ba971f99835eec0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ba3632-139b-4a5c-bab1-99f110aed281', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.132294', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d622da-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '20f7ca75daea2719891025969bf3d25b76f57baa13e02ef9aa49a4c11bcb08af'}]}, 'timestamp': '2025-10-08 15:50:36.132535', '_unique_id': '6fef475bda96425c9b2becca5f277908'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.133 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.bytes volume: 314078208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.133 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '227a6fd9-c82c-46fd-b7f6-19e1652a42c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 314078208, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.133652', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d6580e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '09fc5c5be9fc69a7c0d43fdd3287c2227be30a1a976cddf7926165ef119f18e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.133652', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d661b4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '6cd45ed6080e0f1ab86309261aafea58bc96f7a294af7753516d21564ef14cda'}]}, 'timestamp': '2025-10-08 15:50:36.134178', '_unique_id': 'b7f6981db915456688567365fa902c05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.135 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.bytes volume: 77940224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.135 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff1e1d4f-0900-4db9-ba37-524435c54461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 77940224, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.135448', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d69dbe-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': 'a08704be0be1fdf12bed3e89a61b91cead826e579823e50ca524a761574f5557'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.135448', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d6a5f2-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.761775762, 'message_signature': '71f12e114a3b4d8d5d61af6a3f225d75f8a144bb7221cbef9a7d5f08cb6c894a'}]}, 'timestamp': '2025-10-08 15:50:36.135872', '_unique_id': '30ca9f8a73034d53b2f2f81450c49895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>]
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.137 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b4c2a92-c1d8-4e4a-9dca-26d2cd799298', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.137401', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d6ea62-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '91030e2d115b496039285d0c41b4a5d59dd76469b6d70fc903374dba4edea83a'}]}, 'timestamp': '2025-10-08 15:50:36.137640', '_unique_id': '3603b220c70f43d6be5ea23eb474e0bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.138 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '753cf079-cb2e-4cbb-8057-7fcb3d927f2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.138703', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d71cbc-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '69cc33c2cfc63710d1e5b04faf5d2eb4882047ce515f2d00fb4525a68a289c1f'}]}, 'timestamp': '2025-10-08 15:50:36.138928', '_unique_id': '970078d6b3f44167a453eb0bae4cff32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.140 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.140 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-234900889>]
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.140 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.140 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4958706e-1a71-4612-9512-5e6edb30cfd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.140296', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d75b32-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': 'b05e2c1062c5ef147539292c1c332b06d8f43a2f21ac0c0a317c88bcaa6e3c4d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.140296', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d76316-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': '10d3a535a9d3a589d1c0f23e1da85ce784a8d3dfc24844d7579eeeca15c8dd4b'}]}, 'timestamp': '2025-10-08 15:50:36.140716', '_unique_id': '629e5964cc864daeb898c7b37a5addfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.141 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.allocation volume: 85987328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7f29458-72de-4bf1-b46a-05e2ba04e182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 85987328, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:50:36.141806', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '87d795de-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': 'ad2517fe0653f6cf83aa0ae27d3ad2578828a1c5d770ff42b28de15bae1be4cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:50:36.141806', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '87d79e3a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.836567923, 'message_signature': '16ec4b7cca0b5ef65e91d747a12acd56be2e9fa0b3505b4e06316fd1028dc24e'}]}, 'timestamp': '2025-10-08 15:50:36.142231', '_unique_id': '14811265663c489fa06e1e090f956de7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.143 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ec16b7b-a961-4e1b-8573-0724b8fdfaf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:50:36.143343', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': '87d7d21a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5539.789931702, 'message_signature': '6f178fe2fb0dd100783bcaafedc150d18f8a133868c6dcc10ad22b0d37c5e024'}]}, 'timestamp': '2025-10-08 15:50:36.143571', '_unique_id': '5d4a40b29ca746f1a7795a4394f998df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:50:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:50:38 np0005476733 nova_compute[192580]: 2025-10-08 15:50:38.358 2 DEBUG nova.network.neutron [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port b060d65a-9028-402c-8b84-594cba794144. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:50:38 np0005476733 nova_compute[192580]: 2025-10-08 15:50:38.359 2 DEBUG nova.network.neutron [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:50:38 np0005476733 nova_compute[192580]: 2025-10-08 15:50:38.554 2 DEBUG oslo_concurrency.lockutils [req-8eaa3b6c-7b2b-40e6-bd3a-8749a3be65d4 req-65649963-7ba5-43df-a8fd-e2acaf942358 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:50:38 np0005476733 nova_compute[192580]: 2025-10-08 15:50:38.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:39 np0005476733 podman[242798]: 2025-10-08 15:50:39.229465076 +0000 UTC m=+0.054737561 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:50:41 np0005476733 nova_compute[192580]: 2025-10-08 15:50:41.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:50:41.390 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:50:43 np0005476733 nova_compute[192580]: 2025-10-08 15:50:43.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:45 np0005476733 podman[242819]: 2025-10-08 15:50:45.241655514 +0000 UTC m=+0.067334844 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  8 11:50:45 np0005476733 podman[242818]: 2025-10-08 15:50:45.26653708 +0000 UTC m=+0.095479755 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 11:50:46 np0005476733 nova_compute[192580]: 2025-10-08 15:50:46.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:48 np0005476733 nova_compute[192580]: 2025-10-08 15:50:48.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:51 np0005476733 nova_compute[192580]: 2025-10-08 15:50:51.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:51 np0005476733 nova_compute[192580]: 2025-10-08 15:50:51.617 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:51 np0005476733 nova_compute[192580]: 2025-10-08 15:50:51.617 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:50:51 np0005476733 nova_compute[192580]: 2025-10-08 15:50:51.642 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:50:53 np0005476733 podman[242863]: 2025-10-08 15:50:53.247734957 +0000 UTC m=+0.071969401 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:50:53 np0005476733 podman[242865]: 2025-10-08 15:50:53.270378262 +0000 UTC m=+0.076053503 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git)
Oct  8 11:50:53 np0005476733 podman[242864]: 2025-10-08 15:50:53.279480093 +0000 UTC m=+0.098063586 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:50:53 np0005476733 nova_compute[192580]: 2025-10-08 15:50:53.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:56 np0005476733 nova_compute[192580]: 2025-10-08 15:50:56.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:50:57 np0005476733 nova_compute[192580]: 2025-10-08 15:50:57.613 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:50:58 np0005476733 nova_compute[192580]: 2025-10-08 15:50:58.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:00 np0005476733 nova_compute[192580]: 2025-10-08 15:51:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:01 np0005476733 nova_compute[192580]: 2025-10-08 15:51:01.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:01 np0005476733 podman[242920]: 2025-10-08 15:51:01.232721296 +0000 UTC m=+0.055794285 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:51:01 np0005476733 podman[242919]: 2025-10-08 15:51:01.247975253 +0000 UTC m=+0.072419377 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct  8 11:51:02 np0005476733 nova_compute[192580]: 2025-10-08 15:51:02.023 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:02 np0005476733 nova_compute[192580]: 2025-10-08 15:51:02.293 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid e899db9a-b18d-4036-a523-fe0907dba023 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 11:51:02 np0005476733 nova_compute[192580]: 2025-10-08 15:51:02.294 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:02 np0005476733 nova_compute[192580]: 2025-10-08 15:51:02.294 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "e899db9a-b18d-4036-a523-fe0907dba023" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:02 np0005476733 nova_compute[192580]: 2025-10-08 15:51:02.418 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "e899db9a-b18d-4036-a523-fe0907dba023" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:03 np0005476733 nova_compute[192580]: 2025-10-08 15:51:03.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:06 np0005476733 nova_compute[192580]: 2025-10-08 15:51:06.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:06 np0005476733 nova_compute[192580]: 2025-10-08 15:51:06.859 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:06 np0005476733 nova_compute[192580]: 2025-10-08 15:51:06.860 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:51:06 np0005476733 nova_compute[192580]: 2025-10-08 15:51:06.860 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:51:07 np0005476733 nova_compute[192580]: 2025-10-08 15:51:07.310 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:51:07 np0005476733 nova_compute[192580]: 2025-10-08 15:51:07.311 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:51:07 np0005476733 nova_compute[192580]: 2025-10-08 15:51:07.311 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:51:07 np0005476733 nova_compute[192580]: 2025-10-08 15:51:07.311 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.685 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.710 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.711 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.711 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.712 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.712 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:51:08 np0005476733 nova_compute[192580]: 2025-10-08 15:51:08.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:10 np0005476733 podman[242963]: 2025-10-08 15:51:10.251932097 +0000 UTC m=+0.078132269 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.689 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.690 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.690 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.690 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.869 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.941 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:51:10 np0005476733 nova_compute[192580]: 2025-10-08 15:51:10.942 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.016 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.186 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.187 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12972MB free_disk=111.18978881835938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.187 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.187 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.402 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e899db9a-b18d-4036-a523-fe0907dba023 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.403 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.403 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.418 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.453 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.454 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.471 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.503 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.553 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.602 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.604 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:51:11 np0005476733 nova_compute[192580]: 2025-10-08 15:51:11.604 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:13 np0005476733 nova_compute[192580]: 2025-10-08 15:51:13.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:13 np0005476733 nova_compute[192580]: 2025-10-08 15:51:13.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:14 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:14Z|00623|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Oct  8 11:51:16 np0005476733 nova_compute[192580]: 2025-10-08 15:51:16.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:16 np0005476733 podman[242989]: 2025-10-08 15:51:16.249980935 +0000 UTC m=+0.069009939 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 11:51:16 np0005476733 podman[242988]: 2025-10-08 15:51:16.273130804 +0000 UTC m=+0.101073832 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:51:18 np0005476733 nova_compute[192580]: 2025-10-08 15:51:18.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:21 np0005476733 nova_compute[192580]: 2025-10-08 15:51:21.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:23 np0005476733 nova_compute[192580]: 2025-10-08 15:51:23.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:24 np0005476733 podman[243038]: 2025-10-08 15:51:24.247562116 +0000 UTC m=+0.061219099 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:51:24 np0005476733 podman[243037]: 2025-10-08 15:51:24.289638632 +0000 UTC m=+0.098728669 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 11:51:24 np0005476733 podman[243039]: 2025-10-08 15:51:24.296640535 +0000 UTC m=+0.096204637 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct  8 11:51:24 np0005476733 nova_compute[192580]: 2025-10-08 15:51:24.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:51:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:25Z|00624|pinctrl|WARN|Dropped 483 log messages in last 60 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 11:51:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:25Z|00625|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:51:26 np0005476733 nova_compute[192580]: 2025-10-08 15:51:26.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:26.342 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:26.342 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:26.343 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:28 np0005476733 nova_compute[192580]: 2025-10-08 15:51:28.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:31 np0005476733 nova_compute[192580]: 2025-10-08 15:51:31.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:32 np0005476733 podman[243103]: 2025-10-08 15:51:32.225055054 +0000 UTC m=+0.049783082 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:51:32 np0005476733 podman[243102]: 2025-10-08 15:51:32.242191283 +0000 UTC m=+0.063881304 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:51:33 np0005476733 nova_compute[192580]: 2025-10-08 15:51:33.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:36 np0005476733 nova_compute[192580]: 2025-10-08 15:51:36.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:37 np0005476733 nova_compute[192580]: 2025-10-08 15:51:37.499 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "interface-e899db9a-b18d-4036-a523-fe0907dba023-e5b8e4de-db0e-48eb-95c7-99aee1735230" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:37 np0005476733 nova_compute[192580]: 2025-10-08 15:51:37.500 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-e899db9a-b18d-4036-a523-fe0907dba023-e5b8e4de-db0e-48eb-95c7-99aee1735230" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:37 np0005476733 nova_compute[192580]: 2025-10-08 15:51:37.501 2 DEBUG nova.objects.instance [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'flavor' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:51:38 np0005476733 nova_compute[192580]: 2025-10-08 15:51:38.759 2 DEBUG nova.objects.instance [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'pci_requests' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:51:38 np0005476733 nova_compute[192580]: 2025-10-08 15:51:38.781 2 DEBUG nova.network.neutron [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:51:39 np0005476733 nova_compute[192580]: 2025-10-08 15:51:39.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:39 np0005476733 nova_compute[192580]: 2025-10-08 15:51:39.457 2 DEBUG nova.policy [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:51:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:39.853 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:51:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:39.854 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:51:39 np0005476733 nova_compute[192580]: 2025-10-08 15:51:39.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.479 2 DEBUG nova.network.neutron [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Successfully updated port: e5b8e4de-db0e-48eb-95c7-99aee1735230 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.499 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.500 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.500 2 DEBUG nova.network.neutron [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.579 2 DEBUG nova.compute.manager [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.580 2 DEBUG nova.compute.manager [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:51:40 np0005476733 nova_compute[192580]: 2025-10-08 15:51:40.580 2 DEBUG oslo_concurrency.lockutils [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:51:41 np0005476733 nova_compute[192580]: 2025-10-08 15:51:41.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:41 np0005476733 podman[243157]: 2025-10-08 15:51:41.25242864 +0000 UTC m=+0.068472161 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.621 2 DEBUG nova.network.neutron [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.659 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.661 2 DEBUG oslo_concurrency.lockutils [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.662 2 DEBUG nova.network.neutron [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.667 2 DEBUG nova.virt.libvirt.vif [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:49:51Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.668 2 DEBUG nova.network.os_vif_util [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.669 2 DEBUG nova.network.os_vif_util [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.670 2 DEBUG os_vif [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.672 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.673 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.679 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5b8e4de-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.680 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5b8e4de-db, col_values=(('external_ids', {'iface-id': 'e5b8e4de-db0e-48eb-95c7-99aee1735230', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:67:2e:56', 'vm-uuid': 'e899db9a-b18d-4036-a523-fe0907dba023'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.7148] manager: (tape5b8e4de-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.724 2 INFO os_vif [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db')
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.726 2 DEBUG nova.virt.libvirt.vif [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:49:51Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.727 2 DEBUG nova.network.os_vif_util [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.728 2 DEBUG nova.network.os_vif_util [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.731 2 DEBUG nova.virt.libvirt.guest [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] attach device xml: <interface type="ethernet">
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:67:2e:56"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <target dev="tape5b8e4de-db"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:51:42 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Oct  8 11:51:42 np0005476733 kernel: tape5b8e4de-db: entered promiscuous mode
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.7440] manager: (tape5b8e4de-db): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00626|binding|INFO|Claiming lport e5b8e4de-db0e-48eb-95c7-99aee1735230 for this chassis.
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00627|binding|INFO|e5b8e4de-db0e-48eb-95c7-99aee1735230: Claiming fa:16:3e:67:2e:56 10.100.0.4
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.756 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:2e:56 10.100.0.4'], port_security=['fa:16:3e:67:2e:56 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e5b8e4de-db0e-48eb-95c7-99aee1735230) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.758 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e5b8e4de-db0e-48eb-95c7-99aee1735230 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 bound to our chassis
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00628|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 ovn-installed in OVS
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00629|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 up in Southbound
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.764 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.778 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e542ded6-ff61-4889-8a4c-d6fd6cb2e15e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.779 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.781 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.781 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[279dce27-6037-4c50-8f67-9073ba7c6bc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.785 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[80b22bd1-31ae-4e9b-8b5d-564cf972b706]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.801 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[0cefdc91-732e-4ec4-abac-19de23593a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 systemd-udevd[243185]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.8292] device (tape5b8e4de-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.8310] device (tape5b8e4de-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.834 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a231e6-298c-428a-9c6e-8e44070861e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.871 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c09557-afc9-49cc-89a7-e6c0220d6f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.8839] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.882 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73dbb273-9394-425c-b3e5-1dfcb2d6d194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.895 2 DEBUG nova.virt.libvirt.driver [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.896 2 DEBUG nova.virt.libvirt.driver [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.896 2 DEBUG nova.virt.libvirt.driver [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:82:ce:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.896 2 DEBUG nova.virt.libvirt.driver [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] No VIF found with MAC fa:16:3e:67:2e:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:67:2e:56 10.100.0.4
Oct  8 11:51:42 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:42Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:67:2e:56 10.100.0.4
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.921 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4b3803-305b-4bed-835f-d38c4ca358d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.921 2 DEBUG nova.virt.libvirt.guest [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:name>tempest-test_qos_after_live_migration-234900889</nova:name>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 15:51:42</nova:creationTime>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:user uuid="d4d641ac754b44f89a23c1628056309a">tempest-QosTestCommon-1316104462-project-member</nova:user>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:project uuid="d58fb802e34e481ea69b20f4fe8df6d2">tempest-QosTestCommon-1316104462</nova:project>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:port uuid="b060d65a-9028-402c-8b84-594cba794144">
Oct  8 11:51:42 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="192.168.8.197" ipVersion="4"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    <nova:port uuid="e5b8e4de-db0e-48eb-95c7-99aee1735230">
Oct  8 11:51:42 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:51:42 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 11:51:42 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 11:51:42 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.925 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[40a859cd-39ec-4fb8-ab5e-abc07fc64268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 NetworkManager[51699]: <info>  [1759938702.9479] device (tap58a69152-b0): carrier: link connected
Oct  8 11:51:42 np0005476733 nova_compute[192580]: 2025-10-08 15:51:42.952 2 DEBUG oslo_concurrency.lockutils [None req-76a3dc7e-666e-4e76-af4c-9a0b92270841 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "interface-e899db9a-b18d-4036-a523-fe0907dba023-e5b8e4de-db0e-48eb-95c7-99aee1735230" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.954 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d16c5e-05da-4ef9-9343-5889c160f0d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.974 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[890ac2b6-a109-449c-8d56-7ff39699fe45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560661, 'reachable_time': 27305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243210, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:42.993 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0e81e41b-de26-46db-bd62-95f7246181e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560661, 'tstamp': 560661}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243211, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.013 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[30484ef1-6881-4a97-8026-e1ebeddd2888]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560661, 'reachable_time': 27305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243212, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.041 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4efac7c4-8867-4714-a87e-fc351c1aeb5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.042 2 DEBUG nova.compute.manager [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.043 2 DEBUG oslo_concurrency.lockutils [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.044 2 DEBUG oslo_concurrency.lockutils [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.044 2 DEBUG oslo_concurrency.lockutils [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.045 2 DEBUG nova.compute.manager [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.045 2 WARNING nova.compute.manager [req-325274dc-6f9c-47ac-9043-c37cee67178b req-4a384a25-7c2a-45a1-853b-0458d7227e29 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.113 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c83fce-ccaf-4e71-ae5b-9f1fad4e3f4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.114 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.115 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.115 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:51:43 np0005476733 NetworkManager[51699]: <info>  [1759938703.1178] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Oct  8 11:51:43 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.121 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:51:43 np0005476733 ovn_controller[94857]: 2025-10-08T15:51:43Z|00630|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.124 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.125 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[98edf31d-463a-40a3-9b1b-eaa01e3ae46f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.126 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:51:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:43.127 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:51:43 np0005476733 nova_compute[192580]: 2025-10-08 15:51:43.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:43 np0005476733 podman[243245]: 2025-10-08 15:51:43.480948918 +0000 UTC m=+0.024977750 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:51:43 np0005476733 podman[243245]: 2025-10-08 15:51:43.700153938 +0000 UTC m=+0.244182760 container create 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:51:43 np0005476733 systemd[1]: Started libpod-conmon-7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691.scope.
Oct  8 11:51:43 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:51:43 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/008853341ae7962e007552bdd67954078d0f5508c9429181dfe0d3cfc8a98e43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:51:43 np0005476733 podman[243245]: 2025-10-08 15:51:43.823377959 +0000 UTC m=+0.367406781 container init 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  8 11:51:43 np0005476733 podman[243245]: 2025-10-08 15:51:43.829827344 +0000 UTC m=+0.373856146 container start 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:51:43 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [NOTICE]   (243264) : New worker (243266) forked
Oct  8 11:51:43 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [NOTICE]   (243264) : Loading success.
Oct  8 11:51:44 np0005476733 nova_compute[192580]: 2025-10-08 15:51:44.088 2 DEBUG nova.network.neutron [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:51:44 np0005476733 nova_compute[192580]: 2025-10-08 15:51:44.090 2 DEBUG nova.network.neutron [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:51:44 np0005476733 nova_compute[192580]: 2025-10-08 15:51:44.124 2 DEBUG oslo_concurrency.lockutils [req-4316d9f8-c970-4996-9522-22c38c460742 req-776a584d-2345-497c-96d5-295f10213234 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.188 2 DEBUG nova.compute.manager [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.188 2 DEBUG oslo_concurrency.lockutils [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.189 2 DEBUG oslo_concurrency.lockutils [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.189 2 DEBUG oslo_concurrency.lockutils [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.190 2 DEBUG nova.compute.manager [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:51:45 np0005476733 nova_compute[192580]: 2025-10-08 15:51:45.190 2 WARNING nova.compute.manager [req-bd3a96f2-2c97-4e92-84ca-c0feaa29bba9 req-ae65693b-c4b8-44cf-a5b5-239931c3127e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:51:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:51:45.857 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:51:46 np0005476733 nova_compute[192580]: 2025-10-08 15:51:46.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:47 np0005476733 podman[243276]: 2025-10-08 15:51:47.289881176 +0000 UTC m=+0.109658457 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, 
config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:51:47 np0005476733 podman[243275]: 2025-10-08 15:51:47.301479387 +0000 UTC m=+0.126589220 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 11:51:47 np0005476733 nova_compute[192580]: 2025-10-08 15:51:47.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:48 np0005476733 nova_compute[192580]: 2025-10-08 15:51:48.667 2 DEBUG nova.compute.manager [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:51:48 np0005476733 nova_compute[192580]: 2025-10-08 15:51:48.668 2 DEBUG nova.compute.manager [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:51:48 np0005476733 nova_compute[192580]: 2025-10-08 15:51:48.668 2 DEBUG oslo_concurrency.lockutils [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:51:48 np0005476733 nova_compute[192580]: 2025-10-08 15:51:48.668 2 DEBUG oslo_concurrency.lockutils [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:51:48 np0005476733 nova_compute[192580]: 2025-10-08 15:51:48.668 2 DEBUG nova.network.neutron [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:51:51 np0005476733 nova_compute[192580]: 2025-10-08 15:51:51.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:51 np0005476733 nova_compute[192580]: 2025-10-08 15:51:51.379 2 DEBUG nova.network.neutron [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:51:51 np0005476733 nova_compute[192580]: 2025-10-08 15:51:51.380 2 DEBUG nova.network.neutron [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:51:51 np0005476733 nova_compute[192580]: 2025-10-08 15:51:51.487 2 DEBUG oslo_concurrency.lockutils [req-8719050b-af0c-4368-9c39-48b02ce1d500 req-ed577f98-129e-44d5-9c89-e5a5f946e774 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:51:52 np0005476733 nova_compute[192580]: 2025-10-08 15:51:52.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:55 np0005476733 podman[243319]: 2025-10-08 15:51:55.244610057 +0000 UTC m=+0.062641264 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:51:55 np0005476733 podman[243318]: 2025-10-08 15:51:55.252425857 +0000 UTC m=+0.070223897 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:51:55 np0005476733 podman[243320]: 2025-10-08 15:51:55.269910996 +0000 UTC m=+0.075651630 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:51:56 np0005476733 nova_compute[192580]: 2025-10-08 15:51:56.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:57 np0005476733 nova_compute[192580]: 2025-10-08 15:51:57.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:51:58 np0005476733 nova_compute[192580]: 2025-10-08 15:51:58.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:00 np0005476733 nova_compute[192580]: 2025-10-08 15:52:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:01 np0005476733 nova_compute[192580]: 2025-10-08 15:52:01.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:02 np0005476733 nova_compute[192580]: 2025-10-08 15:52:02.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:03 np0005476733 podman[243383]: 2025-10-08 15:52:03.244492913 +0000 UTC m=+0.073144099 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct  8 11:52:03 np0005476733 podman[243384]: 2025-10-08 15:52:03.281736184 +0000 UTC m=+0.098878923 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:52:06 np0005476733 nova_compute[192580]: 2025-10-08 15:52:06.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:07 np0005476733 nova_compute[192580]: 2025-10-08 15:52:07.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:07 np0005476733 nova_compute[192580]: 2025-10-08 15:52:07.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:52:07 np0005476733 nova_compute[192580]: 2025-10-08 15:52:07.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:52:07 np0005476733 nova_compute[192580]: 2025-10-08 15:52:07.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:08 np0005476733 nova_compute[192580]: 2025-10-08 15:52:08.321 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:52:08 np0005476733 nova_compute[192580]: 2025-10-08 15:52:08.321 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:52:08 np0005476733 nova_compute[192580]: 2025-10-08 15:52:08.322 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:52:08 np0005476733 nova_compute[192580]: 2025-10-08 15:52:08.322 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.901 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.921 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.922 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.923 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.923 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:10 np0005476733 nova_compute[192580]: 2025-10-08 15:52:10.923 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.619 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.685 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.750 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.751 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.812 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.991 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.992 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12973MB free_disk=111.18963241577148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.992 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:52:11 np0005476733 nova_compute[192580]: 2025-10-08 15:52:11.993 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.066 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e899db9a-b18d-4036-a523-fe0907dba023 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.066 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.066 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.108 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.122 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.123 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.124 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:52:12 np0005476733 podman[243434]: 2025-10-08 15:52:12.253277532 +0000 UTC m=+0.073026325 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 11:52:12 np0005476733 nova_compute[192580]: 2025-10-08 15:52:12.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:14 np0005476733 nova_compute[192580]: 2025-10-08 15:52:14.124 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:16 np0005476733 nova_compute[192580]: 2025-10-08 15:52:16.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:17 np0005476733 nova_compute[192580]: 2025-10-08 15:52:17.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:18 np0005476733 podman[243454]: 2025-10-08 15:52:18.250903026 +0000 UTC m=+0.066868789 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 11:52:18 np0005476733 podman[243453]: 2025-10-08 15:52:18.292597879 +0000 UTC m=+0.114222663 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:52:21 np0005476733 nova_compute[192580]: 2025-10-08 15:52:21.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:22 np0005476733 nova_compute[192580]: 2025-10-08 15:52:22.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:23 np0005476733 nova_compute[192580]: 2025-10-08 15:52:23.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:52:25Z|00631|pinctrl|WARN|Dropped 1143 log messages in last 60 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 11:52:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:52:25Z|00632|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:52:26 np0005476733 podman[243503]: 2025-10-08 15:52:26.243176259 +0000 UTC m=+0.062016135 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:52:26 np0005476733 podman[243502]: 2025-10-08 15:52:26.25069529 +0000 UTC m=+0.073075358 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:52:26 np0005476733 nova_compute[192580]: 2025-10-08 15:52:26.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:26 np0005476733 podman[243504]: 2025-10-08 15:52:26.27103873 +0000 UTC m=+0.088475950 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 11:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:52:26.343 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:52:26.344 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:52:26.344 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:52:26 np0005476733 nova_compute[192580]: 2025-10-08 15:52:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:52:27 np0005476733 nova_compute[192580]: 2025-10-08 15:52:27.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:31 np0005476733 nova_compute[192580]: 2025-10-08 15:52:31.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:32 np0005476733 nova_compute[192580]: 2025-10-08 15:52:32.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:34 np0005476733 podman[243563]: 2025-10-08 15:52:34.228468941 +0000 UTC m=+0.053713059 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 11:52:34 np0005476733 podman[243562]: 2025-10-08 15:52:34.23219824 +0000 UTC m=+0.063946115 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.049 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'name': 'tempest-test_qos_after_live_migration-234900889', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.054 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e899db9a-b18d-4036-a523-fe0907dba023 / tape5b8e4de-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.054 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.055 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d823356-9054-4b02-9e65-89ba5821df1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.049979', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf50daa6-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '90318424975a81206398e4529cd0cab9045f6930f7c7cd063a172741f8926a1a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.049979', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf50e6cc-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '40b5d3f8a55a0e5251dc9e6a6938b6b4c587ef097cf152fe7e69bf0a2c2ff2d0'}]}, 'timestamp': '2025-10-08 15:52:36.055380', '_unique_id': 'de073a336369487a98083264f0e079f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.056 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.057 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.073 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/cpu volume: 45650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b987b2b1-7d34-4103-ac2f-9e74326e2dfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45650000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'timestamp': '2025-10-08T15:52:36.057475', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'cf53cc0c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.796769967, 'message_signature': 'e9bff27d34033abb9596ef6fb2340f17172c8d44754f7de0d90f1326a789f073'}]}, 'timestamp': '2025-10-08 15:52:36.074402', '_unique_id': 'f28fdd52823642fb995c67039e804594'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.075 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/memory.usage volume: 240.6953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '399e9003-b01c-4c8d-adc5-bb24db2a33c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 240.6953125, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'timestamp': '2025-10-08T15:52:36.076090', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'cf541d56-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.796769967, 'message_signature': '1881c49d0d1244b2658dfcec8d715a66e6cd22aa55040001da960dbff801803c'}]}, 'timestamp': '2025-10-08 15:52:36.076407', '_unique_id': 'aa26b83ec3e1479a9f77eee1012596ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.089 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.090 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ea589ca-061b-4eab-b17d-a023af19f0ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.077843', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf56402c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': 'e710c8fe6af881e58f3488ca50418b70fa17658a5b6f9b282e1670cb37fcda3d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.077843', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf565026-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': '343d42376a4f48fa8c80dfe8012c14035d7b8424ce4e4d88407b43dac711867d'}]}, 'timestamp': '2025-10-08 15:52:36.090856', '_unique_id': 'ab44d621d4344ee3922e9cea09248d86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.093 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets volume: 902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.093 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets volume: 99 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bd8b2eb-a843-4d4d-b286-08dd03516116', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 902, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.093346', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf56c2ae-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '13a7042e9f63e9c600a5bd4053def4d3c4df9ea2e91b5fad722765db1b5d32f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 99, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.093346', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf56d050-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '5ce03117147c414a3c6a122e3d9410c938456f7751016f27ae629fbe7d7b2c71'}]}, 'timestamp': '2025-10-08 15:52:36.094143', '_unique_id': 'f308e3346add4fe5b50d56853c697541'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.094 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.096 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes.delta volume: 16169083 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.096 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a089a04e-2787-4eab-9b45-2f368b97188c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 16169083, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.095970', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf57271c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '2e33daf4e97df2f0dc13483829250fd780b8c15e2e5e8630f3d188325cd2ba00'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.095970', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf57340a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '77023bf5b0dbdaee030060b55eb2c1349bfc6cc4f3297802ff97118271f65336'}]}, 'timestamp': '2025-10-08 15:52:36.096661', '_unique_id': 'b4f4766ce10c45e3951096da45180b28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.098 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets volume: 1160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.098 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets volume: 101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1683a754-df45-4a56-b908-c90cd19871a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1160, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.098425', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf5785d6-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '83a8a771eca1279ecaa2345e3db0ae399147d7e7fd68b870e4a9c81ef6b68e51'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 101, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.098425', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf5793c8-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '1dd443122c81a26cc9c03527660151ebc67800630fdfa86d53ce079dbc64ed5a'}]}, 'timestamp': '2025-10-08 15:52:36.099173', '_unique_id': '8685a1b7dbf84ed995794d180b72d54c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.099 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.101 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.101 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '934e1f41-556a-4616-aa11-67c2b8fe8ce7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.101023', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf57ec24-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': '56051b41915c2cbc1fa149561002015d92dedbb26db3d0d9686205d28d1bc570'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.101023', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf57f7fa-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': '826cee6c553186a633e37c1ef0d837421cd66d89fb3f1202408ff1861924728e'}]}, 'timestamp': '2025-10-08 15:52:36.101697', '_unique_id': 'fb51264f17de48f993cba3767f4c739d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.103 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes volume: 16172295 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.103 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.bytes volume: 22669 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5776b932-2e3d-4056-af90-a3acb2340c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16172295, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.103428', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf58494e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': 'ad199bce574ac1101a5bb4217bfdb3c5b9bffdf2efdd15224ad681f3bc7d9c7b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 22669, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.103428', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf585650-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': 'fe533b98f071c9bb0c8d1982cc7da434373d115d17aabb1a1c6c5d138b41c80f'}]}, 'timestamp': '2025-10-08 15:52:36.104094', '_unique_id': '3b43377c98234b12a6bc2ac566f55600'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.108 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes.delta volume: 73246 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.108 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0024b14f-8b9d-4af9-85c9-2a3969bda507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 73246, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.108440', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf590e10-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': 'ae177ffbb9a8c96e092ba204556503957d0921ca8792b54f3bc16d464ce58b56'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.108440', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf59182e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '9669f5da4fbf9af1d7e835a2582b767037a379931293d00a0e48f9602e81793e'}]}, 'timestamp': '2025-10-08 15:52:36.109029', '_unique_id': 'ab840b70e9e74fb8ac5474a55724d299'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.111 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.111 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd19eff3-6daf-49d1-8dc8-3e34862fc6da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.111140', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf5975ee-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '497d0077753393aaad67c05c9039a69a34e11ecac3621b9b4a8891dfa5c29b2d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.111140', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf597ec2-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '241a49dae713c269bd030b79934fd2bf0720090dc14a1e3b5746e5eff4a4b9de'}]}, 'timestamp': '2025-10-08 15:52:36.111636', '_unique_id': 'fcf599983363425293e68779cb6942a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.112 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes volume: 75480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.bytes volume: 17057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1da1e36-1a94-45d4-b8ee-84956742da33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 75480, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.112813', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf59b5cc-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '47faa223a48ec4971330a16f7b558928be416ff5eaf228d6b8c2a3f8268a728c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 17057, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.112813', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf59bffe-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '957d882834b19b6258c329cd1c925c0913968435473dc22eeb9c2bf74df40c12'}]}, 'timestamp': '2025-10-08 15:52:36.113338', '_unique_id': '5c690b56a2a44642842eed3ed672ef71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.135 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.bytes volume: 137289728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.136 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18d9c8e9-bd53-4d6c-bbc9-56464fbc0787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 137289728, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.114700', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5d363e-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '87aa020df3ebae829738d830b388bd9814ad7f1ff1d6429f7c12f08677e3d17d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.114700', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5d46c4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '95740824d3d60df425ce2990c9f90fdc47706e76e558b6d5b5c14159bd4d4cd6'}]}, 'timestamp': '2025-10-08 15:52:36.136469', '_unique_id': 'd072dd2030af4ee0b000af765d793d74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.138 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.requests volume: 833 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.139 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af71eeab-f486-4a94-915d-b9721fd37661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 833, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.138815', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5daf1a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '4b51d2fa73114e3ea958515c59cab7fd656460d132ea07b5d6356fca12520e8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.138815', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5dbb54-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '4d10c623384afd1f9f09b67ac724a8ff8b1ce9526954fc8bf4647e4823efa43c'}]}, 'timestamp': '2025-10-08 15:52:36.139433', '_unique_id': '016b65e41dbf46ccb38d0e28083ada0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.141 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.bytes volume: 329594368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.141 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae990a1d-378b-4f3b-a7a6-12d88be738af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329594368, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.141004', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5e050a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '0486474d55918d1a13e111cacf163db010d51ee832e34ae51f695ff2f0bad48f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.141004', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5e0d7a-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '58a1f20c5f86163a5613b12e422272c17731ab605ae0f2caefefba5974bdb4d6'}]}, 'timestamp': '2025-10-08 15:52:36.141495', '_unique_id': 'ac4ee62738a64c588e478466b13781cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.142 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.latency volume: 10510742541 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.latency volume: 79912291 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed0a696e-0950-4808-95fc-08032d27146a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10510742541, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.142750', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5e481c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '04cd09d5ea5ff85ea4603247a7603b9012319590353d69422491e753193663b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 79912291, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.142750', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5e517c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '600975c6f23dc20f66412b738e18a0bccf4149b66e6826865358ae81226ef604'}]}, 'timestamp': '2025-10-08 15:52:36.143240', '_unique_id': 'ee9a051d598d4fd29dc8f41300f5119a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.144 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.requests volume: 11682 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.144 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc0e66a3-d394-4658-b47b-821cdf9b48e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11682, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.144365', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5e85de-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': 'f0653fd1726416b46fe42c8bbb541c13ec84bdd93d18b7cfc70cf8413712e460'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.144365', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5e8ee4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '12b846aa129168dd3a5ee340ade062ff12cc0861e3722026da55f2d51e5bee6e'}]}, 'timestamp': '2025-10-08 15:52:36.144825', '_unique_id': '1e3ca048be01459a9b637bb58c244c56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.146 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.146 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9b06a4e-4fa9-4864-8991-3b5a4c44f501', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.146137', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf5ecbf2-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '4816f15af9f0a9ecdea8d20fa86e0083935c08fceb10c6407d1a4e2bd3ce93e3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.146137', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf5ed674-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '0ab4023f82443a2b896a78e90fec3539838c44803d1cab92628c9bc58b6c1882'}]}, 'timestamp': '2025-10-08 15:52:36.146655', '_unique_id': 'cd97ae40b1cb4705a73eda78cb0cfb35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.147 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.usage volume: 152698880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.148 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '671d6b95-6f5c-4740-b769-cb99f2936ed8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152698880, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.147880', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5f0f54-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': 'daa0af6591a34e0bd4290fe1aa34514ac0d63c0655fc354539202dc3a1d386da'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.147880', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5f19e0-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.800834436, 'message_signature': '358acef40d31e3f7c610ef7f8f6ac2622e9c7bba8996aa42a8fe7c1edf6c4cdf'}]}, 'timestamp': '2025-10-08 15:52:36.148403', '_unique_id': '0d4ee98f039b43dfa289f5fc8f2f7b45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.149 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.latency volume: 23172594397 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.150 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d1df4e8-49c7-40c7-81ef-1252102c2005', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23172594397, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-vda', 'timestamp': '2025-10-08T15:52:36.149851', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cf5f5d9c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '93a5e886f8f3f068e77d8f92a36313f3cd26ff83b948d3cb1e4232f36908e950'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'e899db9a-b18d-4036-a523-fe0907dba023-sda', 'timestamp': '2025-10-08T15:52:36.149851', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'instance-00000048', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cf5f6954-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.837703705, 'message_signature': '072b781082a4948899609af0382b71cbdcb7cf9823601e369650c8cce5244062'}]}, 'timestamp': '2025-10-08 15:52:36.150431', '_unique_id': '8bcd3cd292344424a42d2a5a1fbb50f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.152 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.153 12 DEBUG ceilometer.compute.pollsters [-] e899db9a-b18d-4036-a523-fe0907dba023/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcb01e9c-1b21-4340-a16d-fa6534c24af6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tapb060d65a-90', 'timestamp': '2025-10-08T15:52:36.152781', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tapb060d65a-90', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:82:ce:fa', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb060d65a-90'}, 'message_id': 'cf5fd54c-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '00d55fe5aa99ee05db30b2d497f7428cacc7c72a5105e54ba5b7007874f6334e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000048-e899db9a-b18d-4036-a523-fe0907dba023-tape5b8e4de-db', 'timestamp': '2025-10-08T15:52:36.152781', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-234900889', 'name': 'tape5b8e4de-db', 'instance_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:67:2e:56', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5b8e4de-db'}, 'message_id': 'cf5fe2e4-a45e-11f0-9274-fa163ef67048', 'monotonic_time': 5659.772967685, 'message_signature': '67ae6c8439f5a75efd688a5acc0d0065551802c7b4f49fe6d48915499ab86325'}]}, 'timestamp': '2025-10-08 15:52:36.153539', '_unique_id': '08512f8c93cf4730b2f3cc489dcc0afd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:52:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:52:36 np0005476733 nova_compute[192580]: 2025-10-08 15:52:36.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:37 np0005476733 nova_compute[192580]: 2025-10-08 15:52:37.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:41 np0005476733 nova_compute[192580]: 2025-10-08 15:52:41.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:42 np0005476733 nova_compute[192580]: 2025-10-08 15:52:42.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:43 np0005476733 podman[243604]: 2025-10-08 15:52:43.254760671 +0000 UTC m=+0.054209535 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:52:46 np0005476733 nova_compute[192580]: 2025-10-08 15:52:46.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:47 np0005476733 nova_compute[192580]: 2025-10-08 15:52:47.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:49 np0005476733 podman[243624]: 2025-10-08 15:52:49.251303239 +0000 UTC m=+0.068922444 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute)
Oct  8 11:52:49 np0005476733 podman[243623]: 2025-10-08 15:52:49.325041538 +0000 UTC m=+0.148406797 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:52:51 np0005476733 nova_compute[192580]: 2025-10-08 15:52:51.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:52 np0005476733 nova_compute[192580]: 2025-10-08 15:52:52.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.870 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.871 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.872 2 INFO nova.compute.manager [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Rebooting instance#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.891 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.891 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:52:55 np0005476733 nova_compute[192580]: 2025-10-08 15:52:55.892 2 DEBUG nova.network.neutron [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:52:56 np0005476733 nova_compute[192580]: 2025-10-08 15:52:56.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:57 np0005476733 podman[243669]: 2025-10-08 15:52:57.231440702 +0000 UTC m=+0.053918465 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:52:57 np0005476733 podman[243668]: 2025-10-08 15:52:57.237054331 +0000 UTC m=+0.064497973 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:52:57 np0005476733 podman[243670]: 2025-10-08 15:52:57.250366098 +0000 UTC m=+0.068889735 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct  8 11:52:57 np0005476733 ovn_controller[94857]: 2025-10-08T15:52:57Z|00633|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Oct  8 11:52:57 np0005476733 nova_compute[192580]: 2025-10-08 15:52:57.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:52:58 np0005476733 nova_compute[192580]: 2025-10-08 15:52:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:01 np0005476733 nova_compute[192580]: 2025-10-08 15:53:01.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:01 np0005476733 nova_compute[192580]: 2025-10-08 15:53:01.431 2 DEBUG nova.network.neutron [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:53:01 np0005476733 nova_compute[192580]: 2025-10-08 15:53:01.454 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:53:01 np0005476733 nova_compute[192580]: 2025-10-08 15:53:01.456 2 DEBUG nova.compute.manager [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:02 np0005476733 kernel: tapb060d65a-90 (unregistering): left promiscuous mode
Oct  8 11:53:02 np0005476733 NetworkManager[51699]: <info>  [1759938782.4343] device (tapb060d65a-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00634|binding|INFO|Releasing lport b060d65a-9028-402c-8b84-594cba794144 from this chassis (sb_readonly=0)
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00635|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 down in Southbound
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00636|binding|INFO|Removing iface tapb060d65a-90 ovn-installed in OVS
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.453 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ce:fa 192.168.8.197'], port_security=['fa:16:3e:82:ce:fa 192.168.8.197'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.197/24', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b060d65a-9028-402c-8b84-594cba794144) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.457 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b060d65a-9028-402c-8b84-594cba794144 in datapath f2fece7d-de46-49dc-874d-3e87e96b491f unbound from our chassis#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.459 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fece7d-de46-49dc-874d-3e87e96b491f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.461 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[afbebf48-b24d-4f35-8173-cb17ac90f044]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.461 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace which is not needed anymore#033[00m
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 kernel: tape5b8e4de-db (unregistering): left promiscuous mode
Oct  8 11:53:02 np0005476733 NetworkManager[51699]: <info>  [1759938782.4965] device (tape5b8e4de-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00637|binding|INFO|Releasing lport e5b8e4de-db0e-48eb-95c7-99aee1735230 from this chassis (sb_readonly=0)
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00638|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 down in Southbound
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:02Z|00639|binding|INFO|Removing iface tape5b8e4de-db ovn-installed in OVS
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.517 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:2e:56 10.100.0.4'], port_security=['fa:16:3e:67:2e:56 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e5b8e4de-db0e-48eb-95c7-99aee1735230) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000048.scope: Deactivated successfully.
Oct  8 11:53:02 np0005476733 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000048.scope: Consumed 51.061s CPU time.
Oct  8 11:53:02 np0005476733 systemd-machined[152624]: Machine qemu-42-instance-00000048 terminated.
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:02 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [NOTICE]   (242475) : haproxy version is 2.8.14-c23fe91
Oct  8 11:53:02 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [NOTICE]   (242475) : path to executable is /usr/sbin/haproxy
Oct  8 11:53:02 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [ALERT]    (242475) : Current worker (242477) exited with code 143 (Terminated)
Oct  8 11:53:02 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[242471]: [WARNING]  (242475) : All workers exited. Exiting... (0)
Oct  8 11:53:02 np0005476733 systemd[1]: libpod-9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02.scope: Deactivated successfully.
Oct  8 11:53:02 np0005476733 podman[243766]: 2025-10-08 15:53:02.632189937 +0000 UTC m=+0.053773039 container died 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 11:53:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay-b449656fb64ae5b2242049126f7cd4981689d6b316867b882933c446ece54a35-merged.mount: Deactivated successfully.
Oct  8 11:53:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02-userdata-shm.mount: Deactivated successfully.
Oct  8 11:53:02 np0005476733 podman[243766]: 2025-10-08 15:53:02.683518438 +0000 UTC m=+0.105101540 container cleanup 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 11:53:02 np0005476733 systemd[1]: libpod-conmon-9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02.scope: Deactivated successfully.
Oct  8 11:53:02 np0005476733 NetworkManager[51699]: <info>  [1759938782.7585] manager: (tape5b8e4de-db): new Tun device (/org/freedesktop/NetworkManager/Devices/211)
Oct  8 11:53:02 np0005476733 podman[243795]: 2025-10-08 15:53:02.768236566 +0000 UTC m=+0.056206448 container remove 9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.777 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[af49c5af-dc23-4552-9f4e-fc9a85da3316]: (4, ('Wed Oct  8 03:53:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02)\n9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02\nWed Oct  8 03:53:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02)\n9a168bd37e94dfb53a1c3b595deb0a1f63bc4c3995188adbc8489fa7a0d05a02\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.780 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5e293733-277b-492c-8e83-6dbb80f27f80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.781 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 kernel: tapf2fece7d-d0: left promiscuous mode
Oct  8 11:53:02 np0005476733 nova_compute[192580]: 2025-10-08 15:53:02.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.822 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba97b38-d12f-49ae-8018-0c71b8e537fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.852 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a87ad263-9d75-482c-8f13-fa4cba5ce7ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.855 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49574803-846f-46d0-aa38-ec9e07e2249a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.879 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbb394f-2550-4ff0-a6c6-90f046e6564b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549337, 'reachable_time': 34803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243843, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.882 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.882 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[db2a32e8-ccc7-42f5-876b-60374c28d112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.883 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e5b8e4de-db0e-48eb-95c7-99aee1735230 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:53:02 np0005476733 systemd[1]: run-netns-ovnmeta\x2df2fece7d\x2dde46\x2d49dc\x2d874d\x2d3e87e96b491f.mount: Deactivated successfully.
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.884 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.885 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7aabebed-8b48-425a-bb33-4531e2a0adab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:02.886 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:53:03 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [NOTICE]   (243264) : haproxy version is 2.8.14-c23fe91
Oct  8 11:53:03 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [NOTICE]   (243264) : path to executable is /usr/sbin/haproxy
Oct  8 11:53:03 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [WARNING]  (243264) : Exiting Master process...
Oct  8 11:53:03 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [ALERT]    (243264) : Current worker (243266) exited with code 143 (Terminated)
Oct  8 11:53:03 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[243260]: [WARNING]  (243264) : All workers exited. Exiting... (0)
Oct  8 11:53:03 np0005476733 systemd[1]: libpod-7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691.scope: Deactivated successfully.
Oct  8 11:53:03 np0005476733 podman[243861]: 2025-10-08 15:53:03.090279187 +0000 UTC m=+0.061210207 container died 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:53:03 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691-userdata-shm.mount: Deactivated successfully.
Oct  8 11:53:03 np0005476733 systemd[1]: var-lib-containers-storage-overlay-008853341ae7962e007552bdd67954078d0f5508c9429181dfe0d3cfc8a98e43-merged.mount: Deactivated successfully.
Oct  8 11:53:03 np0005476733 podman[243861]: 2025-10-08 15:53:03.12732053 +0000 UTC m=+0.098251560 container cleanup 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:53:03 np0005476733 systemd[1]: libpod-conmon-7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691.scope: Deactivated successfully.
Oct  8 11:53:03 np0005476733 podman[243890]: 2025-10-08 15:53:03.197637927 +0000 UTC m=+0.044583685 container remove 7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.204 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9afd8f-8562-441c-8706-e36f588aa9e4]: (4, ('Wed Oct  8 03:53:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691)\n7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691\nWed Oct  8 03:53:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691)\n7f2e9fc7581ea566e77b6bf977fb1103ad79bc7512a1e746ab7fbe9744bc7691\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.206 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[892a9c78-6b26-495f-9d78-5a7f02757074]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.207 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.238 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b44c0b-117a-4664-9d7a-84fb95929619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.275 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e81816ce-ca6f-4806-b4fd-7fcc77c7f0f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.277 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a890f368-3b41-4347-a11c-f2cfd0b35835]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.295 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbf5502-1e74-47fc-81d6-6e1c1a5d33cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560652, 'reachable_time': 40631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243911, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.298 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.298 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[53905625-cb6b-476d-b6de-1fd696df03e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.814 2 INFO nova.virt.libvirt.driver [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance shutdown successfully.#033[00m
Oct  8 11:53:03 np0005476733 kernel: tapb060d65a-90: entered promiscuous mode
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.8844] manager: (tapb060d65a-90): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Oct  8 11:53:03 np0005476733 systemd-udevd[243833]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00640|binding|INFO|Claiming lport b060d65a-9028-402c-8b84-594cba794144 for this chassis.
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00641|binding|INFO|b060d65a-9028-402c-8b84-594cba794144: Claiming fa:16:3e:82:ce:fa 192.168.8.197
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.9000] device (tapb060d65a-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.9019] device (tapb060d65a-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.902 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ce:fa 192.168.8.197'], port_security=['fa:16:3e:82:ce:fa 192.168.8.197'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.197/24', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b060d65a-9028-402c-8b84-594cba794144) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.903 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b060d65a-9028-402c-8b84-594cba794144 in datapath f2fece7d-de46-49dc-874d-3e87e96b491f bound to our chassis#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.905 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2fece7d-de46-49dc-874d-3e87e96b491f#033[00m
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00642|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 ovn-installed in OVS
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00643|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 up in Southbound
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.9200] manager: (tape5b8e4de-db): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Oct  8 11:53:03 np0005476733 kernel: tape5b8e4de-db: entered promiscuous mode
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00644|binding|INFO|Claiming lport e5b8e4de-db0e-48eb-95c7-99aee1735230 for this chassis.
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00645|binding|INFO|e5b8e4de-db0e-48eb-95c7-99aee1735230: Claiming fa:16:3e:67:2e:56 10.100.0.4
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.926 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e4b310-d92a-4354-92cd-815bc000f942]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.927 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2fece7d-d1 in ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.9321] device (tape5b8e4de-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:53:03 np0005476733 NetworkManager[51699]: <info>  [1759938783.9328] device (tape5b8e4de-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.937 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2fece7d-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.938 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[44affe96-9c90-4f6e-8bfa-b0f1b9571b38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.939 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4885c54f-a0c0-46ef-8166-a79f1739e16c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00646|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 ovn-installed in OVS
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 nova_compute[192580]: 2025-10-08 15:53:03.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.946 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:2e:56 10.100.0.4'], port_security=['fa:16:3e:67:2e:56 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e5b8e4de-db0e-48eb-95c7-99aee1735230) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:03 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:03Z|00647|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 up in Southbound
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.953 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[8989e2a0-9f09-46c2-bf22-d21899b53bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:03.968 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[57202f40-3c99-4ebf-926f-1292f55451df]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:03 np0005476733 systemd-machined[152624]: New machine qemu-43-instance-00000048.
Oct  8 11:53:03 np0005476733 systemd[1]: Started Virtual Machine qemu-43-instance-00000048.
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.005 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d5e67e-ac8b-4e1c-8074-2a1f7ed4bf48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 NetworkManager[51699]: <info>  [1759938784.0149] manager: (tapf2fece7d-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.013 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e27e36c1-39d5-4097-9007-4f4db176e688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.056 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a0782818-cf94-4702-88b0-4bed574c96a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.061 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[caa13383-9f6d-4f85-9882-570374e7d744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 NetworkManager[51699]: <info>  [1759938784.0907] device (tapf2fece7d-d0): carrier: link connected
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.097 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[35f843a3-882f-4c98-969b-69f2fa6da141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.116 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cffefeda-8847-4bbc-abe0-cf0ac1ce7f00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568775, 'reachable_time': 25795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243964, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.138 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a29e84-cb1e-4917-ab6a-49c098fe9d97]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:b4af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568775, 'tstamp': 568775}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243965, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.162 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4878ba-41df-4b04-b51f-9c43924c6fb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568775, 'reachable_time': 25795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243966, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.204 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8313eca2-4176-4901-9cc7-d8a196cf4e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.280 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e757ad0f-5e1d-43d3-b7fe-1406a96494db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.282 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.282 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.283 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2fece7d-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:04 np0005476733 NetworkManager[51699]: <info>  [1759938784.2864] manager: (tapf2fece7d-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Oct  8 11:53:04 np0005476733 kernel: tapf2fece7d-d0: entered promiscuous mode
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.291 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2fece7d-d0, col_values=(('external_ids', {'iface-id': 'c203ff41-0371-4edb-b491-721a1c14b7ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:04 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:04Z|00648|binding|INFO|Releasing lport c203ff41-0371-4edb-b491-721a1c14b7ee from this chassis (sb_readonly=0)
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.295 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.296 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[714413ff-3cb0-4468-929d-79ab73a6f09d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.297 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.298 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'env', 'PROCESS_TAG=haproxy-f2fece7d-de46-49dc-874d-3e87e96b491f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2fece7d-de46-49dc-874d-3e87e96b491f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.688 2 DEBUG nova.compute.manager [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.688 2 DEBUG oslo_concurrency.lockutils [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.689 2 DEBUG oslo_concurrency.lockutils [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.689 2 DEBUG oslo_concurrency.lockutils [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.689 2 DEBUG nova.compute.manager [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.690 2 WARNING nova.compute.manager [req-fa1dd31a-cea3-4a08-9055-80d7782adc61 req-a8d13212-08e4-436a-b3ee-ac72bb484b1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state reboot_started.#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.710 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Removed pending event for e899db9a-b18d-4036-a523-fe0907dba023 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.711 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938784.7103562, e899db9a-b18d-4036-a523-fe0907dba023 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.712 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.720 2 INFO nova.virt.libvirt.driver [-] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance running successfully.#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.721 2 INFO nova.virt.libvirt.driver [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance soft rebooted successfully.#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.722 2 DEBUG nova.compute.manager [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:04 np0005476733 podman[244006]: 2025-10-08 15:53:04.740618047 +0000 UTC m=+0.061030072 container create 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.748 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.756 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.785 2 DEBUG oslo_concurrency.lockutils [None req-049bc4ce-a307-4201-b0ab-26193c2479fd 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 8.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.789 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938784.7125988, e899db9a-b18d-4036-a523-fe0907dba023 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.789 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Started (Lifecycle Event)#033[00m
Oct  8 11:53:04 np0005476733 podman[244006]: 2025-10-08 15:53:04.703542272 +0000 UTC m=+0.023954307 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:53:04 np0005476733 systemd[1]: Started libpod-conmon-86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31.scope.
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.819 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:04 np0005476733 nova_compute[192580]: 2025-10-08 15:53:04.827 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:53:04 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:53:04 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a17204afac4459b62c6f55d66b4ae9fa4c0e54f77dfbc4c07aaf0450278db9ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:53:04 np0005476733 podman[244020]: 2025-10-08 15:53:04.8690299 +0000 UTC m=+0.085403590 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:53:04 np0005476733 podman[244006]: 2025-10-08 15:53:04.869667011 +0000 UTC m=+0.190079056 container init 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 11:53:04 np0005476733 podman[244006]: 2025-10-08 15:53:04.876430487 +0000 UTC m=+0.196842502 container start 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:04 np0005476733 podman[244019]: 2025-10-08 15:53:04.884101182 +0000 UTC m=+0.104764689 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  8 11:53:04 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [NOTICE]   (244064) : New worker (244069) forked
Oct  8 11:53:04 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [NOTICE]   (244064) : Loading success.
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.948 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e5b8e4de-db0e-48eb-95c7-99aee1735230 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.955 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.968 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a479ba53-e4d0-4f30-812a-4ec5c5d6bf3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.969 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.972 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.972 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2b970bce-b82b-456d-84e5-0a932714bc0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.973 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85f252c0-6f26-409c-9ff1-2f1184825d47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:04.986 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[16c1adef-e45b-4b96-ae91-07b9c201a17c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.004 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8211a5bb-008d-4f93-8627-c9adbad175a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.038 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1685bf-07d8-4ed7-8d46-de877d182336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.045 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a18d65f4-0abd-4b26-bd0a-ffabaacf46ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 NetworkManager[51699]: <info>  [1759938785.0506] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.092 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d1352f-9417-46b1-a05b-0cb27d100bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.097 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[71438456-7931-4e92-aee6-a6f922e2e428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 NetworkManager[51699]: <info>  [1759938785.1289] device (tap58a69152-b0): carrier: link connected
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.132 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1656d5-65d0-4c26-bd8c-b218ad119114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.151 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[30d2f573-e7a4-472c-b14a-f1b17d2de399]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568879, 'reachable_time': 21121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244089, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.173 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e84a21-2dfe-48a7-95b8-810af6740799]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568879, 'tstamp': 568879}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244090, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.193 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d65a9bd7-e820-4588-8387-4fddb050df32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568879, 'reachable_time': 21121, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244091, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.234 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91fde76e-b40d-48c5-b46c-58b86da23613]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.293 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[26e8161d-807c-478d-ba39-a40bb4173b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.294 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.294 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.295 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:05 np0005476733 NetworkManager[51699]: <info>  [1759938785.2976] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Oct  8 11:53:05 np0005476733 nova_compute[192580]: 2025-10-08 15:53:05.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:05 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:53:05 np0005476733 nova_compute[192580]: 2025-10-08 15:53:05.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.301 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:05 np0005476733 nova_compute[192580]: 2025-10-08 15:53:05.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:05 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:05Z|00649|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:53:05 np0005476733 nova_compute[192580]: 2025-10-08 15:53:05.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.315 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.316 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d9430d-815a-41d5-a6c5-cce14d1c5402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.317 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.317 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:53:05 np0005476733 nova_compute[192580]: 2025-10-08 15:53:05.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.553 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:05 np0005476733 podman[244124]: 2025-10-08 15:53:05.707918599 +0000 UTC m=+0.060586958 container create 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:53:05 np0005476733 systemd[1]: Started libpod-conmon-98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758.scope.
Oct  8 11:53:05 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:53:05 np0005476733 podman[244124]: 2025-10-08 15:53:05.679005125 +0000 UTC m=+0.031673474 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:53:05 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7692ae9b4a5d737d27e1ac5452ea96b145f66f6d73f6c26ee61cc469824cbb25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:53:05 np0005476733 podman[244124]: 2025-10-08 15:53:05.78900473 +0000 UTC m=+0.141673089 container init 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 11:53:05 np0005476733 podman[244124]: 2025-10-08 15:53:05.796854221 +0000 UTC m=+0.149522560 container start 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:05 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [NOTICE]   (244143) : New worker (244145) forked
Oct  8 11:53:05 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [NOTICE]   (244143) : Loading success.
Oct  8 11:53:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:05.870 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.790 2 DEBUG nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.790 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.791 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.791 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.791 2 DEBUG nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.792 2 WARNING nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.792 2 DEBUG nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.792 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.793 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.793 2 DEBUG oslo_concurrency.lockutils [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.793 2 DEBUG nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:06 np0005476733 nova_compute[192580]: 2025-10-08 15:53:06.794 2 WARNING nova.compute.manager [req-ae10f258-d02b-4ec3-8918-f35c892a3333 req-231b6094-59cb-46eb-a182-7c4691e58df1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:07 np0005476733 nova_compute[192580]: 2025-10-08 15:53:07.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.893 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.894 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.895 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.895 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.896 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.896 2 WARNING nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.896 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.897 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.897 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.897 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.897 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.898 2 WARNING nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.898 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.898 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.898 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.899 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.899 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.899 2 WARNING nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.900 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.900 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.900 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.901 2 DEBUG oslo_concurrency.lockutils [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.901 2 DEBUG nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:08 np0005476733 nova_compute[192580]: 2025-10-08 15:53:08.901 2 WARNING nova.compute.manager [req-08f423eb-69ed-4a46-99b7-7b78588794c1 req-958d3a13-d347-42f7-9c5b-f48b6b32ad95 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state None.#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.611 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:53:09 np0005476733 nova_compute[192580]: 2025-10-08 15:53:09.611 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.315 2 DEBUG nova.compute.manager [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.316 2 DEBUG oslo_concurrency.lockutils [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.317 2 DEBUG oslo_concurrency.lockutils [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.317 2 DEBUG oslo_concurrency.lockutils [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.318 2 DEBUG nova.compute.manager [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:11 np0005476733 nova_compute[192580]: 2025-10-08 15:53:11.318 2 WARNING nova.compute.manager [req-033ef4d4-1cdd-484d-aaba-c1d9fdea0c61 req-c7db1edd-d199-452c-8713-9c4684c4bfa7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:12 np0005476733 nova_compute[192580]: 2025-10-08 15:53:12.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:12.874 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.135 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Check if temp file /var/lib/nova/instances/tmprfbin1qp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.144 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.208 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.208 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:14 np0005476733 podman[244154]: 2025-10-08 15:53:14.226372324 +0000 UTC m=+0.055310008 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.279 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.280 2 DEBUG nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=103424,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprfbin1qp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e899db9a-b18d-4036-a523-fe0907dba023',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.377 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.399 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.400 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.402 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.402 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.403 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.404 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.437 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.438 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.438 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.439 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.538 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.596 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.597 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.660 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.881 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.882 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13499MB free_disk=111.18964767456055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.883 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:14 np0005476733 nova_compute[192580]: 2025-10-08 15:53:14.883 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.001 2 INFO nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0a579024-97a5-4ce2-b72f-05d3aff4408d has allocations against this compute host but is not found in the database.#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.002 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.002 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.046 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.063 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.087 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.088 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.274 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:15 np0005476733 nova_compute[192580]: 2025-10-08 15:53:15.275 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:16 np0005476733 nova_compute[192580]: 2025-10-08 15:53:16.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:16 np0005476733 nova_compute[192580]: 2025-10-08 15:53:16.593 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:16 np0005476733 nova_compute[192580]: 2025-10-08 15:53:16.670 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:16 np0005476733 nova_compute[192580]: 2025-10-08 15:53:16.672 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:53:16 np0005476733 nova_compute[192580]: 2025-10-08 15:53:16.732 2 DEBUG oslo_concurrency.processutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:53:17 np0005476733 nova_compute[192580]: 2025-10-08 15:53:17.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:20 np0005476733 podman[244192]: 2025-10-08 15:53:20.251887692 +0000 UTC m=+0.070814454 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 11:53:20 np0005476733 podman[244191]: 2025-10-08 15:53:20.280156664 +0000 UTC m=+0.102590809 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct  8 11:53:20 np0005476733 systemd[1]: Created slice User Slice of UID 42436.
Oct  8 11:53:20 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  8 11:53:20 np0005476733 systemd-logind[827]: New session 47 of user nova.
Oct  8 11:53:20 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  8 11:53:20 np0005476733 systemd[1]: Starting User Manager for UID 42436...
Oct  8 11:53:20 np0005476733 systemd[244241]: Queued start job for default target Main User Target.
Oct  8 11:53:20 np0005476733 systemd[244241]: Created slice User Application Slice.
Oct  8 11:53:20 np0005476733 systemd[244241]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 11:53:20 np0005476733 systemd[244241]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 11:53:20 np0005476733 systemd[244241]: Reached target Paths.
Oct  8 11:53:20 np0005476733 systemd[244241]: Reached target Timers.
Oct  8 11:53:20 np0005476733 systemd[244241]: Starting D-Bus User Message Bus Socket...
Oct  8 11:53:20 np0005476733 systemd[244241]: Starting Create User's Volatile Files and Directories...
Oct  8 11:53:20 np0005476733 systemd[244241]: Finished Create User's Volatile Files and Directories.
Oct  8 11:53:20 np0005476733 systemd[244241]: Listening on D-Bus User Message Bus Socket.
Oct  8 11:53:20 np0005476733 systemd[244241]: Reached target Sockets.
Oct  8 11:53:20 np0005476733 systemd[244241]: Reached target Basic System.
Oct  8 11:53:20 np0005476733 systemd[244241]: Reached target Main User Target.
Oct  8 11:53:20 np0005476733 systemd[244241]: Startup finished in 164ms.
Oct  8 11:53:20 np0005476733 systemd[1]: Started User Manager for UID 42436.
Oct  8 11:53:20 np0005476733 systemd[1]: Started Session 47 of User nova.
Oct  8 11:53:20 np0005476733 systemd[1]: session-47.scope: Deactivated successfully.
Oct  8 11:53:20 np0005476733 systemd-logind[827]: Session 47 logged out. Waiting for processes to exit.
Oct  8 11:53:20 np0005476733 systemd-logind[827]: Removed session 47.
Oct  8 11:53:21 np0005476733 nova_compute[192580]: 2025-10-08 15:53:21.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:22Z|00650|pinctrl|WARN|Dropped 1297 log messages in last 57 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 11:53:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:22Z|00651|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.527 2 DEBUG nova.compute.manager [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.528 2 DEBUG oslo_concurrency.lockutils [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.528 2 DEBUG oslo_concurrency.lockutils [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.529 2 DEBUG oslo_concurrency.lockutils [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.529 2 DEBUG nova.compute.manager [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.529 2 DEBUG nova.compute.manager [req-21008f0c-6744-461c-8051-8101daa73598 req-dc108680-e81e-4dad-9780-81db2e1e56f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:22 np0005476733 nova_compute[192580]: 2025-10-08 15:53:22.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.691 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.691 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.692 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.692 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.692 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.692 2 WARNING nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.693 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.693 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-b060d65a-9028-402c-8b84-594cba794144. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.693 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.693 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.694 2 DEBUG nova.network.neutron [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port b060d65a-9028-402c-8b84-594cba794144 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.917 2 INFO nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Took 8.18 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.918 2 DEBUG nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.940 2 DEBUG nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=103424,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprfbin1qp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e899db9a-b18d-4036-a523-fe0907dba023',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0a579024-97a5-4ce2-b72f-05d3aff4408d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.975 2 DEBUG nova.objects.instance [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'migration_context' on Instance uuid e899db9a-b18d-4036-a523-fe0907dba023 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.977 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.979 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  8 11:53:24 np0005476733 nova_compute[192580]: 2025-10-08 15:53:24.980 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.005 2 DEBUG nova.virt.libvirt.vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:53:04Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": 
"tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.006 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.007 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.008 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating guest XML with vif config: <interface type="ethernet">
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:82:ce:fa"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <target dev="tapb060d65a-90"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:53:25 np0005476733 nova_compute[192580]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.010 2 DEBUG nova.virt.libvirt.vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:53:04Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.011 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.012 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.013 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating guest XML with vif config: <interface type="ethernet">
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:67:2e:56"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]:  <target dev="tape5b8e4de-db"/>
Oct  8 11:53:25 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:53:25 np0005476733 nova_compute[192580]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.014 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.483 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.484 2 INFO nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  8 11:53:25 np0005476733 nova_compute[192580]: 2025-10-08 15:53:25.587 2 INFO nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.090 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.090 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:26.343 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:26.348 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.594 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.595 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.643 2 DEBUG nova.network.neutron [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port b060d65a-9028-402c-8b84-594cba794144. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.644 2 DEBUG nova.network.neutron [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.671 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.672 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.672 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.672 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.672 2 DEBUG oslo_concurrency.lockutils [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.673 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.673 2 DEBUG nova.compute.manager [req-1c8d0c23-ec1e-480a-8290-7189c81b7bb5 req-6c1f5652-dd44-4cfa-a256-58dda57dc9ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.863 2 DEBUG nova.compute.manager [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.864 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.864 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.865 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.865 2 DEBUG nova.compute.manager [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.865 2 WARNING nova.compute.manager [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.866 2 DEBUG nova.compute.manager [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.866 2 DEBUG nova.compute.manager [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing instance network info cache due to event network-changed-e5b8e4de-db0e-48eb-95c7-99aee1735230. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.866 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.866 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:53:26 np0005476733 nova_compute[192580]: 2025-10-08 15:53:26.867 2 DEBUG nova.network.neutron [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Refreshing network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:53:27 np0005476733 nova_compute[192580]: 2025-10-08 15:53:27.099 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:27 np0005476733 nova_compute[192580]: 2025-10-08 15:53:27.100 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:27 np0005476733 nova_compute[192580]: 2025-10-08 15:53:27.605 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:27 np0005476733 nova_compute[192580]: 2025-10-08 15:53:27.607 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:27 np0005476733 nova_compute[192580]: 2025-10-08 15:53:27.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.112 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.114 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:28 np0005476733 podman[244278]: 2025-10-08 15:53:28.298532738 +0000 UTC m=+0.102140374 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, version=9.6, release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 11:53:28 np0005476733 podman[244276]: 2025-10-08 15:53:28.299450798 +0000 UTC m=+0.102664221 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 11:53:28 np0005476733 podman[244277]: 2025-10-08 15:53:28.30390945 +0000 UTC m=+0.102551278 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.555 2 DEBUG nova.network.neutron [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updated VIF entry in instance network info cache for port e5b8e4de-db0e-48eb-95c7-99aee1735230. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.556 2 DEBUG nova.network.neutron [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Updating instance_info_cache with network_info: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", 
"type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.558 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938808.5558994, e899db9a-b18d-4036-a523-fe0907dba023 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.558 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.578 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.583 2 DEBUG oslo_concurrency.lockutils [req-a6b7015a-11d2-4db2-a031-4801d29c076b req-8eea1f4f-2b29-4de5-9850-ec881539ba2b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e899db9a-b18d-4036-a523-fe0907dba023" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.585 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.602 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.618 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.619 2 DEBUG nova.virt.libvirt.migration [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 11:53:28 np0005476733 kernel: tapb060d65a-90 (unregistering): left promiscuous mode
Oct  8 11:53:28 np0005476733 NetworkManager[51699]: <info>  [1759938808.7348] device (tapb060d65a-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00652|binding|INFO|Releasing lport b060d65a-9028-402c-8b84-594cba794144 from this chassis (sb_readonly=0)
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00653|binding|INFO|Setting lport b060d65a-9028-402c-8b84-594cba794144 down in Southbound
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00654|binding|INFO|Removing iface tapb060d65a-90 ovn-installed in OVS
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.761 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:ce:fa 192.168.8.197'], port_security=['fa:16:3e:82:ce:fa 192.168.8.197'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '329dd4f3-73f4-4bda-955c-e971074e916e'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.197/24', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=b060d65a-9028-402c-8b84-594cba794144) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.765 103739 INFO neutron.agent.ovn.metadata.agent [-] Port b060d65a-9028-402c-8b84-594cba794144 in datapath f2fece7d-de46-49dc-874d-3e87e96b491f unbound from our chassis#033[00m
Oct  8 11:53:28 np0005476733 kernel: tape5b8e4de-db (unregistering): left promiscuous mode
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.770 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fece7d-de46-49dc-874d-3e87e96b491f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:53:28 np0005476733 NetworkManager[51699]: <info>  [1759938808.7726] device (tape5b8e4de-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.773 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0fedcc18-dd7e-4120-9a9f-9ca9cfac0b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.774 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace which is not needed anymore#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00655|binding|INFO|Releasing lport e5b8e4de-db0e-48eb-95c7-99aee1735230 from this chassis (sb_readonly=0)
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00656|binding|INFO|Setting lport e5b8e4de-db0e-48eb-95c7-99aee1735230 down in Southbound
Oct  8 11:53:28 np0005476733 ovn_controller[94857]: 2025-10-08T15:53:28Z|00657|binding|INFO|Removing iface tape5b8e4de-db ovn-installed in OVS
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:28.790 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:67:2e:56 10.100.0.4'], port_security=['fa:16:3e:67:2e:56 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '329dd4f3-73f4-4bda-955c-e971074e916e'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e899db9a-b18d-4036-a523-fe0907dba023', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=9, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e5b8e4de-db0e-48eb-95c7-99aee1735230) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:28 np0005476733 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000048.scope: Deactivated successfully.
Oct  8 11:53:28 np0005476733 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000048.scope: Consumed 26.223s CPU time.
Oct  8 11:53:28 np0005476733 systemd-machined[152624]: Machine qemu-43-instance-00000048 terminated.
Oct  8 11:53:28 np0005476733 NetworkManager[51699]: <info>  [1759938808.9315] manager: (tapb060d65a-90): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Oct  8 11:53:28 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [NOTICE]   (244064) : haproxy version is 2.8.14-c23fe91
Oct  8 11:53:28 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [NOTICE]   (244064) : path to executable is /usr/sbin/haproxy
Oct  8 11:53:28 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [WARNING]  (244064) : Exiting Master process...
Oct  8 11:53:28 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [ALERT]    (244064) : Current worker (244069) exited with code 143 (Terminated)
Oct  8 11:53:28 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244042]: [WARNING]  (244064) : All workers exited. Exiting... (0)
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.949 2 DEBUG nova.compute.manager [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.950 2 DEBUG oslo_concurrency.lockutils [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.950 2 DEBUG oslo_concurrency.lockutils [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.950 2 DEBUG oslo_concurrency.lockutils [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:28 np0005476733 systemd[1]: libpod-86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31.scope: Deactivated successfully.
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.951 2 DEBUG nova.compute.manager [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:28 np0005476733 nova_compute[192580]: 2025-10-08 15:53:28.951 2 DEBUG nova.compute.manager [req-5f9160b5-f637-4c62-9aff-f30ad39f89d1 req-ae5464c8-11d7-408f-b33f-80956601cc74 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:28 np0005476733 conmon[244042]: conmon 86208c36d5f8e1a61615 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31.scope/container/memory.events
Oct  8 11:53:28 np0005476733 NetworkManager[51699]: <info>  [1759938808.9556] manager: (tape5b8e4de-db): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Oct  8 11:53:28 np0005476733 podman[244369]: 2025-10-08 15:53:28.958004314 +0000 UTC m=+0.061404554 container died 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.012 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.013 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.013 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  8 11:53:29 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31-userdata-shm.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 systemd[1]: var-lib-containers-storage-overlay-a17204afac4459b62c6f55d66b4ae9fa4c0e54f77dfbc4c07aaf0450278db9ca-merged.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 podman[244369]: 2025-10-08 15:53:29.075594082 +0000 UTC m=+0.178994322 container cleanup 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 11:53:29 np0005476733 systemd[1]: libpod-conmon-86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31.scope: Deactivated successfully.
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.121 2 DEBUG nova.virt.libvirt.guest [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e899db9a-b18d-4036-a523-fe0907dba023' (instance-00000048) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.123 2 INFO nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migration operation has completed#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.123 2 INFO nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] _post_live_migration() is started..#033[00m
Oct  8 11:53:29 np0005476733 podman[244429]: 2025-10-08 15:53:29.151754806 +0000 UTC m=+0.052197600 container remove 86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.156 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ce445174-85fc-439d-8301-2f69ad199036]: (4, ('Wed Oct  8 03:53:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31)\n86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31\nWed Oct  8 03:53:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31)\n86208c36d5f8e1a61615975cbff3d0c9554b842a6513f7e5cd87d72f84ab1f31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.158 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbae416-9f2b-47bd-927b-18a19b553df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.160 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:29 np0005476733 kernel: tapf2fece7d-d0: left promiscuous mode
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.186 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86c07f6c-053b-4780-8e97-c027607d8547]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.217 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[486943cb-9d3c-4a61-8fa8-b612d8fd6ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.219 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0e90e4b8-b49a-4eb7-9a16-fc01099cb49d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.240 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c9936f70-5d41-41b8-9e69-1a3f25794256]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568766, 'reachable_time': 36906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244451, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 systemd[1]: run-netns-ovnmeta\x2df2fece7d\x2dde46\x2d49dc\x2d874d\x2d3e87e96b491f.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.247 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.248 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c152d09e-813f-43c0-bb60-46e53c4b817b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.250 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e5b8e4de-db0e-48eb-95c7-99aee1735230 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.253 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.254 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[45cb90c7-37d5-463f-8d97-cd48dac9b12c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.255 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [NOTICE]   (244143) : haproxy version is 2.8.14-c23fe91
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [NOTICE]   (244143) : path to executable is /usr/sbin/haproxy
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [WARNING]  (244143) : Exiting Master process...
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [WARNING]  (244143) : Exiting Master process...
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [ALERT]    (244143) : Current worker (244145) exited with code 143 (Terminated)
Oct  8 11:53:29 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[244139]: [WARNING]  (244143) : All workers exited. Exiting... (0)
Oct  8 11:53:29 np0005476733 systemd[1]: libpod-98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758.scope: Deactivated successfully.
Oct  8 11:53:29 np0005476733 podman[244470]: 2025-10-08 15:53:29.405576527 +0000 UTC m=+0.049685949 container died 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:29 np0005476733 systemd[1]: var-lib-containers-storage-overlay-7692ae9b4a5d737d27e1ac5452ea96b145f66f6d73f6c26ee61cc469824cbb25-merged.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758-userdata-shm.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 podman[244470]: 2025-10-08 15:53:29.448411686 +0000 UTC m=+0.092521138 container cleanup 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:53:29 np0005476733 systemd[1]: libpod-conmon-98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758.scope: Deactivated successfully.
Oct  8 11:53:29 np0005476733 podman[244501]: 2025-10-08 15:53:29.519190817 +0000 UTC m=+0.044747961 container remove 98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.524 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f49dcbb8-878e-4d38-ac58-2ad95cbed874]: (4, ('Wed Oct  8 03:53:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758)\n98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758\nWed Oct  8 03:53:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758)\n98b121614890ed4340f446b8aa95470cab2e72c535e4b528206b286167d19758\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.526 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[883539ac-0531-4e50-9c3c-763719591f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.527 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:29 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.552 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6d6d31-1189-4b21-b57a-883100bbf41a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.579 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0004609e-38b0-4e65-9773-49c322f1289d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.581 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[18ab7f61-b6b9-47c1-92f2-715e847f1068]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.597 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ba84eaa2-051a-4106-bb8b-8d8baa49277a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568869, 'reachable_time': 31382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244523, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.599 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:53:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:53:29.599 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[12a2c95b-eb4c-46ed-97f0-e3be04874b1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:53:29 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.821 2 DEBUG nova.compute.manager [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.822 2 DEBUG oslo_concurrency.lockutils [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.822 2 DEBUG oslo_concurrency.lockutils [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.822 2 DEBUG oslo_concurrency.lockutils [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.823 2 DEBUG nova.compute.manager [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:29 np0005476733 nova_compute[192580]: 2025-10-08 15:53:29.823 2 DEBUG nova.compute.manager [req-6a4df161-ba4b-45bc-96e1-dba1ff5e5634 req-c0a8c8b9-447e-41c1-892f-16ce970f581e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-b060d65a-9028-402c-8b84-594cba794144 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:30 np0005476733 nova_compute[192580]: 2025-10-08 15:53:30.003 2 DEBUG nova.network.neutron [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Activated binding for port b060d65a-9028-402c-8b84-594cba794144 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  8 11:53:30 np0005476733 systemd[1]: Stopping User Manager for UID 42436...
Oct  8 11:53:30 np0005476733 systemd[244241]: Activating special unit Exit the Session...
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped target Main User Target.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped target Basic System.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped target Paths.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped target Sockets.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped target Timers.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 11:53:30 np0005476733 systemd[244241]: Closed D-Bus User Message Bus Socket.
Oct  8 11:53:30 np0005476733 systemd[244241]: Stopped Create User's Volatile Files and Directories.
Oct  8 11:53:30 np0005476733 systemd[244241]: Removed slice User Application Slice.
Oct  8 11:53:30 np0005476733 systemd[244241]: Reached target Shutdown.
Oct  8 11:53:30 np0005476733 systemd[244241]: Finished Exit the Session.
Oct  8 11:53:30 np0005476733 systemd[244241]: Reached target Exit the Session.
Oct  8 11:53:30 np0005476733 systemd[1]: user@42436.service: Deactivated successfully.
Oct  8 11:53:30 np0005476733 systemd[1]: Stopped User Manager for UID 42436.
Oct  8 11:53:30 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  8 11:53:30 np0005476733 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  8 11:53:30 np0005476733 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  8 11:53:30 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  8 11:53:30 np0005476733 systemd[1]: Removed slice User Slice of UID 42436.
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.029 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.031 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.031 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.031 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.032 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.032 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.032 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.033 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.033 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.033 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.034 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.034 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.034 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.034 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.035 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.035 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.035 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.036 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.036 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.036 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.036 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.037 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.037 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.037 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.038 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.038 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.038 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.038 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.039 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.039 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.039 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.040 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.040 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.040 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.040 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.041 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.041 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.041 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.042 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.042 2 DEBUG oslo_concurrency.lockutils [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.042 2 DEBUG nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.043 2 WARNING nova.compute.manager [req-33dacc53-810b-4ffa-bc25-b824bcc4b33c req-bfd2a4e3-feb6-412f-8f84-216e0c91c459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-b060d65a-9028-402c-8b84-594cba794144 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.515 2 DEBUG nova.network.neutron [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Activated binding for port e5b8e4de-db0e-48eb-95c7-99aee1735230 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.516 2 DEBUG nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.518 2 DEBUG nova.virt.libvirt.vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:53:08Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": 
"tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.518 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "b060d65a-9028-402c-8b84-594cba794144", "address": "fa:16:3e:82:ce:fa", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.197", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb060d65a-90", "ovs_interfaceid": "b060d65a-9028-402c-8b84-594cba794144", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.519 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.520 2 DEBUG os_vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb060d65a-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.535 2 INFO os_vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:ce:fa,bridge_name='br-int',has_traffic_filtering=True,id=b060d65a-9028-402c-8b84-594cba794144,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb060d65a-90')#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.536 2 DEBUG nova.virt.libvirt.vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:49:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-234900889',display_name='tempest-test_qos_after_live_migration-234900889',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-234900889',id=72,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-atdej0cz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:53:08Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=e899db9a-b18d-4036-a523-fe0907dba023,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.537 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "address": "fa:16:3e:67:2e:56", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5b8e4de-db", "ovs_interfaceid": "e5b8e4de-db0e-48eb-95c7-99aee1735230", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.538 2 DEBUG nova.network.os_vif_util [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.538 2 DEBUG os_vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.540 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5b8e4de-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.546 2 INFO os_vif [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:67:2e:56,bridge_name='br-int',has_traffic_filtering=True,id=e5b8e4de-db0e-48eb-95c7-99aee1735230,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape5b8e4de-db')#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.546 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.546 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.547 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.547 2 DEBUG nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.547 2 INFO nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Deleting instance files /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023_del#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.548 2 INFO nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Deletion of /var/lib/nova/instances/e899db9a-b18d-4036-a523-fe0907dba023_del complete#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.943 2 DEBUG nova.compute.manager [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.944 2 DEBUG oslo_concurrency.lockutils [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.944 2 DEBUG oslo_concurrency.lockutils [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.945 2 DEBUG oslo_concurrency.lockutils [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.945 2 DEBUG nova.compute.manager [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:31 np0005476733 nova_compute[192580]: 2025-10-08 15:53:31.946 2 DEBUG nova.compute.manager [req-1ad87162-460e-4f91-8e79-ef3822c77a29 req-cc07de2f-8349-46da-b6c6-0ddadab163e2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-unplugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.123 2 DEBUG nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.123 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.124 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.124 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.124 2 DEBUG nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.124 2 WARNING nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.125 2 DEBUG nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.125 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.125 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.125 2 DEBUG oslo_concurrency.lockutils [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.125 2 DEBUG nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] No waiting events found dispatching network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:53:33 np0005476733 nova_compute[192580]: 2025-10-08 15:53:33.126 2 WARNING nova.compute.manager [req-3ca5cf4a-8e40-4d00-baa6-b8f6631bb875 req-55e35326-acf9-4ab7-b06c-c6fc6152320e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Received unexpected event network-vif-plugged-e5b8e4de-db0e-48eb-95c7-99aee1735230 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 11:53:35 np0005476733 podman[244530]: 2025-10-08 15:53:35.244948806 +0000 UTC m=+0.062961173 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:53:35 np0005476733 podman[244529]: 2025-10-08 15:53:35.247563889 +0000 UTC m=+0.063284183 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 11:53:36 np0005476733 nova_compute[192580]: 2025-10-08 15:53:36.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:36 np0005476733 nova_compute[192580]: 2025-10-08 15:53:36.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.883 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "e899db9a-b18d-4036-a523-fe0907dba023-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.884 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.884 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "e899db9a-b18d-4036-a523-fe0907dba023-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.908 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.909 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.909 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:37 np0005476733 nova_compute[192580]: 2025-10-08 15:53:37.909 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.098 2 WARNING nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.099 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13765MB free_disk=111.33284759521484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.099 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.099 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.140 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Migration for instance e899db9a-b18d-4036-a523-fe0907dba023 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.175 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.218 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Migration 0a579024-97a5-4ce2-b72f-05d3aff4408d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1024, 'DISK_GB': 10}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.219 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.219 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.263 2 DEBUG nova.compute.provider_tree [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.319 2 DEBUG nova.scheduler.client.report [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.349 2 DEBUG nova.compute.resource_tracker [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.349 2 DEBUG oslo_concurrency.lockutils [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.356 2 INFO nova.compute.manager [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.473 2 INFO nova.scheduler.client.report [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Deleted allocation for migration 0a579024-97a5-4ce2-b72f-05d3aff4408d#033[00m
Oct  8 11:53:38 np0005476733 nova_compute[192580]: 2025-10-08 15:53:38.474 2 DEBUG nova.virt.libvirt.driver [None req-28723949-16d1-4cee-84ee-ea5403575976 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  8 11:53:41 np0005476733 nova_compute[192580]: 2025-10-08 15:53:41.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:41 np0005476733 nova_compute[192580]: 2025-10-08 15:53:41.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:44 np0005476733 nova_compute[192580]: 2025-10-08 15:53:44.010 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938809.0083344, e899db9a-b18d-4036-a523-fe0907dba023 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:53:44 np0005476733 nova_compute[192580]: 2025-10-08 15:53:44.010 2 INFO nova.compute.manager [-] [instance: e899db9a-b18d-4036-a523-fe0907dba023] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:53:44 np0005476733 nova_compute[192580]: 2025-10-08 15:53:44.034 2 DEBUG nova.compute.manager [None req-9d313dc0-8d12-4b6d-95bf-e2f2e93b2108 - - - - - -] [instance: e899db9a-b18d-4036-a523-fe0907dba023] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:53:45 np0005476733 podman[244574]: 2025-10-08 15:53:45.268672115 +0000 UTC m=+0.083126919 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:53:46 np0005476733 nova_compute[192580]: 2025-10-08 15:53:46.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:46 np0005476733 nova_compute[192580]: 2025-10-08 15:53:46.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:51 np0005476733 podman[244596]: 2025-10-08 15:53:51.249419922 +0000 UTC m=+0.067727133 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:53:51 np0005476733 podman[244595]: 2025-10-08 15:53:51.280227327 +0000 UTC m=+0.102828736 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  8 11:53:51 np0005476733 nova_compute[192580]: 2025-10-08 15:53:51.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:51 np0005476733 nova_compute[192580]: 2025-10-08 15:53:51.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:56 np0005476733 nova_compute[192580]: 2025-10-08 15:53:56.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:56 np0005476733 nova_compute[192580]: 2025-10-08 15:53:56.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:53:59 np0005476733 podman[244642]: 2025-10-08 15:53:59.243444238 +0000 UTC m=+0.065763534 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:53:59 np0005476733 podman[244641]: 2025-10-08 15:53:59.24539476 +0000 UTC m=+0.074598445 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:53:59 np0005476733 podman[244643]: 2025-10-08 15:53:59.269548772 +0000 UTC m=+0.086238018 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, version=9.6)
Oct  8 11:54:00 np0005476733 nova_compute[192580]: 2025-10-08 15:54:00.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:01 np0005476733 nova_compute[192580]: 2025-10-08 15:54:01.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:01 np0005476733 nova_compute[192580]: 2025-10-08 15:54:01.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:03 np0005476733 nova_compute[192580]: 2025-10-08 15:54:03.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:06 np0005476733 podman[244704]: 2025-10-08 15:54:06.23290947 +0000 UTC m=+0.060312587 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:54:06 np0005476733 podman[244705]: 2025-10-08 15:54:06.26542282 +0000 UTC m=+0.085233385 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:54:06 np0005476733 nova_compute[192580]: 2025-10-08 15:54:06.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:06 np0005476733 nova_compute[192580]: 2025-10-08 15:54:06.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:07 np0005476733 nova_compute[192580]: 2025-10-08 15:54:07.010 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Creating tmpfile /var/lib/nova/instances/tmp_m1rohd5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Oct  8 11:54:07 np0005476733 nova_compute[192580]: 2025-10-08 15:54:07.124 2 DEBUG nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_m1rohd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  8 11:54:09 np0005476733 nova_compute[192580]: 2025-10-08 15:54:09.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:09 np0005476733 nova_compute[192580]: 2025-10-08 15:54:09.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:54:09 np0005476733 nova_compute[192580]: 2025-10-08 15:54:09.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:54:09 np0005476733 nova_compute[192580]: 2025-10-08 15:54:09.669 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:54:09 np0005476733 nova_compute[192580]: 2025-10-08 15:54:09.670 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:10 np0005476733 nova_compute[192580]: 2025-10-08 15:54:10.865 2 DEBUG nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_m1rohd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='570998ad-005c-4c0c-b2df-5b23a6c4448d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Oct  8 11:54:10 np0005476733 nova_compute[192580]: 2025-10-08 15:54:10.910 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:54:10 np0005476733 nova_compute[192580]: 2025-10-08 15:54:10.910 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:54:10 np0005476733 nova_compute[192580]: 2025-10-08 15:54:10.911 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:54:11 np0005476733 nova_compute[192580]: 2025-10-08 15:54:11.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:11 np0005476733 nova_compute[192580]: 2025-10-08 15:54:11.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:11 np0005476733 nova_compute[192580]: 2025-10-08 15:54:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:11 np0005476733 nova_compute[192580]: 2025-10-08 15:54:11.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:54:13 np0005476733 nova_compute[192580]: 2025-10-08 15:54:13.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.549 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updating instance_info_cache with network_info: [{"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.564 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.566 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_m1rohd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='570998ad-005c-4c0c-b2df-5b23a6c4448d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.566 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Creating instance directory: /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.567 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Creating disk.info with the contents: {'/var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk': 'qcow2', '/var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.567 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.568 2 DEBUG nova.objects.instance [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 570998ad-005c-4c0c-b2df-5b23a6c4448d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.602 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.662 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.663 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.664 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.674 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.693 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.694 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.694 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.695 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.732 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.733 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.773 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk 10737418240" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.775 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.775 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.838 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.840 2 DEBUG nova.objects.instance [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lazy-loading 'migration_context' on Instance uuid 570998ad-005c-4c0c-b2df-5b23a6c4448d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.874 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.902 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.904 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config to /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.905 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.966 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.967 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13795MB free_disk=111.33282470703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.967 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:54:14 np0005476733 nova_compute[192580]: 2025-10-08 15:54:14.968 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.033 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Migration for instance 570998ad-005c-4c0c-b2df-5b23a6c4448d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.096 2 INFO nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updating resource usage from migration a4b1ad1b-fb12-497b-a4f5-df79e85ad34f#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.097 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Starting to track incoming migration a4b1ad1b-fb12-497b-a4f5-df79e85ad34f with flavor 22222222-2222-2222-2222-222222222222 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.280 2 WARNING nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 570998ad-005c-4c0c-b2df-5b23a6c4448d has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}.#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.282 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.282 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.339 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.360 2 DEBUG oslo_concurrency.processutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.config /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.361 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.363 2 DEBUG nova.virt.libvirt.vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:50:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-908003806',display_name='tempest-test_qos_after_live_migration-908003806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-908003806',id=73,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:50:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1ld0jmzv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:54:01Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=570998ad-005c-4c0c-b2df-5b23a6c4448d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.363 2 DEBUG nova.network.os_vif_util [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.364 2 DEBUG nova.network.os_vif_util [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.364 2 DEBUG os_vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.366 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.366 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a7eab74-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.369 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a7eab74-6e, col_values=(('external_ids', {'iface-id': '8a7eab74-6ed6-42aa-beab-977d24da645d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:33:06', 'vm-uuid': '570998ad-005c-4c0c-b2df-5b23a6c4448d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 NetworkManager[51699]: <info>  [1759938855.3724] manager: (tap8a7eab74-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.374 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.380 2 INFO os_vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e')#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.381 2 DEBUG nova.virt.libvirt.vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:50:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-908003806',display_name='tempest-test_qos_after_live_migration-908003806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-908003806',id=73,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:50:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1ld0jmzv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:54:01Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=570998ad-005c-4c0c-b2df-5b23a6c4448d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.381 2 DEBUG nova.network.os_vif_util [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converting VIF {"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.382 2 DEBUG nova.network.os_vif_util [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.382 2 DEBUG os_vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.387 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1989484f-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.388 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1989484f-dc, col_values=(('external_ids', {'iface-id': '1989484f-dc1f-4f5c-94f0-a26c6ea9ece9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:41:77', 'vm-uuid': '570998ad-005c-4c0c-b2df-5b23a6c4448d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 NetworkManager[51699]: <info>  [1759938855.3911] manager: (tap1989484f-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.403 2 INFO os_vif [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc')#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.404 2 DEBUG nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.404 2 DEBUG nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_m1rohd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='570998ad-005c-4c0c-b2df-5b23a6c4448d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.414 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:54:15 np0005476733 nova_compute[192580]: 2025-10-08 15:54:15.415 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:54:16 np0005476733 podman[244770]: 2025-10-08 15:54:16.240950827 +0000 UTC m=+0.061586888 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:54:16 np0005476733 nova_compute[192580]: 2025-10-08 15:54:16.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:16 np0005476733 nova_compute[192580]: 2025-10-08 15:54:16.416 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:16.443 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:16.444 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:54:16 np0005476733 nova_compute[192580]: 2025-10-08 15:54:16.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:16 np0005476733 nova_compute[192580]: 2025-10-08 15:54:16.781 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Port 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  8 11:54:18 np0005476733 nova_compute[192580]: 2025-10-08 15:54:18.416 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Port 8a7eab74-6ed6-42aa-beab-977d24da645d updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  8 11:54:18 np0005476733 nova_compute[192580]: 2025-10-08 15:54:18.418 2 DEBUG nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_m1rohd5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='570998ad-005c-4c0c-b2df-5b23a6c4448d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData,VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  8 11:54:18 np0005476733 systemd[1]: Starting libvirt proxy daemon...
Oct  8 11:54:18 np0005476733 systemd[1]: Started libvirt proxy daemon.
Oct  8 11:54:19 np0005476733 kernel: tap8a7eab74-6e: entered promiscuous mode
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.0506] manager: (tap8a7eab74-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00658|binding|INFO|Claiming lport 8a7eab74-6ed6-42aa-beab-977d24da645d for this additional chassis.
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00659|binding|INFO|8a7eab74-6ed6-42aa-beab-977d24da645d: Claiming fa:16:3e:f7:33:06 192.168.8.242
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00660|binding|INFO|Setting lport 8a7eab74-6ed6-42aa-beab-977d24da645d ovn-installed in OVS
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 kernel: tap1989484f-dc: entered promiscuous mode
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.0856] manager: (tap1989484f-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Oct  8 11:54:19 np0005476733 systemd-udevd[244825]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:54:19 np0005476733 systemd-udevd[244823]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00661|if_status|INFO|Not updating pb chassis for 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 now as sb is readonly
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.1025] device (tap1989484f-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.1036] device (tap1989484f-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.1079] device (tap8a7eab74-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:19 np0005476733 NetworkManager[51699]: <info>  [1759938859.1095] device (tap8a7eab74-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:54:19 np0005476733 systemd-machined[152624]: New machine qemu-44-instance-00000049.
Oct  8 11:54:19 np0005476733 systemd[1]: Started Virtual Machine qemu-44-instance-00000049.
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00662|binding|INFO|Claiming lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 for this additional chassis.
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00663|binding|INFO|1989484f-dc1f-4f5c-94f0-a26c6ea9ece9: Claiming fa:16:3e:2d:41:77 10.100.0.11
Oct  8 11:54:19 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:19Z|00664|binding|INFO|Setting lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 ovn-installed in OVS
Oct  8 11:54:19 np0005476733 nova_compute[192580]: 2025-10-08 15:54:19.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:20 np0005476733 nova_compute[192580]: 2025-10-08 15:54:20.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:20 np0005476733 nova_compute[192580]: 2025-10-08 15:54:20.489 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938860.488894, 570998ad-005c-4c0c-b2df-5b23a6c4448d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:54:20 np0005476733 nova_compute[192580]: 2025-10-08 15:54:20.490 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] VM Started (Lifecycle Event)#033[00m
Oct  8 11:54:20 np0005476733 nova_compute[192580]: 2025-10-08 15:54:20.540 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:54:21 np0005476733 nova_compute[192580]: 2025-10-08 15:54:21.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:22 np0005476733 podman[244861]: 2025-10-08 15:54:22.264012346 +0000 UTC m=+0.078502160 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:54:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:22Z|00665|pinctrl|WARN|Dropped 1759 log messages in last 59 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 11:54:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:22Z|00666|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:54:22 np0005476733 podman[244860]: 2025-10-08 15:54:22.357972359 +0000 UTC m=+0.172497943 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 11:54:22 np0005476733 nova_compute[192580]: 2025-10-08 15:54:22.685 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759938862.6854098, 570998ad-005c-4c0c-b2df-5b23a6c4448d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:54:22 np0005476733 nova_compute[192580]: 2025-10-08 15:54:22.686 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:54:22 np0005476733 nova_compute[192580]: 2025-10-08 15:54:22.730 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:54:22 np0005476733 nova_compute[192580]: 2025-10-08 15:54:22.735 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:54:22 np0005476733 nova_compute[192580]: 2025-10-08 15:54:22.771 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  8 11:54:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:23.447 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:24 np0005476733 nova_compute[192580]: 2025-10-08 15:54:24.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:24Z|00667|binding|INFO|Claiming lport 8a7eab74-6ed6-42aa-beab-977d24da645d for this chassis.
Oct  8 11:54:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:24Z|00668|binding|INFO|8a7eab74-6ed6-42aa-beab-977d24da645d: Claiming fa:16:3e:f7:33:06 192.168.8.242
Oct  8 11:54:24 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:24Z|00669|binding|INFO|Setting lport 8a7eab74-6ed6-42aa-beab-977d24da645d up in Southbound
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.648 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:33:06 192.168.8.242'], port_security=['fa:16:3e:f7:33:06 192.168.8.242'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.242/24', 'neutron:device_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8a7eab74-6ed6-42aa-beab-977d24da645d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.650 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8a7eab74-6ed6-42aa-beab-977d24da645d in datapath f2fece7d-de46-49dc-874d-3e87e96b491f bound to our chassis#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.654 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2fece7d-de46-49dc-874d-3e87e96b491f#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.670 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e557c08a-d5da-4638-bdbb-1078e0cbb7f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.671 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2fece7d-d1 in ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.674 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2fece7d-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.675 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7fec2d-163d-4a21-b0d9-ce1c6d03b02e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.677 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab89b28-6c59-4cb7-9dc6-a4b654d47832]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.696 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[a87cd9d9-d70a-4b89-ac36-d1638639e55e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.722 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94bc860a-f0be-4ad6-b9e5-9812c86d7e63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.767 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8b493047-3c4a-407b-9a28-eb913db16416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.776 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[69f09333-2c65-427e-9390-9acf28a4b4b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 NetworkManager[51699]: <info>  [1759938864.7785] manager: (tapf2fece7d-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/224)
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.815 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4d2176-ef39-4694-bc3b-5aad4391f018]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.819 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d8bdc781-0c7d-488a-ac0d-b262244b0d20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 systemd-udevd[244911]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:54:24 np0005476733 NetworkManager[51699]: <info>  [1759938864.8578] device (tapf2fece7d-d0): carrier: link connected
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.865 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1034f666-bba4-4f36-8aee-3145711f9b5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.883 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec33d89-242d-47e6-9fa6-a604b30f0de2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576852, 'reachable_time': 33038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244930, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.897 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[164d001e-3068-4484-bfac-f49b1f1e9b09]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:b4af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576852, 'tstamp': 576852}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244931, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.918 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32c623fe-aab5-4d8f-a9b9-f52b24977e80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2fece7d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:b4:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576852, 'reachable_time': 33038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244932, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:24.954 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[34a8590f-541c-4c49-8848-f85407b8e4fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.040 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb9f7f3-0eb6-4790-acad-f811e8c6b69c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.042 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.043 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.043 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2fece7d-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:25 np0005476733 kernel: tapf2fece7d-d0: entered promiscuous mode
Oct  8 11:54:25 np0005476733 NetworkManager[51699]: <info>  [1759938865.0469] manager: (tapf2fece7d-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.051 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2fece7d-d0, col_values=(('external_ids', {'iface-id': 'c203ff41-0371-4edb-b491-721a1c14b7ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:25Z|00670|binding|INFO|Releasing lport c203ff41-0371-4edb-b491-721a1c14b7ee from this chassis (sb_readonly=0)
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.056 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.057 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a2133c03-0aa8-45ef-be95-e12ae4f8a10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.058 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f2fece7d-de46-49dc-874d-3e87e96b491f.pid.haproxy
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f2fece7d-de46-49dc-874d-3e87e96b491f
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:54:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:25.059 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'env', 'PROCESS_TAG=haproxy-f2fece7d-de46-49dc-874d-3e87e96b491f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2fece7d-de46-49dc-874d-3e87e96b491f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 nova_compute[192580]: 2025-10-08 15:54:25.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:25 np0005476733 podman[244965]: 2025-10-08 15:54:25.497205759 +0000 UTC m=+0.083755258 container create 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 11:54:25 np0005476733 systemd[1]: Started libpod-conmon-3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5.scope.
Oct  8 11:54:25 np0005476733 podman[244965]: 2025-10-08 15:54:25.461708925 +0000 UTC m=+0.048258424 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:54:25 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:54:25 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/058df4e41d2b24e5e9eed709cea7ba9c5c1e6518b1f2c6ddada6e2b56cb331cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:54:25 np0005476733 podman[244965]: 2025-10-08 15:54:25.588039432 +0000 UTC m=+0.174588961 container init 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:54:25 np0005476733 podman[244965]: 2025-10-08 15:54:25.593528857 +0000 UTC m=+0.180078346 container start 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:54:25 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [NOTICE]   (244984) : New worker (244986) forked
Oct  8 11:54:25 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [NOTICE]   (244984) : Loading success.
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.344 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.345 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:54:26 np0005476733 nova_compute[192580]: 2025-10-08 15:54:26.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:26 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:26Z|00671|binding|INFO|Claiming lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 for this chassis.
Oct  8 11:54:26 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:26Z|00672|binding|INFO|1989484f-dc1f-4f5c-94f0-a26c6ea9ece9: Claiming fa:16:3e:2d:41:77 10.100.0.11
Oct  8 11:54:26 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:26Z|00673|binding|INFO|Setting lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 up in Southbound
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.530 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:41:77 10.100.0.11'], port_security=['fa:16:3e:2d:41:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '11', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=10, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.532 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 bound to our chassis#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.535 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.555 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9795e6-90e9-4b00-96fa-901c7513fff8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.556 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58a69152-b1 in ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.560 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58a69152-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.560 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8c32cfb8-2840-41d3-9761-310f82e6975e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.562 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb8e896-fc07-47fc-8424-b0f301db6e88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.581 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[cc45e956-43ce-4a78-85dc-6ceaa5cab06d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.608 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[40593db5-23ef-44de-a572-8cbf4a3b9b12]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.664 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bd17fa-835c-407a-98ac-a4ba5c007830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.674 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91f97ffc-3c39-4c49-9e5c-64970361c804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 systemd-udevd[244926]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:54:26 np0005476733 NetworkManager[51699]: <info>  [1759938866.6786] manager: (tap58a69152-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/226)
Oct  8 11:54:26 np0005476733 nova_compute[192580]: 2025-10-08 15:54:26.710 2 INFO nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Post operation of migration started#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.732 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb4b473-32da-424e-b41f-e7d05fec9df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.736 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c66f87fa-d115-48f4-9033-0ef6e8ccf760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 NetworkManager[51699]: <info>  [1759938866.7660] device (tap58a69152-b0): carrier: link connected
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.774 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[32527b33-ea43-4371-a855-af66d7ef19f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.798 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf89e712-2a51-4da2-98c2-a62bf68d1c1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577043, 'reachable_time': 37494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245005, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.818 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[50ea6017-a027-4294-a0bd-95e89cc0db10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:63a5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577043, 'tstamp': 577043}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245006, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.838 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a5afaa5a-67bf-43b5-85fe-2ebab8f77434]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58a69152-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:63:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577043, 'reachable_time': 37494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245007, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.877 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca7784b-3ba6-4e7c-b3fe-85b63d2db942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.958 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed8a445-4582-494d-996a-7c9307183fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.960 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.961 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.961 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58a69152-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:26 np0005476733 nova_compute[192580]: 2025-10-08 15:54:26.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:26 np0005476733 kernel: tap58a69152-b0: entered promiscuous mode
Oct  8 11:54:26 np0005476733 NetworkManager[51699]: <info>  [1759938866.9646] manager: (tap58a69152-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Oct  8 11:54:26 np0005476733 nova_compute[192580]: 2025-10-08 15:54:26.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.968 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58a69152-b0, col_values=(('external_ids', {'iface-id': '46f589fc-b5d9-4e1f-b085-8789fd1f48e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:54:26 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:26Z|00674|binding|INFO|Releasing lport 46f589fc-b5d9-4e1f-b085-8789fd1f48e9 from this chassis (sb_readonly=0)
Oct  8 11:54:26 np0005476733 nova_compute[192580]: 2025-10-08 15:54:26.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.984 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.985 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d9be1189-babf-4050-96ca-664f27025cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.985 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.pid.haproxy
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 58a69152-b5a6-41d0-85d5-36ab51cfbfb5
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:54:26.986 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'env', 'PROCESS_TAG=haproxy-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58a69152-b5a6-41d0-85d5-36ab51cfbfb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:54:27 np0005476733 nova_compute[192580]: 2025-10-08 15:54:27.171 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:54:27 np0005476733 nova_compute[192580]: 2025-10-08 15:54:27.172 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquired lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:54:27 np0005476733 nova_compute[192580]: 2025-10-08 15:54:27.172 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:54:27 np0005476733 podman[245038]: 2025-10-08 15:54:27.358123298 +0000 UTC m=+0.051467335 container create feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:54:27 np0005476733 systemd[1]: Started libpod-conmon-feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233.scope.
Oct  8 11:54:27 np0005476733 podman[245038]: 2025-10-08 15:54:27.331346003 +0000 UTC m=+0.024696570 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:54:27 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:54:27 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7883be799b68d6a1ea3f8abb9e9a60a60c81acf90c61aa114799645bc2053454/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:54:27 np0005476733 podman[245038]: 2025-10-08 15:54:27.44547842 +0000 UTC m=+0.138822487 container init feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:54:27 np0005476733 podman[245038]: 2025-10-08 15:54:27.451522063 +0000 UTC m=+0.144866100 container start feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 11:54:27 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [NOTICE]   (245057) : New worker (245059) forked
Oct  8 11:54:27 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [NOTICE]   (245057) : Loading success.
Oct  8 11:54:27 np0005476733 nova_compute[192580]: 2025-10-08 15:54:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:54:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:29Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:33:06 192.168.8.242
Oct  8 11:54:29 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:29Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:41:77 10.100.0.11
Oct  8 11:54:29 np0005476733 nova_compute[192580]: 2025-10-08 15:54:29.971 2 DEBUG nova.network.neutron [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updating instance_info_cache with network_info: [{"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:54:29 np0005476733 nova_compute[192580]: 2025-10-08 15:54:29.994 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Releasing lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:54:30 np0005476733 nova_compute[192580]: 2025-10-08 15:54:30.012 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:54:30 np0005476733 nova_compute[192580]: 2025-10-08 15:54:30.012 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:54:30 np0005476733 nova_compute[192580]: 2025-10-08 15:54:30.013 2 DEBUG oslo_concurrency.lockutils [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:54:30 np0005476733 nova_compute[192580]: 2025-10-08 15:54:30.018 2 INFO nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  8 11:54:30 np0005476733 virtqemud[192152]: Domain id=44 name='instance-00000049' uuid=570998ad-005c-4c0c-b2df-5b23a6c4448d is tainted: custom-monitor
Oct  8 11:54:30 np0005476733 podman[245068]: 2025-10-08 15:54:30.250964055 +0000 UTC m=+0.067214209 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=multipathd)
Oct  8 11:54:30 np0005476733 podman[245070]: 2025-10-08 15:54:30.271925425 +0000 UTC m=+0.073246932 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., version=9.6)
Oct  8 11:54:30 np0005476733 podman[245069]: 2025-10-08 15:54:30.281112248 +0000 UTC m=+0.088659814 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:54:30 np0005476733 nova_compute[192580]: 2025-10-08 15:54:30.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:31 np0005476733 nova_compute[192580]: 2025-10-08 15:54:31.025 2 INFO nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  8 11:54:31 np0005476733 nova_compute[192580]: 2025-10-08 15:54:31.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:32 np0005476733 nova_compute[192580]: 2025-10-08 15:54:32.035 2 INFO nova.virt.libvirt.driver [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  8 11:54:32 np0005476733 nova_compute[192580]: 2025-10-08 15:54:32.045 2 DEBUG nova.compute.manager [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:54:32 np0005476733 nova_compute[192580]: 2025-10-08 15:54:32.069 2 DEBUG nova.objects.instance [None req-6b654b79-6cd8-4f4f-9f4c-682ed7c8b567 764876185b70438abb53e0563a07c1dc 24d972083fd548a8b7dfd2237d4cd83c - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  8 11:54:35 np0005476733 nova_compute[192580]: 2025-10-08 15:54:35.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.052 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'name': 'tempest-test_qos_after_live_migration-908003806', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000049', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'hostId': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.054 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.072 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/cpu volume: 12790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48b33e31-f7ba-4ce3-8c79-80347b3d2ea6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12790000000, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'timestamp': '2025-10-08T15:54:36.054371', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '16da42ea-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.795612311, 'message_signature': '37c1aee49feafad29ff0e79268ad0f9bfc77d47264a5c5f559cf5f25f2316572'}]}, 'timestamp': '2025-10-08 15:54:36.073925', '_unique_id': 'b586bc7618b24a93bb007b14b990ee85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.075 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.105 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.requests volume: 4505 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.106 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.requests volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19976b73-669c-4f79-962d-b74836fe8740', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 4505, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.076188', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16df3e1c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'ffa209996c156843cdfd696de890c0071699b41ec3569a05fa82969b7fad01b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 15, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.076188', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16df5b22-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'a0dd9b4a51cc834a5987a4a6700d9855694ba401bee8bb5a12f80dd57a9fe906'}]}, 'timestamp': '2025-10-08 15:54:36.107383', '_unique_id': '87e2a00c31fc4dfa88998fcc2e3ce5ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.118 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 570998ad-005c-4c0c-b2df-5b23a6c4448d / tap8a7eab74-6e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.118 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 570998ad-005c-4c0c-b2df-5b23a6c4448d / tap1989484f-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.118 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.bytes volume: 3902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.119 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.bytes volume: 1197 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16bfefc9-357e-40a6-bb19-5696f7a8f20c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3902, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.111003', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e12e5c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '0bcbe7fd5c0f96d7f2a3b45b11e704f893c2a73955b77e3ca135cd39c68e1a6a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 1197, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.111003', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e13938-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '50ef48e91dedb18458418404eeec1722442bcd85f806fb827f2d256e600d5aa0'}]}, 'timestamp': '2025-10-08 15:54:36.119384', '_unique_id': 'bf1c4378df6840bcbe3bd105ea19d351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29cc1cdc-65de-4695-ba51-4483f56e6e89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.121010', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e18280-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'cc19befe66fad0fa273c923e1dbd708f777d6165013054361f07962c8d381268'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.121010', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e18abe-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '5c31205382e61543922df351da7afbeff71d1b0da23cecd4d3c083b514d3d8de'}]}, 'timestamp': '2025-10-08 15:54:36.121461', '_unique_id': '7439edfa4a0f4b8f886a7f065f6e7248'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.122 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '466b78da-1bed-4c28-b80f-a82de008eded', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.122747', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e1c6aa-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'a3fe7db89f615486f6962e0be66d2bd7661a152314b84a8568475d8839973899'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.122747', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e1d028-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'a55974e61853a1a0f8775153928b5309761f99719f6a502e7f96de8838e74455'}]}, 'timestamp': '2025-10-08 15:54:36.123240', '_unique_id': '9c951863d9844812a64aa8b28311869e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.124 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>]
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.125 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.125 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets volume: 71 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20b6f214-3042-4bd4-93a0-ae55dedb342f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 90, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.125356', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e22c4e-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '10b27a4ecdc9efd84f84056c18d700075b2ab941ce1f4097bfdcf75b4b9c7dc5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 71, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.125356', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e2357c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '14cc5b4253d5724144aef9b4e136ab0bcfa562f83f3d2fba2382b14b91618934'}]}, 'timestamp': '2025-10-08 15:54:36.125864', '_unique_id': 'a216db2715264c518f18293bb77d5ae1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.147 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.usage volume: 155254784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.148 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27081392-1050-4e64-bd4f-d9d0c81fdb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 155254784, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.127302', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e5adce-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': '17da5e425687c5486cc19f6e13c6138d5c634564e4dc9ffca5410de410153ec2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.127302', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e5bd28-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': '53cfec8ad7e78fbaabd43648dde5fbe2a924d0f0573554a156ed3e34d41fbbe7'}]}, 'timestamp': '2025-10-08 15:54:36.149025', '_unique_id': '2d6f4742f70a4329a1ecae5c56d7bfb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.151 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.bytes volume: 118301696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.151 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.bytes volume: 53296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54976705-f2bc-4611-b0bb-f74066285291', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 118301696, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.151233', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e61f16-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'ec8e1d81f0e1f554c70e24a4a7ce135bc6e96bf12e50fcc892755273adc92b88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 53296, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.151233', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e62740-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': '5be819591f06c9ea480aa9409ce959d33e89224b7e88583cf77e8af53a782ef9'}]}, 'timestamp': '2025-10-08 15:54:36.151683', '_unique_id': 'e6ccad537087453987adb0ba45f38e85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/memory.usage volume: 251.07421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db609c9e-8ef2-42eb-a055-ea8ae9ba8b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 251.07421875, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'timestamp': '2025-10-08T15:54:36.153155', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '16e669c6-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.795612311, 'message_signature': 'a1779c3850e7432f49fee557c318bd9771f8f375a6b3569d40071bfa84ee2bfa'}]}, 'timestamp': '2025-10-08 15:54:36.153392', '_unique_id': '7db272fdd7274864be15d654e574bac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.154 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.latency volume: 45966545 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.154 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbf26e93-f432-4aeb-ac13-906b1666fef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45966545, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.154612', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e6a3b4-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'bb989a2c6aa1046302f2d69a883c236c2c00e3be3f3f159d53cd466e3d6d7e02'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.154612', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e6ad28-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'f75c9bf18a893347ae9af383f6a633e47b5f582b7107af7da579aa2e40316fa1'}]}, 'timestamp': '2025-10-08 15:54:36.155133', '_unique_id': '56f65367fe964217a2f2b768fe440808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.156 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.156 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cd42b50-d15e-4971-b726-691221903115', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.156422', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e6ea4a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '24be0b991d30390fcd2c80393b0505311639e21f0c5cc247e10641f6f1994542'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.156422', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e6f404-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '4b93e06f1f4ccd9d7b0e08adf42eb4df975e80e70689b42e384418e38fd0083b'}]}, 'timestamp': '2025-10-08 15:54:36.156941', '_unique_id': '7fc2bca277c04d4295ce57b6b4718ee1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.158 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.bytes volume: 2357760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.158 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1113c84-4057-4ace-8429-04d21ff65f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2357760, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.158207', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e72eb0-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': '7cd838948ea5ad3b2520b8ebec0dfc5592eada487449745db5e65766db34ba7a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.158207', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e73784-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': '202079c32f115f9b456e33a714a03aeddf1f670df11de25d0461f1036db235de'}]}, 'timestamp': '2025-10-08 15:54:36.158652', '_unique_id': '86dd9351bb744bb99113606525832acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.bytes volume: 8126 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.159 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.outgoing.bytes volume: 5109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '055a5797-0955-47a6-8788-fa5f9c74acc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8126, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.159738', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e76a7e-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '1ea434e702715312987359e9d55238932aea9580409e784957f1039342c7d0fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 5109, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.159738', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e772b2-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'f8cea4dd1bf0937ea0c8cc27486e46091d90b71acd260febdcfc847ca01ebdf7'}]}, 'timestamp': '2025-10-08 15:54:36.160189', '_unique_id': 'fe49e978c311498b8ef204a7533a536b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.161 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.latency volume: 3141811734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.161 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.read.latency volume: 19327425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cbb92be-9b0a-41db-9eea-62bfec154066', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3141811734, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.161299', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e7a778-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'd763aea9e429249c0851d369ae4c59c91f442c0bd4e72c8ef9f3276fb2c73fe8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19327425, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.161299', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e7af2a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'a3daebe4d860e70fe4164478bfcebf1af1256a55e42252b018e78bb07d1baeeb'}]}, 'timestamp': '2025-10-08 15:54:36.161738', '_unique_id': 'aa043bf153f747b8bda81dc297e5ef26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.162 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.163 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>]
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.163 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.163 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ccf4b1-1bb4-445a-bbf2-78e6a438e7a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.163307', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e7f5e8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '9b69f8d39b25c386ecdf050e28c412386b64d4cb022fed5863ff5540170cad71'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.163307', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e7fe26-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': '114fdbbd609436b7a21edc8e6abab0f79fb1d9ae47cecf4296fb5cd815dcfad3'}]}, 'timestamp': '2025-10-08 15:54:36.163738', '_unique_id': '11a84c1895144f88bbf668b189ac9fb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>]
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.165 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.165 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a53d52f-e16c-42e7-8ec2-047f9c2b8ee1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.165362', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e84606-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'aecb54f78db4922fc1da442b38ea062145a941693dbea1e1f7f96da8043f15b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.165362', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e84e12-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'de7498516e49a9e180c0ad01fff7378670277bfbc4ea8f5a484f4f2bcd957a5b'}]}, 'timestamp': '2025-10-08 15:54:36.165807', '_unique_id': 'dfaadd0e94fe4c6aa11f26d2c9219d61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.allocation volume: 156241920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3a5b8e9-b830-486d-974b-4d85ba56733c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 156241920, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.167073', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e889ea-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': 'b01e7ca261802f06bdc1630af6ba45111db6d9045b405cc84e19a53783f5f1cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.167073', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e89228-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': 'a77ce426116dbe9ba68ce55e9823d74472e0d25b64e5d2327548c7af75a7c0f1'}]}, 'timestamp': '2025-10-08 15:54:36.167522', '_unique_id': 'ba111f7086ab41eb8b94f969f0efefc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.168 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.requests volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.168 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd284fa05-66dd-4a8e-b1dd-7b910d9b150d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 45, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.168677', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e8c874-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': '5355e27a6f320c6960f80affdc02637fbdb2febd043ac91bafee4d3550c75be5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 
'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.168677', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e8d184-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.799174744, 'message_signature': 'c9294c97709602153edc933c96a8df056504f683a1e9f316201e15a0a3e2d967'}]}, 'timestamp': '2025-10-08 15:54:36.169168', '_unique_id': '77a9e62b85c54eeab7e2896933c1d09b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.170 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.170 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f990e2ce-a692-4fa5-a4ca-7c198b8ccbdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.170347', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e9099c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'd4636a4c58aaff6608e8120a05247b2c4888a3b418711d594d821cda9cea54f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 9, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.170347', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e914c8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'd0b557b6e1d26fe1ce14e213098998a0b5a84276e0dec93961ccca3fedf07820'}]}, 'timestamp': '2025-10-08 15:54:36.170887', '_unique_id': 'b57895dd9fe94b7493f5b833ef714493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.172 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.172 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_qos_after_live_migration-908003806>]
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.172 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.172 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eeb1a7c3-eaef-4084-90b9-41af51e3e719', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-vda', 'timestamp': '2025-10-08T15:54:36.172336', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '16e956b8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': 'cbd584804a5e62f0325533e172d1b11a044bfada191ba52c01c9eafded55f71b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 
'project_name': None, 'resource_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d-sda', 'timestamp': '2025-10-08T15:54:36.172336', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'instance-00000049', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '16e95f50-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.850291358, 'message_signature': 'ddd65e382f9d3623cf76a9f61c2cabb7cbcad79cad86901f71cfe6587efd29c3'}]}, 'timestamp': '2025-10-08 15:54:36.172785', '_unique_id': 'c1c5f7519c334a27b269fb5229c52e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.173 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 DEBUG ceilometer.compute.pollsters [-] 570998ad-005c-4c0c-b2df-5b23a6c4448d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eee0f663-5be2-4a9e-a60f-3deb9974101b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap8a7eab74-6e', 'timestamp': '2025-10-08T15:54:36.173889', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap8a7eab74-6e', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f7:33:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8a7eab74-6e'}, 'message_id': '16e99312-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'f3ad36fd2d0c86041fa2be31e8bd912b44c349b824abda7871c859923351ec23'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4d641ac754b44f89a23c1628056309a', 'user_name': None, 'project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'project_name': None, 'resource_id': 'instance-00000049-570998ad-005c-4c0c-b2df-5b23a6c4448d-tap1989484f-dc', 'timestamp': '2025-10-08T15:54:36.173889', 'resource_metadata': {'display_name': 'tempest-test_qos_after_live_migration-908003806', 'name': 'tap1989484f-dc', 'instance_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'instance_type': 'custom_neutron_guest', 'host': '6f7292cc9e2fcd007770c14927a628f02c44762b6b3d15159200c136', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2d:41:77', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1989484f-dc'}, 'message_id': '16e99c7c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 5779.834110371, 'message_signature': 'cbddcf08ca2d4a5a9329e69ceeed34627128d21a0dad02d2f76ea37af2d7e12e'}]}, 'timestamp': '2025-10-08 15:54:36.174358', '_unique_id': '7ff24f8f9d0c4c4f93afb4cc446c963f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:54:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:54:36 np0005476733 nova_compute[192580]: 2025-10-08 15:54:36.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:37 np0005476733 podman[245135]: 2025-10-08 15:54:37.303731861 +0000 UTC m=+0.115912296 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:54:37 np0005476733 podman[245136]: 2025-10-08 15:54:37.303847815 +0000 UTC m=+0.114569312 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:54:40 np0005476733 nova_compute[192580]: 2025-10-08 15:54:40.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:41 np0005476733 nova_compute[192580]: 2025-10-08 15:54:41.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:45 np0005476733 nova_compute[192580]: 2025-10-08 15:54:45.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:46 np0005476733 nova_compute[192580]: 2025-10-08 15:54:46.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:47 np0005476733 podman[245179]: 2025-10-08 15:54:47.238128674 +0000 UTC m=+0.069976586 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:54:50 np0005476733 nova_compute[192580]: 2025-10-08 15:54:50.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:51 np0005476733 nova_compute[192580]: 2025-10-08 15:54:51.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:53 np0005476733 podman[245213]: 2025-10-08 15:54:53.264371764 +0000 UTC m=+0.077875249 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  8 11:54:53 np0005476733 podman[245199]: 2025-10-08 15:54:53.290297852 +0000 UTC m=+0.110845363 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 11:54:55 np0005476733 nova_compute[192580]: 2025-10-08 15:54:55.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:54:56 np0005476733 ovn_controller[94857]: 2025-10-08T15:54:56Z|00675|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  8 11:54:56 np0005476733 nova_compute[192580]: 2025-10-08 15:54:56.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:55:00 np0005476733 nova_compute[192580]: 2025-10-08 15:55:00.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:55:00 np0005476733 nova_compute[192580]: 2025-10-08 15:55:00.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:55:01 np0005476733 podman[245258]: 2025-10-08 15:55:01.250642802 +0000 UTC m=+0.070427501 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:55:01 np0005476733 podman[245259]: 2025-10-08 15:55:01.283557614 +0000 UTC m=+0.094522401 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Oct  8 11:55:01 np0005476733 podman[245257]: 2025-10-08 15:55:01.293050717 +0000 UTC m=+0.109908163 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 11:55:01 np0005476733 nova_compute[192580]: 2025-10-08 15:55:01.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:55:03 np0005476733 nova_compute[192580]: 2025-10-08 15:55:03.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 11:55:05 np0005476733 nova_compute[192580]: 2025-10-08 15:55:05.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:55:06 np0005476733 nova_compute[192580]: 2025-10-08 15:55:06.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:55:08 np0005476733 podman[245319]: 2025-10-08 15:55:08.244368542 +0000 UTC m=+0.062094525 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:55:08 np0005476733 podman[245318]: 2025-10-08 15:55:08.251991055 +0000 UTC m=+0.075779962 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  8 11:55:10 np0005476733 nova_compute[192580]: 2025-10-08 15:55:10.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:10 np0005476733 nova_compute[192580]: 2025-10-08 15:55:10.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:11 np0005476733 nova_compute[192580]: 2025-10-08 15:55:11.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:11 np0005476733 nova_compute[192580]: 2025-10-08 15:55:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:11 np0005476733 nova_compute[192580]: 2025-10-08 15:55:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:55:11 np0005476733 nova_compute[192580]: 2025-10-08 15:55:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:55:12 np0005476733 nova_compute[192580]: 2025-10-08 15:55:12.392 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:55:12 np0005476733 nova_compute[192580]: 2025-10-08 15:55:12.393 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:55:12 np0005476733 nova_compute[192580]: 2025-10-08 15:55:12.393 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:55:12 np0005476733 nova_compute[192580]: 2025-10-08 15:55:12.394 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 570998ad-005c-4c0c-b2df-5b23a6c4448d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:55:14 np0005476733 nova_compute[192580]: 2025-10-08 15:55:14.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:14.615 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:55:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:14.617 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.401 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updating instance_info_cache with network_info: [{"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.415 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-570998ad-005c-4c0c-b2df-5b23a6c4448d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.416 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.416 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.416 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.711 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.794 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.796 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:55:15 np0005476733 nova_compute[192580]: 2025-10-08 15:55:15.876 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.041 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.042 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13142MB free_disk=111.18670272827148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.043 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.043 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.230 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 570998ad-005c-4c0c-b2df-5b23a6c4448d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.231 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.232 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.349 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.363 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.387 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.388 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:16 np0005476733 nova_compute[192580]: 2025-10-08 15:55:16.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.728 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.729 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.729 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.730 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.730 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.732 2 INFO nova.compute.manager [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Terminating instance#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.733 2 DEBUG nova.compute.manager [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:55:17 np0005476733 kernel: tap8a7eab74-6e (unregistering): left promiscuous mode
Oct  8 11:55:17 np0005476733 NetworkManager[51699]: <info>  [1759938917.7667] device (tap8a7eab74-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00676|binding|INFO|Releasing lport 8a7eab74-6ed6-42aa-beab-977d24da645d from this chassis (sb_readonly=0)
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00677|binding|INFO|Setting lport 8a7eab74-6ed6-42aa-beab-977d24da645d down in Southbound
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00678|binding|INFO|Removing iface tap8a7eab74-6e ovn-installed in OVS
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.784 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:33:06 192.168.8.242'], port_security=['fa:16:3e:f7:33:06 192.168.8.242'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.8.242/24', 'neutron:device_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2fece7d-de46-49dc-874d-3e87e96b491f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'b449450f-29a2-4ba2-a56d-c4c1cca923db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07c02bbf-eada-4d49-8b61-f6f456154844, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8a7eab74-6ed6-42aa-beab-977d24da645d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.785 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8a7eab74-6ed6-42aa-beab-977d24da645d in datapath f2fece7d-de46-49dc-874d-3e87e96b491f unbound from our chassis#033[00m
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.788 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2fece7d-de46-49dc-874d-3e87e96b491f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.789 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[293d8fdf-fc91-4d71-b855-56b77be42507]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.791 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f namespace which is not needed anymore#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 kernel: tap1989484f-dc (unregistering): left promiscuous mode
Oct  8 11:55:17 np0005476733 NetworkManager[51699]: <info>  [1759938917.8115] device (tap1989484f-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00679|binding|INFO|Releasing lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 from this chassis (sb_readonly=0)
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00680|binding|INFO|Setting lport 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 down in Southbound
Oct  8 11:55:17 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:17Z|00681|binding|INFO|Removing iface tap1989484f-dc ovn-installed in OVS
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:17.828 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:41:77 10.100.0.11'], port_security=['fa:16:3e:2d:41:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '570998ad-005c-4c0c-b2df-5b23a6c4448d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd58fb802e34e481ea69b20f4fe8df6d2', 'neutron:revision_number': '13', 'neutron:security_group_ids': '82ea289b-c65f-44fe-a172-e9784a3ab9f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.208', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3da71a44-b74e-4032-87c4-3337484b3d54, chassis=[], tunnel_key=10, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:55:17 np0005476733 nova_compute[192580]: 2025-10-08 15:55:17.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:17 np0005476733 podman[245373]: 2025-10-08 15:55:17.892472048 +0000 UTC m=+0.082598391 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:55:17 np0005476733 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000049.scope: Deactivated successfully.
Oct  8 11:55:17 np0005476733 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000049.scope: Consumed 20.769s CPU time.
Oct  8 11:55:17 np0005476733 systemd-machined[152624]: Machine qemu-44-instance-00000049 terminated.
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [NOTICE]   (244984) : haproxy version is 2.8.14-c23fe91
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [NOTICE]   (244984) : path to executable is /usr/sbin/haproxy
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [WARNING]  (244984) : Exiting Master process...
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [WARNING]  (244984) : Exiting Master process...
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [ALERT]    (244984) : Current worker (244986) exited with code 143 (Terminated)
Oct  8 11:55:17 np0005476733 neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f[244980]: [WARNING]  (244984) : All workers exited. Exiting... (0)
Oct  8 11:55:17 np0005476733 systemd[1]: libpod-3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5.scope: Deactivated successfully.
Oct  8 11:55:17 np0005476733 podman[245417]: 2025-10-08 15:55:17.977792514 +0000 UTC m=+0.055218475 container died 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 11:55:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5-userdata-shm.mount: Deactivated successfully.
Oct  8 11:55:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-058df4e41d2b24e5e9eed709cea7ba9c5c1e6518b1f2c6ddada6e2b56cb331cf-merged.mount: Deactivated successfully.
Oct  8 11:55:18 np0005476733 podman[245417]: 2025-10-08 15:55:18.013563807 +0000 UTC m=+0.090989768 container cleanup 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:55:18 np0005476733 systemd[1]: libpod-conmon-3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5.scope: Deactivated successfully.
Oct  8 11:55:18 np0005476733 podman[245446]: 2025-10-08 15:55:18.084306328 +0000 UTC m=+0.046543848 container remove 3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.090 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1179c2f2-8a68-4c5f-99a4-e988576641c7]: (4, ('Wed Oct  8 03:55:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5)\n3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5\nWed Oct  8 03:55:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f (3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5)\n3f832d9ce10584ef050c87c5ba56df427a694600bbed683d2d5a7ff155dd8fe5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.092 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ed43cf-777a-4914-8d23-c8923b5e4e5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.093 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2fece7d-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 kernel: tapf2fece7d-d0: left promiscuous mode
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.118 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[216a9813-8db1-4fd9-b056-bd77b5677042]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.147 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[16e0508c-3c90-451d-aeef-b04f60edfb00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.149 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bae65d63-9ccf-4d95-8994-4bf4b4c119ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.167 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd9362c-2d22-4b74-bd16-882e086b8a6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576842, 'reachable_time': 44050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245471, 'error': None, 'target': 'ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 systemd[1]: run-netns-ovnmeta\x2df2fece7d\x2dde46\x2d49dc\x2d874d\x2d3e87e96b491f.mount: Deactivated successfully.
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.175 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2fece7d-de46-49dc-874d-3e87e96b491f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.176 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[674baa51-fd48-46c3-af77-71efe126b939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.177 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 in datapath 58a69152-b5a6-41d0-85d5-36ab51cfbfb5 unbound from our chassis#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.178 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58a69152-b5a6-41d0-85d5-36ab51cfbfb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:55:18 np0005476733 NetworkManager[51699]: <info>  [1759938918.1791] manager: (tap1989484f-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.179 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c222b1e4-1a02-4c6b-91d7-65047e19bac7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.180 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 namespace which is not needed anymore#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.231 2 INFO nova.virt.libvirt.driver [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Instance destroyed successfully.#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.232 2 DEBUG nova.objects.instance [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lazy-loading 'resources' on Instance uuid 570998ad-005c-4c0c-b2df-5b23a6c4448d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.251 2 DEBUG nova.virt.libvirt.vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-08T15:50:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-908003806',display_name='tempest-test_qos_after_live_migration-908003806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-908003806',id=73,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:50:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1ld0jmzv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:54:32Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=570998ad-005c-4c0c-b2df-5b23a6c4448d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": 
"tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.251 2 DEBUG nova.network.os_vif_util [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "8a7eab74-6ed6-42aa-beab-977d24da645d", "address": "fa:16:3e:f7:33:06", "network": {"id": "f2fece7d-de46-49dc-874d-3e87e96b491f", "bridge": "br-int", "label": "tempest-test-network--2016052797", "subnets": [{"cidr": "192.168.8.0/24", "dns": [], "gateway": {"address": "192.168.8.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.8.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a7eab74-6e", "ovs_interfaceid": "8a7eab74-6ed6-42aa-beab-977d24da645d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.252 2 DEBUG nova.network.os_vif_util [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.252 2 DEBUG os_vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a7eab74-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.264 2 INFO os_vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:33:06,bridge_name='br-int',has_traffic_filtering=True,id=8a7eab74-6ed6-42aa-beab-977d24da645d,network=Network(f2fece7d-de46-49dc-874d-3e87e96b491f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a7eab74-6e')#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.265 2 DEBUG nova.virt.libvirt.vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-08T15:50:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_qos_after_live_migration-908003806',display_name='tempest-test_qos_after_live_migration-908003806',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-qos-after-live-migration-908003806',id=73,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOE9ODXeiXcWLfSeEUOBA+44YLAlme2Z7xFpJGg3DyRt8Fh9/1/bJir+mRXzCseDIo7WzEfWD2jc++qubErHr4N9mM09+Er46yEHP5b5qDMoTx/zNdGYDkRRzIozsGw84g==',key_name='tempest-keypair-test-1203258448',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:50:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d58fb802e34e481ea69b20f4fe8df6d2',ramdisk_id='',reservation_id='r-1ld0jmzv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-1316104462',owner_user_name='tempest-QosTestCommon-1316104462-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:54:32Z,user_data=None,user_id='d4d641ac754b44f89a23c1628056309a',uuid=570998ad-005c-4c0c-b2df-5b23a6c4448d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.265 2 DEBUG nova.network.os_vif_util [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converting VIF {"id": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "address": "fa:16:3e:2d:41:77", "network": {"id": "58a69152-b5a6-41d0-85d5-36ab51cfbfb5", "bridge": "br-int", "label": "tempest-tenant-ctl-network-648960884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d58fb802e34e481ea69b20f4fe8df6d2", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1989484f-dc", "ovs_interfaceid": "1989484f-dc1f-4f5c-94f0-a26c6ea9ece9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.266 2 DEBUG nova.network.os_vif_util [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.266 2 DEBUG os_vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1989484f-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.273 2 INFO os_vif [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:41:77,bridge_name='br-int',has_traffic_filtering=True,id=1989484f-dc1f-4f5c-94f0-a26c6ea9ece9,network=Network(58a69152-b5a6-41d0-85d5-36ab51cfbfb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap1989484f-dc')#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.273 2 INFO nova.virt.libvirt.driver [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Deleting instance files /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d_del#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.274 2 INFO nova.virt.libvirt.driver [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Deletion of /var/lib/nova/instances/570998ad-005c-4c0c-b2df-5b23a6c4448d_del complete#033[00m
Oct  8 11:55:18 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [NOTICE]   (245057) : haproxy version is 2.8.14-c23fe91
Oct  8 11:55:18 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [NOTICE]   (245057) : path to executable is /usr/sbin/haproxy
Oct  8 11:55:18 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [WARNING]  (245057) : Exiting Master process...
Oct  8 11:55:18 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [ALERT]    (245057) : Current worker (245059) exited with code 143 (Terminated)
Oct  8 11:55:18 np0005476733 neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5[245053]: [WARNING]  (245057) : All workers exited. Exiting... (0)
Oct  8 11:55:18 np0005476733 systemd[1]: libpod-feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233.scope: Deactivated successfully.
Oct  8 11:55:18 np0005476733 podman[245515]: 2025-10-08 15:55:18.324678869 +0000 UTC m=+0.045923268 container died feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.337 2 INFO nova.compute.manager [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.338 2 DEBUG oslo.service.loopingcall [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.338 2 DEBUG nova.compute.manager [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.338 2 DEBUG nova.network.neutron [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:55:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233-userdata-shm.mount: Deactivated successfully.
Oct  8 11:55:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-7883be799b68d6a1ea3f8abb9e9a60a60c81acf90c61aa114799645bc2053454-merged.mount: Deactivated successfully.
Oct  8 11:55:18 np0005476733 podman[245515]: 2025-10-08 15:55:18.362912092 +0000 UTC m=+0.084156491 container cleanup feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 11:55:18 np0005476733 systemd[1]: libpod-conmon-feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233.scope: Deactivated successfully.
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.388 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:18 np0005476733 podman[245544]: 2025-10-08 15:55:18.422611359 +0000 UTC m=+0.037235660 container remove feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.429 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0472cee4-6036-4540-8bea-ab2065f2f0d1]: (4, ('Wed Oct  8 03:55:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233)\nfeeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233\nWed Oct  8 03:55:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 (feeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233)\nfeeabbdacb8e5a453aa43b7655724ed9d597db435482ee98e68667766e0ef233\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.431 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[747b6621-69f2-4681-80b0-0af19763c7e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.432 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58a69152-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 kernel: tap58a69152-b0: left promiscuous mode
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.440 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[abf8b62d-f12b-416f-84f6-d51d1e8876c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.481 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d2cb8b73-2972-411d-8804-6eed08015518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.483 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d346e9db-9f94-400f-926b-f46780dc5bad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.503 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5354a5-6a37-42c6-8a28-677ee0f0ed58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577031, 'reachable_time': 27549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245560, 'error': None, 'target': 'ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.505 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58a69152-b5a6-41d0-85d5-36ab51cfbfb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:55:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:18.505 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[db28b278-45bc-453b-809c-e02df2ff0cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.587 2 DEBUG nova.compute.manager [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-unplugged-8a7eab74-6ed6-42aa-beab-977d24da645d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.588 2 DEBUG oslo_concurrency.lockutils [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.588 2 DEBUG oslo_concurrency.lockutils [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.588 2 DEBUG oslo_concurrency.lockutils [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.588 2 DEBUG nova.compute.manager [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] No waiting events found dispatching network-vif-unplugged-8a7eab74-6ed6-42aa-beab-977d24da645d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:55:18 np0005476733 nova_compute[192580]: 2025-10-08 15:55:18.589 2 DEBUG nova.compute.manager [req-3eadd6d3-e519-4fd1-bd0b-47a96f6a7d2f req-17447c38-ab5a-415f-a016-3f0bca9761e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-unplugged-8a7eab74-6ed6-42aa-beab-977d24da645d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:55:19 np0005476733 systemd[1]: run-netns-ovnmeta\x2d58a69152\x2db5a6\x2d41d0\x2d85d5\x2d36ab51cfbfb5.mount: Deactivated successfully.
Oct  8 11:55:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:20Z|00682|pinctrl|WARN|Dropped 1363 log messages in last 59 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:55:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:55:20Z|00683|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:55:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:20.620 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.700 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-plugged-8a7eab74-6ed6-42aa-beab-977d24da645d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.700 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.702 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.702 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.703 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] No waiting events found dispatching network-vif-plugged-8a7eab74-6ed6-42aa-beab-977d24da645d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.703 2 WARNING nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received unexpected event network-vif-plugged-8a7eab74-6ed6-42aa-beab-977d24da645d for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.703 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-unplugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.704 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.704 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.704 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.704 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] No waiting events found dispatching network-vif-unplugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.705 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-unplugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.705 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-plugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.705 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.705 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.706 2 DEBUG oslo_concurrency.lockutils [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.706 2 DEBUG nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] No waiting events found dispatching network-vif-plugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:55:20 np0005476733 nova_compute[192580]: 2025-10-08 15:55:20.706 2 WARNING nova.compute.manager [req-8f5832b6-712a-4b15-a455-2abbbe630618 req-bfcd4ef3-f9d6-4cb1-905a-f034275c9434 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received unexpected event network-vif-plugged-1989484f-dc1f-4f5c-94f0-a26c6ea9ece9 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.455 2 DEBUG nova.network.neutron [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.494 2 INFO nova.compute.manager [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Took 3.16 seconds to deallocate network for instance.#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.534 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.535 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.543 2 DEBUG nova.compute.manager [req-99a64ab3-ff80-432f-8784-347ea3ee2f6d req-e9b78501-f454-40a7-8e49-38df43e571f9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Received event network-vif-deleted-8a7eab74-6ed6-42aa-beab-977d24da645d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.597 2 DEBUG nova.compute.provider_tree [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.616 2 DEBUG nova.scheduler.client.report [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.649 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.684 2 INFO nova.scheduler.client.report [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Deleted allocations for instance 570998ad-005c-4c0c-b2df-5b23a6c4448d#033[00m
Oct  8 11:55:21 np0005476733 nova_compute[192580]: 2025-10-08 15:55:21.760 2 DEBUG oslo_concurrency.lockutils [None req-bf57e107-89af-4f67-b858-1b33600742d0 d4d641ac754b44f89a23c1628056309a d58fb802e34e481ea69b20f4fe8df6d2 - - default default] Lock "570998ad-005c-4c0c-b2df-5b23a6c4448d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:23 np0005476733 nova_compute[192580]: 2025-10-08 15:55:23.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:24 np0005476733 podman[245562]: 2025-10-08 15:55:24.283551906 +0000 UTC m=+0.092409284 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:55:24 np0005476733 podman[245561]: 2025-10-08 15:55:24.330202896 +0000 UTC m=+0.142485064 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller)
Oct  8 11:55:25 np0005476733 nova_compute[192580]: 2025-10-08 15:55:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:25 np0005476733 nova_compute[192580]: 2025-10-08 15:55:25.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 11:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:26.345 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:55:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:55:26 np0005476733 nova_compute[192580]: 2025-10-08 15:55:26.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:27 np0005476733 nova_compute[192580]: 2025-10-08 15:55:27.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:28 np0005476733 nova_compute[192580]: 2025-10-08 15:55:28.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:28 np0005476733 nova_compute[192580]: 2025-10-08 15:55:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:55:31 np0005476733 nova_compute[192580]: 2025-10-08 15:55:31.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:32 np0005476733 podman[245604]: 2025-10-08 15:55:32.216265372 +0000 UTC m=+0.046058272 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:55:32 np0005476733 podman[245603]: 2025-10-08 15:55:32.219537457 +0000 UTC m=+0.051796885 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 11:55:32 np0005476733 podman[245605]: 2025-10-08 15:55:32.23966118 +0000 UTC m=+0.063926873 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 11:55:33 np0005476733 nova_compute[192580]: 2025-10-08 15:55:33.230 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759938918.2290025, 570998ad-005c-4c0c-b2df-5b23a6c4448d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:55:33 np0005476733 nova_compute[192580]: 2025-10-08 15:55:33.231 2 INFO nova.compute.manager [-] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] VM Stopped (Lifecycle Event)#033[00m
Oct  8 11:55:33 np0005476733 nova_compute[192580]: 2025-10-08 15:55:33.250 2 DEBUG nova.compute.manager [None req-9a22f423-1073-49ae-9c8f-192ab3823f47 - - - - - -] [instance: 570998ad-005c-4c0c-b2df-5b23a6c4448d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:55:33 np0005476733 nova_compute[192580]: 2025-10-08 15:55:33.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:36 np0005476733 nova_compute[192580]: 2025-10-08 15:55:36.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:38 np0005476733 nova_compute[192580]: 2025-10-08 15:55:38.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:39 np0005476733 podman[245668]: 2025-10-08 15:55:39.222612346 +0000 UTC m=+0.048688927 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:55:39 np0005476733 podman[245667]: 2025-10-08 15:55:39.233153643 +0000 UTC m=+0.060774372 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 11:55:41 np0005476733 nova_compute[192580]: 2025-10-08 15:55:41.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:43 np0005476733 nova_compute[192580]: 2025-10-08 15:55:43.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:46 np0005476733 nova_compute[192580]: 2025-10-08 15:55:46.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:48 np0005476733 podman[245711]: 2025-10-08 15:55:48.226852594 +0000 UTC m=+0.048287083 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 11:55:48 np0005476733 nova_compute[192580]: 2025-10-08 15:55:48.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:51 np0005476733 nova_compute[192580]: 2025-10-08 15:55:51.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:53 np0005476733 nova_compute[192580]: 2025-10-08 15:55:53.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:55 np0005476733 podman[245732]: 2025-10-08 15:55:55.260357414 +0000 UTC m=+0.075807933 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:55:55 np0005476733 podman[245731]: 2025-10-08 15:55:55.274949781 +0000 UTC m=+0.103210379 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 11:55:56 np0005476733 nova_compute[192580]: 2025-10-08 15:55:56.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:55:58 np0005476733 nova_compute[192580]: 2025-10-08 15:55:58.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:00 np0005476733 nova_compute[192580]: 2025-10-08 15:56:00.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:00 np0005476733 nova_compute[192580]: 2025-10-08 15:56:00.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:00 np0005476733 nova_compute[192580]: 2025-10-08 15:56:00.601 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 11:56:00 np0005476733 nova_compute[192580]: 2025-10-08 15:56:00.619 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 11:56:01 np0005476733 nova_compute[192580]: 2025-10-08 15:56:01.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:03 np0005476733 podman[245779]: 2025-10-08 15:56:03.240888158 +0000 UTC m=+0.067486157 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:56:03 np0005476733 podman[245780]: 2025-10-08 15:56:03.242545532 +0000 UTC m=+0.065821265 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:56:03 np0005476733 podman[245781]: 2025-10-08 15:56:03.246410144 +0000 UTC m=+0.070940766 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, architecture=x86_64)
Oct  8 11:56:03 np0005476733 nova_compute[192580]: 2025-10-08 15:56:03.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:04 np0005476733 nova_compute[192580]: 2025-10-08 15:56:04.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:06 np0005476733 nova_compute[192580]: 2025-10-08 15:56:06.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:08 np0005476733 nova_compute[192580]: 2025-10-08 15:56:08.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:10 np0005476733 podman[245840]: 2025-10-08 15:56:10.241492818 +0000 UTC m=+0.061437344 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:56:10 np0005476733 podman[245841]: 2025-10-08 15:56:10.241810578 +0000 UTC m=+0.056449645 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:56:11 np0005476733 nova_compute[192580]: 2025-10-08 15:56:11.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:11 np0005476733 nova_compute[192580]: 2025-10-08 15:56:11.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:11 np0005476733 nova_compute[192580]: 2025-10-08 15:56:11.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:56:11 np0005476733 nova_compute[192580]: 2025-10-08 15:56:11.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:56:11 np0005476733 nova_compute[192580]: 2025-10-08 15:56:11.643 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:56:12 np0005476733 nova_compute[192580]: 2025-10-08 15:56:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:12 np0005476733 nova_compute[192580]: 2025-10-08 15:56:12.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:13 np0005476733 nova_compute[192580]: 2025-10-08 15:56:13.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:13 np0005476733 nova_compute[192580]: 2025-10-08 15:56:13.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:13 np0005476733 nova_compute[192580]: 2025-10-08 15:56:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:13 np0005476733 nova_compute[192580]: 2025-10-08 15:56:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:56:14 np0005476733 nova_compute[192580]: 2025-10-08 15:56:14.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:16.238 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:56:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:16.240 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.889 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.889 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.890 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:56:16 np0005476733 nova_compute[192580]: 2025-10-08 15:56:16.890 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.050 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.052 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13800MB free_disk=111.33283615112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.053 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.053 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.539 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.540 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.555 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.583 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.583 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.599 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.626 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 11:56:17 np0005476733 nova_compute[192580]: 2025-10-08 15:56:17.652 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:56:18 np0005476733 nova_compute[192580]: 2025-10-08 15:56:18.041 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:56:18 np0005476733 nova_compute[192580]: 2025-10-08 15:56:18.314 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:56:18 np0005476733 nova_compute[192580]: 2025-10-08 15:56:18.315 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:56:18 np0005476733 nova_compute[192580]: 2025-10-08 15:56:18.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:19 np0005476733 podman[245884]: 2025-10-08 15:56:19.227417242 +0000 UTC m=+0.057275682 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:56:20 np0005476733 nova_compute[192580]: 2025-10-08 15:56:20.316 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:56:20Z|00684|pinctrl|WARN|Dropped 2485 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 11:56:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:56:20Z|00685|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:56:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:21.243 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:56:21 np0005476733 nova_compute[192580]: 2025-10-08 15:56:21.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:23 np0005476733 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  8 11:56:23 np0005476733 nova_compute[192580]: 2025-10-08 15:56:23.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:26 np0005476733 podman[245905]: 2025-10-08 15:56:26.255770736 +0000 UTC m=+0.082953692 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 11:56:26 np0005476733 podman[245904]: 2025-10-08 15:56:26.270023591 +0000 UTC m=+0.098876589 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 11:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:26.347 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:56:26 np0005476733 nova_compute[192580]: 2025-10-08 15:56:26.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:26 np0005476733 nova_compute[192580]: 2025-10-08 15:56:26.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:28 np0005476733 nova_compute[192580]: 2025-10-08 15:56:28.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:28 np0005476733 nova_compute[192580]: 2025-10-08 15:56:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:56:31 np0005476733 nova_compute[192580]: 2025-10-08 15:56:31.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:33 np0005476733 nova_compute[192580]: 2025-10-08 15:56:33.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:34 np0005476733 podman[245948]: 2025-10-08 15:56:34.228397058 +0000 UTC m=+0.051684102 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:56:34 np0005476733 podman[245949]: 2025-10-08 15:56:34.236647962 +0000 UTC m=+0.058773219 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Oct  8 11:56:34 np0005476733 podman[245947]: 2025-10-08 15:56:34.271274568 +0000 UTC m=+0.092480366 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:56:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 11:56:36 np0005476733 nova_compute[192580]: 2025-10-08 15:56:36.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:38 np0005476733 nova_compute[192580]: 2025-10-08 15:56:38.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:41 np0005476733 podman[246012]: 2025-10-08 15:56:41.237802308 +0000 UTC m=+0.057316923 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 11:56:41 np0005476733 podman[246011]: 2025-10-08 15:56:41.266460834 +0000 UTC m=+0.088522251 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 11:56:41 np0005476733 nova_compute[192580]: 2025-10-08 15:56:41.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:43 np0005476733 nova_compute[192580]: 2025-10-08 15:56:43.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:46 np0005476733 nova_compute[192580]: 2025-10-08 15:56:46.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:48 np0005476733 nova_compute[192580]: 2025-10-08 15:56:48.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:50 np0005476733 podman[246055]: 2025-10-08 15:56:50.217226593 +0000 UTC m=+0.050475283 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 11:56:51 np0005476733 nova_compute[192580]: 2025-10-08 15:56:51.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:53 np0005476733 nova_compute[192580]: 2025-10-08 15:56:53.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:55.380 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:56:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:56:55.380 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:56:55 np0005476733 nova_compute[192580]: 2025-10-08 15:56:55.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:56 np0005476733 nova_compute[192580]: 2025-10-08 15:56:56.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:56:57 np0005476733 podman[246075]: 2025-10-08 15:56:57.28101654 +0000 UTC m=+0.094652585 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:56:57 np0005476733 podman[246074]: 2025-10-08 15:56:57.298755828 +0000 UTC m=+0.122373871 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 11:56:58 np0005476733 nova_compute[192580]: 2025-10-08 15:56:58.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:57:01.383 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:57:01 np0005476733 nova_compute[192580]: 2025-10-08 15:57:01.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:01 np0005476733 systemd-logind[827]: New session 49 of user zuul.
Oct  8 11:57:01 np0005476733 systemd[1]: Started Session 49 of User zuul.
Oct  8 11:57:02 np0005476733 systemd[1]: session-49.scope: Deactivated successfully.
Oct  8 11:57:02 np0005476733 systemd-logind[827]: Session 49 logged out. Waiting for processes to exit.
Oct  8 11:57:02 np0005476733 systemd-logind[827]: Removed session 49.
Oct  8 11:57:02 np0005476733 nova_compute[192580]: 2025-10-08 15:57:02.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:03 np0005476733 nova_compute[192580]: 2025-10-08 15:57:03.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:05 np0005476733 podman[246151]: 2025-10-08 15:57:05.229398487 +0000 UTC m=+0.054060698 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:57:05 np0005476733 podman[246149]: 2025-10-08 15:57:05.234976366 +0000 UTC m=+0.065624017 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:57:05 np0005476733 podman[246150]: 2025-10-08 15:57:05.247032991 +0000 UTC m=+0.067375063 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 11:57:05 np0005476733 nova_compute[192580]: 2025-10-08 15:57:05.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:06 np0005476733 nova_compute[192580]: 2025-10-08 15:57:06.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:08 np0005476733 nova_compute[192580]: 2025-10-08 15:57:08.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:11 np0005476733 nova_compute[192580]: 2025-10-08 15:57:11.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:12 np0005476733 podman[246211]: 2025-10-08 15:57:12.259035085 +0000 UTC m=+0.088056695 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS)
Oct  8 11:57:12 np0005476733 podman[246212]: 2025-10-08 15:57:12.282608168 +0000 UTC m=+0.099856042 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:57:12 np0005476733 nova_compute[192580]: 2025-10-08 15:57:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:12 np0005476733 nova_compute[192580]: 2025-10-08 15:57:12.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:57:12 np0005476733 nova_compute[192580]: 2025-10-08 15:57:12.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:57:12 np0005476733 nova_compute[192580]: 2025-10-08 15:57:12.619 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:57:12 np0005476733 nova_compute[192580]: 2025-10-08 15:57:12.620 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
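The nova-compute lines above show oslo.service firing registered periodic tasks (`_heal_instance_info_cache`, `_poll_volume_usage`, and so on) on their configured intervals. A minimal pure-Python sketch of that decorator-plus-scheduler pattern (a stand-in, not oslo.service's actual implementation; the task bodies and spacing values are illustrative):

```python
# Registry of registered tasks, mimicking how oslo.service's
# @periodic_task decorator collects methods for run_periodic_tasks().
_TASKS = []

def periodic_task(spacing):
    """Register fn to run every `spacing` seconds (hypothetical stand-in)."""
    def wrap(fn):
        _TASKS.append({"fn": fn, "spacing": spacing, "last": 0.0})
        return fn
    return wrap

@periodic_task(spacing=60)
def _heal_instance_info_cache():
    return "healed"

@periodic_task(spacing=10)
def _poll_volume_usage():
    return "polled"

def run_due_tasks(now):
    """Run every task whose interval has elapsed; return their results."""
    results = []
    for t in _TASKS:
        if now - t["last"] >= t["spacing"]:
            t["last"] = now
            results.append(t["fn"]())
    return results

print(run_due_tasks(now=100.0))  # both tasks are due on the first pass
```

The real service loop sleeps between passes and logs each run, which is what produces the "Running periodic task ComputeManager.…" DEBUG lines seen here.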
Oct  8 11:57:13 np0005476733 nova_compute[192580]: 2025-10-08 15:57:13.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:13 np0005476733 ovn_controller[94857]: 2025-10-08T15:57:13Z|00686|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 11:57:14 np0005476733 nova_compute[192580]: 2025-10-08 15:57:14.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:15 np0005476733 nova_compute[192580]: 2025-10-08 15:57:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:15 np0005476733 nova_compute[192580]: 2025-10-08 15:57:15.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
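The oslo.concurrency lines above record how long the "compute_resources" lock was waited on and held. A small sketch of that acquire/held accounting around a plain `threading.Lock` (the message format is only loosely modeled on oslo.concurrency's lockutils; this is not its implementation):

```python
import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock, name):
    """Acquire `lock`, reporting wait and hold times in a style approximating
    the lockutils DEBUG lines (hypothetical helper, not oslo.concurrency)."""
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        held = time.monotonic() - t1
        lock.release()
        print(f'Lock "{name}" "released" :: held {held:.3f}s')

compute_resources = threading.Lock()
with timed_lock(compute_resources, "compute_resources"):
    pass  # critical section, e.g. the resource tracker's cache cleanup
```

Near-zero waited/held values like the 0.000s and 0.001s above simply mean the lock was uncontended and the critical section was short.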
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.773 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.774 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13807MB free_disk=111.33185958862305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.774 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.774 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.832 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.832 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.852 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.865 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
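The inventory reported above carries totals, reserved amounts, and allocation ratios per resource class; the Placement service derives schedulable capacity as (total - reserved) * allocation_ratio. With the figures logged here that works out to 32 VCPU, 15219 MB of RAM, and roughly 105.3 GB of disk:

```python
def capacity(total, reserved, allocation_ratio):
    """Effective schedulable capacity: (total - reserved) * allocation_ratio."""
    return (total - reserved) * allocation_ratio

# Figures copied from the inventory logged for provider 94652b61-be28-...
inventory = {
    "VCPU": dict(total=8, reserved=0, allocation_ratio=4.0),
    "MEMORY_MB": dict(total=15731, reserved=512, allocation_ratio=1.0),
    "DISK_GB": dict(total=119, reserved=2, allocation_ratio=0.9),
}
for rc, inv in inventory.items():
    print(rc, capacity(**inv))
# effective VCPU = 32.0, MEMORY_MB = 15219.0, DISK_GB ≈ 105.3
```

The 4.0 VCPU ratio explains why an 8-vCPU host can accept 32 vCPUs of instances, while the 0.9 disk ratio leaves headroom below the physical 119 GB.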
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.867 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:57:16 np0005476733 nova_compute[192580]: 2025-10-08 15:57:16.867 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:57:18 np0005476733 nova_compute[192580]: 2025-10-08 15:57:18.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:18 np0005476733 nova_compute[192580]: 2025-10-08 15:57:18.867 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:21 np0005476733 podman[246254]: 2025-10-08 15:57:21.218550405 +0000 UTC m=+0.047570301 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:57:21 np0005476733 nova_compute[192580]: 2025-10-08 15:57:21.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:57:21Z|00687|pinctrl|WARN|Dropped 667 log messages in last 61 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 11:57:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:57:21Z|00688|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
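The pinctrl lines above show OVN's built-in log rate limiting: repeats of the IGMP warning are suppressed, and a summary ("Dropped 667 log messages in last 61 seconds…") is emitted when logging resumes. A rough pure-Python sketch of that observable behaviour (not OVS/OVN's actual VLOG rate-limiter algorithm; `burst` and `window` are illustrative):

```python
class RateLimitedLogger:
    """Allow at most `burst` messages per `window` seconds; count the rest
    and report the drop total when a new window opens (behavioural sketch)."""

    def __init__(self, burst, window):
        self.burst, self.window = burst, window
        self.start, self.sent, self.dropped = 0.0, 0, 0
        self.lines = []  # stands in for the log sink

    def log(self, now, msg):
        if now - self.start >= self.window:  # a new window opens
            if self.dropped:
                self.lines.append(
                    f"Dropped {self.dropped} log messages in last "
                    f"{int(now - self.start)} seconds due to excessive rate")
            self.start, self.sent, self.dropped = now, 0, 0
        if self.sent < self.burst:
            self.sent += 1
            self.lines.append(msg)
        else:
            self.dropped += 1

log = RateLimitedLogger(burst=2, window=60)
msg = "IGMP Querier enabled without a valid IPv4 or IPv6 address"
for t in range(10):          # ten rapid repeats: two pass, eight are dropped
    log.log(float(t), msg)
log.log(61.0, msg)           # next window: the drop summary is flushed first
print(log.lines)
```

The warning itself indicates the IGMP querier option is enabled on a logical switch without a source address configured for it; the rate limiter only controls how often that is reported.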
Oct  8 11:57:23 np0005476733 nova_compute[192580]: 2025-10-08 15:57:23.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:57:26.346 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:57:26.347 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:57:26.347 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:57:26 np0005476733 nova_compute[192580]: 2025-10-08 15:57:26.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:28 np0005476733 podman[246275]: 2025-10-08 15:57:28.230553807 +0000 UTC m=+0.056119934 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 11:57:28 np0005476733 podman[246274]: 2025-10-08 15:57:28.280917967 +0000 UTC m=+0.109187061 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 11:57:28 np0005476733 nova_compute[192580]: 2025-10-08 15:57:28.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:28 np0005476733 nova_compute[192580]: 2025-10-08 15:57:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:57:31 np0005476733 nova_compute[192580]: 2025-10-08 15:57:31.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:33 np0005476733 nova_compute[192580]: 2025-10-08 15:57:33.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
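The recurring "[POLLIN] on fd 27 __log_wakeup" DEBUG lines come from the OVS IDL's poll loop waking whenever the OVSDB connection becomes readable. A self-contained illustration of that mechanism using `select.poll` on a pipe (the pipe stands in for the OVSDB socket; this is not ovsdbapp code):

```python
import os
import select

# Register a readable fd and wake when data arrives, as a poll loop does.
r, w = os.pipe()
poller = select.poll()
poller.register(r, select.POLLIN)

os.write(w, b"ovsdb update")   # simulate the monitored socket becoming readable
for fd, mask in poller.poll(1000):   # returns [(fd, eventmask)]
    if mask & select.POLLIN:
        print(f"[POLLIN] on fd {fd}:", os.read(fd, 64))
os.close(r)
os.close(w)
```

At DEBUG verbosity every such wakeup is logged, which is why these lines repeat every few seconds without indicating any problem.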
Oct  8 11:57:36 np0005476733 podman[246318]: 2025-10-08 15:57:36.233036694 +0000 UTC m=+0.061610150 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:57:36 np0005476733 podman[246319]: 2025-10-08 15:57:36.232664972 +0000 UTC m=+0.057682474 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 11:57:36 np0005476733 podman[246320]: 2025-10-08 15:57:36.263252839 +0000 UTC m=+0.085991049 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 11:57:36 np0005476733 nova_compute[192580]: 2025-10-08 15:57:36.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:38 np0005476733 nova_compute[192580]: 2025-10-08 15:57:38.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:41 np0005476733 nova_compute[192580]: 2025-10-08 15:57:41.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:43 np0005476733 podman[246383]: 2025-10-08 15:57:43.240215102 +0000 UTC m=+0.061504536 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:57:43 np0005476733 podman[246382]: 2025-10-08 15:57:43.272114781 +0000 UTC m=+0.098718655 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:57:43 np0005476733 nova_compute[192580]: 2025-10-08 15:57:43.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:46 np0005476733 nova_compute[192580]: 2025-10-08 15:57:46.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:48 np0005476733 nova_compute[192580]: 2025-10-08 15:57:48.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:51 np0005476733 nova_compute[192580]: 2025-10-08 15:57:51.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:52 np0005476733 podman[246426]: 2025-10-08 15:57:52.238394598 +0000 UTC m=+0.064110270 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 11:57:53 np0005476733 nova_compute[192580]: 2025-10-08 15:57:53.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:56 np0005476733 nova_compute[192580]: 2025-10-08 15:57:56.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:58 np0005476733 nova_compute[192580]: 2025-10-08 15:57:58.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:57:59 np0005476733 podman[246446]: 2025-10-08 15:57:59.223965437 +0000 UTC m=+0.050908667 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 11:57:59 np0005476733 podman[246445]: 2025-10-08 15:57:59.253306664 +0000 UTC m=+0.081928548 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:58:00 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:00Z|00689|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 11:58:01 np0005476733 nova_compute[192580]: 2025-10-08 15:58:01.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:02 np0005476733 nova_compute[192580]: 2025-10-08 15:58:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:03 np0005476733 nova_compute[192580]: 2025-10-08 15:58:03.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:06 np0005476733 nova_compute[192580]: 2025-10-08 15:58:06.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:07 np0005476733 podman[246493]: 2025-10-08 15:58:07.22726586 +0000 UTC m=+0.059492732 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 11:58:07 np0005476733 podman[246494]: 2025-10-08 15:58:07.22727245 +0000 UTC m=+0.049243765 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 11:58:07 np0005476733 podman[246495]: 2025-10-08 15:58:07.235324957 +0000 UTC m=+0.059206532 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Oct  8 11:58:07 np0005476733 nova_compute[192580]: 2025-10-08 15:58:07.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:08 np0005476733 nova_compute[192580]: 2025-10-08 15:58:08.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:11 np0005476733 nova_compute[192580]: 2025-10-08 15:58:11.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:12 np0005476733 nova_compute[192580]: 2025-10-08 15:58:12.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:12 np0005476733 nova_compute[192580]: 2025-10-08 15:58:12.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:58:12 np0005476733 nova_compute[192580]: 2025-10-08 15:58:12.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:58:12 np0005476733 nova_compute[192580]: 2025-10-08 15:58:12.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 11:58:13 np0005476733 nova_compute[192580]: 2025-10-08 15:58:13.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:14 np0005476733 podman[246556]: 2025-10-08 15:58:14.234567225 +0000 UTC m=+0.060100972 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 11:58:14 np0005476733 podman[246557]: 2025-10-08 15:58:14.243442638 +0000 UTC m=+0.059812202 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:58:14 np0005476733 nova_compute[192580]: 2025-10-08 15:58:14.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.707 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.707 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.726 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.813 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.814 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.822 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.823 2 INFO nova.compute.claims [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.953 2 DEBUG nova.compute.provider_tree [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:58:15 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.971 2 DEBUG nova.scheduler.client.report [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:15.999 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.000 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.057 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.058 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.083 2 INFO nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.103 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.214 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.216 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.216 2 INFO nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Creating image(s)#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.216 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.217 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.217 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.229 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.301 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.302 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.303 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.314 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.369 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.370 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.412 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk 10737418240" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.413 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.414 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.469 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.470 2 DEBUG nova.objects.instance [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'migration_context' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.498 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.499 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Ensure instance console log exists: /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.499 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.500 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.500 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:16 np0005476733 nova_compute[192580]: 2025-10-08 15:58:16.749 2 DEBUG nova.policy [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.608 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.609 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.762 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.763 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13797MB free_disk=111.33164978027344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.763 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.763 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.827 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 9a8d7e02-6802-40ae-8c7a-6f40179085f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.828 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.828 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.870 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.885 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.905 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:58:17 np0005476733 nova_compute[192580]: 2025-10-08 15:58:17.906 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:18 np0005476733 nova_compute[192580]: 2025-10-08 15:58:18.149 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Successfully created port: ec4d6d14-21c2-4a40-a5d6-e96b314661ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 11:58:18 np0005476733 nova_compute[192580]: 2025-10-08 15:58:18.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:18.929 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:58:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:18.930 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:58:18 np0005476733 nova_compute[192580]: 2025-10-08 15:58:18.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.714 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Successfully updated port: ec4d6d14-21c2-4a40-a5d6-e96b314661ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.849 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.849 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.850 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.861 2 DEBUG nova.compute.manager [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.861 2 DEBUG nova.compute.manager [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing instance network info cache due to event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:58:19 np0005476733 nova_compute[192580]: 2025-10-08 15:58:19.862 2 DEBUG oslo_concurrency.lockutils [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:58:20 np0005476733 nova_compute[192580]: 2025-10-08 15:58:20.004 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 11:58:20 np0005476733 nova_compute[192580]: 2025-10-08 15:58:20.906 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.057 2 DEBUG nova.network.neutron [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.074 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.075 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Instance network_info: |[{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.075 2 DEBUG oslo_concurrency.lockutils [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.075 2 DEBUG nova.network.neutron [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.077 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Start _get_guest_xml network_info=[{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.082 2 WARNING nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.090 2 DEBUG nova.virt.libvirt.host [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.090 2 DEBUG nova.virt.libvirt.host [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.096 2 DEBUG nova.virt.libvirt.host [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.096 2 DEBUG nova.virt.libvirt.host [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.097 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.097 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.098 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.098 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.098 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.098 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.099 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.099 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.099 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.099 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.100 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.100 2 DEBUG nova.virt.hardware [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
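[Editor's note] The records above show nova.virt.hardware enumerating CPU topologies for a 1-vCPU flavor and landing on sockets=1, cores=1, threads=1. A minimal standalone sketch of that enumeration step — illustrative only, not nova's actual `_get_possible_cpu_topologies` code — is:

```python
# Illustrative sketch (NOT nova's implementation) of CPU topology
# enumeration: list every (sockets, cores, threads) triple whose
# product equals the vCPU count, honoring per-dimension maxima like
# the 65536 limits seen in the log. For vcpus=1 this yields exactly
# one topology, (1, 1, 1), matching "Got 1 possible topologies".
from itertools import product


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Return all (sockets, cores, threads) factorizations of vcpus."""
    divisors = [d for d in range(1, vcpus + 1) if vcpus % d == 0]
    topologies = []
    for sockets, cores in product(divisors, divisors):
        if vcpus % (sockets * cores) != 0:
            continue  # sockets*cores must itself divide the vCPU count
        threads = vcpus // (sockets * cores)
        if sockets <= max_sockets and cores <= max_cores and threads <= max_threads:
            topologies.append((sockets, cores, threads))
    return topologies
```

For `possible_topologies(1)` the only result is `(1, 1, 1)`, which is the topology emitted into the `<cpu>` element of the guest XML below.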
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.108 2 DEBUG nova.virt.libvirt.vif [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10'
,image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:58:16Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.109 2 DEBUG nova.network.os_vif_util [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.110 2 DEBUG nova.network.os_vif_util [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.112 2 DEBUG nova.objects.instance [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'pci_devices' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.127 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] End _get_guest_xml xml=<domain type="kvm">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <uuid>9a8d7e02-6802-40ae-8c7a-6f40179085f0</uuid>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <name>instance-0000004b</name>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1797033195</nova:name>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 15:58:21</nova:creationTime>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:user uuid="bf4219ece8f54f268b2ece84f150d555">tempest-QosTestOvn-1026583770-project-member</nova:user>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:project uuid="13cdd2bb6c7648f5ab8709ff695b5cda">tempest-QosTestOvn-1026583770</nova:project>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        <nova:port uuid="ec4d6d14-21c2-4a40-a5d6-e96b314661ca">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.10.1.159" ipVersion="4"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <system>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="serial">9a8d7e02-6802-40ae-8c7a-6f40179085f0</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="uuid">9a8d7e02-6802-40ae-8c7a-6f40179085f0</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </system>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <os>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </os>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <features>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </features>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </clock>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  <devices>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.config"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </disk>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:e0:64:d7"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <target dev="tapec4d6d14-21"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </interface>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/console.log" append="off"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </serial>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <video>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </video>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </rng>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 11:58:21 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 11:58:21 np0005476733 nova_compute[192580]:  </devices>
Oct  8 11:58:21 np0005476733 nova_compute[192580]: </domain>
Oct  8 11:58:21 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.128 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Preparing to wait for external event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.129 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.129 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.129 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
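[Editor's note] When reviewing "End _get_guest_xml" dumps like the domain definition above, a stdlib-only helper can pull out the fields that usually matter (UUID, memory, vCPUs, disk paths, tap devices). This is a throwaway sketch, not part of nova or libvirt; the embedded XML is a trimmed copy of the logged domain:

```python
# Stdlib-only sketch: summarize a libvirt domain XML as logged by
# nova's _get_guest_xml. DOMAIN_XML below is a TRIMMED copy of the
# domain from the log (most <devices> children omitted for brevity).
import xml.etree.ElementTree as ET

DOMAIN_XML = """<domain type="kvm">
  <uuid>9a8d7e02-6802-40ae-8c7a-6f40179085f0</uuid>
  <name>instance-0000004b</name>
  <memory>1048576</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="file" device="disk">
      <driver name="qemu" type="qcow2" cache="none"/>
      <source file="/var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk"/>
      <target dev="vda" bus="virtio"/>
    </disk>
    <interface type="ethernet">
      <mac address="fa:16:3e:e0:64:d7"/>
      <target dev="tapec4d6d14-21"/>
    </interface>
  </devices>
</domain>"""


def summarize_domain(xml_text):
    """Return a dict of the most commonly needed domain facts."""
    root = ET.fromstring(xml_text)
    return {
        "uuid": root.findtext("uuid"),
        "memory_kib": int(root.findtext("memory")),  # libvirt default unit is KiB
        "vcpus": int(root.findtext("vcpu")),
        "disks": [d.find("source").get("file")
                  for d in root.findall("./devices/disk")
                  if d.find("source") is not None],
        "taps": [i.find("target").get("dev")
                 for i in root.findall("./devices/interface")
                 if i.find("target") is not None],
    }
```

Here `summarize_domain(DOMAIN_XML)["taps"]` returns `["tapec4d6d14-21"]`, the same tap device the os-vif plug operation below attaches to br-int.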
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.130 2 DEBUG nova.virt.libvirt.vif [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min
_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T15:58:16Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.131 2 DEBUG nova.network.os_vif_util [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.131 2 DEBUG nova.network.os_vif_util [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.132 2 DEBUG os_vif [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.133 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.134 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.137 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec4d6d14-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.137 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec4d6d14-21, col_values=(('external_ids', {'iface-id': 'ec4d6d14-21c2-4a40-a5d6-e96b314661ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:64:d7', 'vm-uuid': '9a8d7e02-6802-40ae-8c7a-6f40179085f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 11:58:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:21Z|00690|pinctrl|WARN|Dropped 561 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 11:58:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:21Z|00691|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.1400] manager: (tapec4d6d14-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.147 2 INFO os_vif [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21')#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.200 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.200 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.201 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No VIF found with MAC fa:16:3e:e0:64:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.201 2 INFO nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Using config drive#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.712 2 INFO nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Creating config drive at /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.config#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.717 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy94a1zvz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.846 2 DEBUG oslo_concurrency.processutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy94a1zvz" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:58:21 np0005476733 kernel: tapec4d6d14-21: entered promiscuous mode
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:21Z|00692|binding|INFO|Claiming lport ec4d6d14-21c2-4a40-a5d6-e96b314661ca for this chassis.
Oct  8 11:58:21 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:21Z|00693|binding|INFO|ec4d6d14-21c2-4a40-a5d6-e96b314661ca: Claiming fa:16:3e:e0:64:d7 10.10.1.159
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.9429] manager: (tapec4d6d14-21): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 nova_compute[192580]: 2025-10-08 15:58:21.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.9598] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.9605] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Oct  8 11:58:21 np0005476733 systemd-udevd[246629]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.974 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:64:d7 10.10.1.159'], port_security=['fa:16:3e:e0:64:d7 10.10.1.159'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.1.159/24', 'neutron:device_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2445930c-6263-4f08-8a84-8d4597739544', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=704dd4df-5dd3-4687-bb14-6b18adad356e, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=ec4d6d14-21c2-4a40-a5d6-e96b314661ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.977 103739 INFO neutron.agent.ovn.metadata.agent [-] Port ec4d6d14-21c2-4a40-a5d6-e96b314661ca in datapath 4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 bound to our chassis#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.979 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2#033[00m
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.9832] device (tapec4d6d14-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:58:21 np0005476733 NetworkManager[51699]: <info>  [1759939101.9838] device (tapec4d6d14-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:58:21 np0005476733 systemd-machined[152624]: New machine qemu-45-instance-0000004b.
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.992 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7f8af7-6655-48e1-95af-ebe07582b656]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.994 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4ecfeba1-f1 in ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.996 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4ecfeba1-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.996 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8fff77a9-bd66-4ba2-8464-321ea5557a7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:21.997 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bb97cffc-3241-4dd5-b09a-bea2c4e46f92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.011 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb73fdd-a8f5-4b52-a8a2-26140b2987de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 systemd[1]: Started Virtual Machine qemu-45-instance-0000004b.
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.035 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc45378-4825-4a43-a6ac-78f370118b3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.080 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3ec589-3351-4c21-9fda-2e7c45a633ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:22Z|00694|binding|INFO|Setting lport ec4d6d14-21c2-4a40-a5d6-e96b314661ca up in Southbound
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.129 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cc39f84d-5f80-4252-a250-44fd70886bc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 NetworkManager[51699]: <info>  [1759939102.1304] manager: (tap4ecfeba1-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:22Z|00695|binding|INFO|Setting lport ec4d6d14-21c2-4a40-a5d6-e96b314661ca ovn-installed in OVS
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.174 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3017869f-1cbf-4e35-acfa-a29f26ab9e8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.177 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[2e996d14-3632-4601-83d0-ec5ba20304ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 NetworkManager[51699]: <info>  [1759939102.2105] device (tap4ecfeba1-f0): carrier: link connected
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.223 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[83a55d67-3e39-4b0d-ba6f-7b4a9e083328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.228 2 DEBUG nova.network.neutron [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated VIF entry in instance network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.228 2 DEBUG nova.network.neutron [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.249 2 DEBUG oslo_concurrency.lockutils [req-f7749b00-ae87-447e-85e6-67c8c210424c req-1e8fff24-6c04-4ff0-a057-3edb1c136f64 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.251 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db4e16d4-87d7-4ac8-a4a6-7e6024848fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ecfeba1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a5:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600587, 'reachable_time': 40748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246664, 'error': None, 'target': 'ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.265 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e85e2f94-7f02-49de-9d89-aac3b124c2f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:a515'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600587, 'tstamp': 600587}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246665, 'error': None, 'target': 'ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.282 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f8c1b4-038c-41cb-9905-67ab1eab4d8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4ecfeba1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:a5:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600587, 'reachable_time': 40748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246666, 'error': None, 'target': 'ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.310 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9e908bd7-bdc6-4ffe-b602-b0c1a6d2f0ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.330 2 DEBUG nova.compute.manager [req-c15c0bc0-3926-43fc-a710-364d5f6ba9c7 req-c439db45-012c-4489-8748-f861e8045415 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.330 2 DEBUG oslo_concurrency.lockutils [req-c15c0bc0-3926-43fc-a710-364d5f6ba9c7 req-c439db45-012c-4489-8748-f861e8045415 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.330 2 DEBUG oslo_concurrency.lockutils [req-c15c0bc0-3926-43fc-a710-364d5f6ba9c7 req-c439db45-012c-4489-8748-f861e8045415 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.330 2 DEBUG oslo_concurrency.lockutils [req-c15c0bc0-3926-43fc-a710-364d5f6ba9c7 req-c439db45-012c-4489-8748-f861e8045415 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.331 2 DEBUG nova.compute.manager [req-c15c0bc0-3926-43fc-a710-364d5f6ba9c7 req-c439db45-012c-4489-8748-f861e8045415 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Processing event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.377 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dac62c3b-f6cd-4cb2-8989-d5d2d5d073d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.378 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ecfeba1-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.379 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.379 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ecfeba1-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:58:22 np0005476733 NetworkManager[51699]: <info>  [1759939102.3818] manager: (tap4ecfeba1-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Oct  8 11:58:22 np0005476733 kernel: tap4ecfeba1-f0: entered promiscuous mode
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.386 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4ecfeba1-f0, col_values=(('external_ids', {'iface-id': '7920586c-8b0c-4f76-b9dc-2b52faea37bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:22 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:22Z|00696|binding|INFO|Releasing lport 7920586c-8b0c-4f76-b9dc-2b52faea37bf from this chassis (sb_readonly=0)
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.390 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.391 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2891e753-988c-4133-8320-7238747fc7af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.392 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2.pid.haproxy
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:58:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:22.393 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'env', 'PROCESS_TAG=haproxy-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:58:22 np0005476733 nova_compute[192580]: 2025-10-08 15:58:22.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:22 np0005476733 podman[246698]: 2025-10-08 15:58:22.834062467 +0000 UTC m=+0.108524339 container create cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 11:58:22 np0005476733 podman[246698]: 2025-10-08 15:58:22.786618691 +0000 UTC m=+0.061080583 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:58:22 np0005476733 systemd[1]: Started libpod-conmon-cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e.scope.
Oct  8 11:58:22 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:58:22 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/deeea52d269f0184888dc13369a841c405956c3941f430bc4995b9822ef08f96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:58:22 np0005476733 podman[246717]: 2025-10-08 15:58:22.950826048 +0000 UTC m=+0.086299358 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:58:22 np0005476733 podman[246698]: 2025-10-08 15:58:22.951757848 +0000 UTC m=+0.226219730 container init cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 11:58:22 np0005476733 podman[246698]: 2025-10-08 15:58:22.959580968 +0000 UTC m=+0.234042840 container start cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 11:58:22 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [NOTICE]   (246743) : New worker (246745) forked
Oct  8 11:58:22 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [NOTICE]   (246743) : Loading success.
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.346 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.348 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939103.3460276, 9a8d7e02-6802-40ae-8c7a-6f40179085f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.348 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] VM Started (Lifecycle Event)#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.353 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.359 2 INFO nova.virt.libvirt.driver [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Instance spawned successfully.#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.361 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.368 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.372 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.380 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.381 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.381 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.382 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.382 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.383 2 DEBUG nova.virt.libvirt.driver [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.393 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.394 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939103.3463376, 9a8d7e02-6802-40ae-8c7a-6f40179085f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.395 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] VM Paused (Lifecycle Event)#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.423 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.429 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939103.351737, 9a8d7e02-6802-40ae-8c7a-6f40179085f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.429 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] VM Resumed (Lifecycle Event)#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.447 2 INFO nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Took 7.23 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.448 2 DEBUG nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.460 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.465 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.506 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.532 2 INFO nova.compute.manager [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Took 7.75 seconds to build instance.#033[00m
Oct  8 11:58:23 np0005476733 nova_compute[192580]: 2025-10-08 15:58:23.548 2 DEBUG oslo_concurrency.lockutils [None req-ba5873aa-d064-4996-9d52-77a59ee94f31 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.610 2 DEBUG nova.compute.manager [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.611 2 DEBUG oslo_concurrency.lockutils [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.612 2 DEBUG oslo_concurrency.lockutils [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.612 2 DEBUG oslo_concurrency.lockutils [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.612 2 DEBUG nova.compute.manager [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:58:24 np0005476733 nova_compute[192580]: 2025-10-08 15:58:24.613 2 WARNING nova.compute.manager [req-f0a5f819-925d-4143-9518-21a4ab471a95 req-60d4160d-f6c4-4125-ba79-c2190a902617 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received unexpected event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca for instance with vm_state active and task_state None.#033[00m
Oct  8 11:58:25 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:25.932 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:58:26 np0005476733 nova_compute[192580]: 2025-10-08 15:58:26.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:26.347 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:26.348 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:58:26.349 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:58:26 np0005476733 nova_compute[192580]: 2025-10-08 15:58:26.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:28 np0005476733 nova_compute[192580]: 2025-10-08 15:58:27.999 2 DEBUG nova.compute.manager [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:58:28 np0005476733 nova_compute[192580]: 2025-10-08 15:58:28.000 2 DEBUG nova.compute.manager [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing instance network info cache due to event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:58:28 np0005476733 nova_compute[192580]: 2025-10-08 15:58:28.000 2 DEBUG oslo_concurrency.lockutils [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:58:28 np0005476733 nova_compute[192580]: 2025-10-08 15:58:28.000 2 DEBUG oslo_concurrency.lockutils [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:58:28 np0005476733 nova_compute[192580]: 2025-10-08 15:58:28.001 2 DEBUG nova.network.neutron [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:58:29 np0005476733 nova_compute[192580]: 2025-10-08 15:58:29.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:29 np0005476733 nova_compute[192580]: 2025-10-08 15:58:29.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:58:29 np0005476733 nova_compute[192580]: 2025-10-08 15:58:29.962 2 DEBUG nova.network.neutron [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated VIF entry in instance network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:58:29 np0005476733 nova_compute[192580]: 2025-10-08 15:58:29.963 2 DEBUG nova.network.neutron [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:58:29 np0005476733 nova_compute[192580]: 2025-10-08 15:58:29.995 2 DEBUG oslo_concurrency.lockutils [req-4d9eb100-d64b-481b-a950-6d48561b01e3 req-b621fa9a-8576-4968-ac5e-fc6d7c220070 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:58:30 np0005476733 podman[246755]: 2025-10-08 15:58:30.243610804 +0000 UTC m=+0.068779468 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 11:58:30 np0005476733 podman[246754]: 2025-10-08 15:58:30.269059488 +0000 UTC m=+0.098460418 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 11:58:31 np0005476733 nova_compute[192580]: 2025-10-08 15:58:31.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:31 np0005476733 nova_compute[192580]: 2025-10-08 15:58:31.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.051 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'name': 'tempest-server-test-1797033195', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'hostId': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.063 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.064 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f18d7a5-b49d-4ddb-a118-7c19a9851c30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.052584', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5e5dc42-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': 'f58da19b358c977b971387744cce4921631ce27c24a48c356c5a8f807b319491'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.052584', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5e5e7aa-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': 'a61275fa166cbf8ecd27701ec041f9e3b624a6ab54165182dba3c4e597f7e08b'}]}, 'timestamp': '2025-10-08 15:58:36.064263', '_unique_id': 'f29123ee2e494342baf0f60b5b05853d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.065 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.070 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9a8d7e02-6802-40ae-8c7a-6f40179085f0 / tapec4d6d14-21 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.070 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '970bc0b9-7572-4a40-b025-0f7c0401e491', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.066232', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5e6e1b4-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '51a3d587351ff2f15d3b323bbe7f911ff682ff18f50a5be89d1e6f58be5acf0f'}]}, 'timestamp': '2025-10-08 15:58:36.070680', '_unique_id': 'd48e69b0fdee42a58f9b2fbcd4852cc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51011209-048e-45b3-98ca-9260a4639435', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.072022', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5e7212e-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '678e0e1b040d3d2df0cae1038c0e99e940c97c5a9e827ee4a1a2b7c5480f02b6'}]}, 'timestamp': '2025-10-08 15:58:36.072282', '_unique_id': '72ebef603f9045529afe4a14eb9120a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.073 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1955764-b1f1-40b0-a4c6-2cd39a6f3154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.073649', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5e75ffe-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '7a8e8eeed4bcfb95d51c596da45152478d72ae74c1cd9ddc584cb01fe93ce0a7'}]}, 'timestamp': '2025-10-08 15:58:36.073888', '_unique_id': '57a09093256941138ff0aae5906d6b73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.074 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.075 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1797033195>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1797033195>]
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.075 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '168b4262-634d-4062-bafa-17a71239c671', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.075320', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5e7a09a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': 'dc370ef7d7aa118a14fb6900bf194e0849f731c78268c3aab4e3b35b4471708c'}]}, 'timestamp': '2025-10-08 15:58:36.075548', '_unique_id': 'dc6e61eac0d24740b4b373cabe2a387f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1797033195>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1797033195>]
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.089 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/cpu volume: 12020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c350fe8-50d2-40fc-ab07-2a002ffcbc0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12020000000, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'timestamp': '2025-10-08T15:58:36.076947', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'a5e9cbcc-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.812150684, 'message_signature': '32bda5c6205a14a29feccb4006805d1efeb50227f3f3c9358dacaeb00d71ff6c'}]}, 'timestamp': '2025-10-08 15:58:36.089836', '_unique_id': 'dedfde4f14774954b42f4e95d89332cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.091 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb0b5a73-1849-4b72-a25b-f15dcbd1a8dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.091503', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ea1974-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': 'a631161d72cdc8222f21df7626df6f87b7a4f385f54d504bad72765b1f2f5553'}]}, 'timestamp': '2025-10-08 15:58:36.091743', '_unique_id': '7e32a68feb80458791d121136de7aa4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.092 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb001c82-426b-47d7-9832-1ac9c8d571bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.092821', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ea4c8c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '32fdfaafe787af0db1b7d359859408040281322fcd77c7748025d9f48625ab52'}]}, 'timestamp': '2025-10-08 15:58:36.093049', '_unique_id': '87ccfc7367454b529e5748f117dfae33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.094 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.094 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6ff6e07-0989-48d4-b08d-4ff833c23a9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.094162', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5ea8134-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': 'a4bbfad1e1f94b0b7feb904609141c0bd667d7faed110453a6a19079af2f86c7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.094162', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5ea88e6-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': '9ca17cc0d82790e48c95cb27f85126f73305cce37fcaafdfe4c7d92c57956093'}]}, 'timestamp': '2025-10-08 15:58:36.094579', '_unique_id': 'e80f91987c8b48f08cbdd0cb5ad4db8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.111 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.latency volume: 3707907530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.112 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.latency volume: 3791221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23bf92be-cfa0-4fa6-92e6-cd01cb12b7a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3707907530, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.095685', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5ed350a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': 'c7fc2709f7ca58494ad6ff6bf62e513530b807b4435f12b44101b01712db1c3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3791221, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.095685', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5ed41a8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '5edcdc7ae70a8360fa3851d10d62a1d61d7187972eedf75f6035a61164be9731'}]}, 'timestamp': '2025-10-08 15:58:36.112459', '_unique_id': 'af65bb3b4f724e138b62e11c5f2de4b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.114 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e6240ee-9503-4d73-9281-d485e6acdef5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.114438', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ed9b6c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '1d583e683dfae71395879f0036b1c2db584088928ed9b6f9ae34d00169648d89'}]}, 'timestamp': '2025-10-08 15:58:36.114787', '_unique_id': '90908a6b92444e4d8ea63741ecebd31c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.116 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47797b80-42dc-4435-a933-456c36aa3e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.116334', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ede40a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': 'f3664af748ed0c62a85400dae24e0f9fecb8eb6149dc196c4947feb41025f53f'}]}, 'timestamp': '2025-10-08 15:58:36.116597', '_unique_id': '076e63a3b2f14e2db1969902e2661c97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.117 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e60137ec-f076-4e60-a5e8-08273593ae95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.117829', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ee1d80-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': 'a42cf4c4165d295251734d2cdead3b47a61ad2b16daa8dc3f0770a9455161066'}]}, 'timestamp': '2025-10-08 15:58:36.118079', '_unique_id': '4b00243b378c4416a0709b608ac60953'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1797033195>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1797033195>]
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9a8d7e02-6802-40ae-8c7a-6f40179085f0: ceilometer.compute.pollsters.NoVolumeException
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1797033195>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1797033195>]
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.120 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.120 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79238b90-c468-47f0-8a39-92b0d9f57dc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.120061', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5ee77e4-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': '64c0b55343e49bfb4adb713d5be81e67048473b6bb54f43ca5aa9be2d9975b5a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.120061', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5ee82ac-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.775559765, 'message_signature': '77b5b9da2f41a608114b9e7123cdacccf96d89f969857b8d0a70502658d18c93'}]}, 'timestamp': '2025-10-08 15:58:36.120677', '_unique_id': '179fd69b8a63496caca7d9eb46ea2c33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.122 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.bytes volume: 93131776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.122 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c6385c-aa00-4abf-9d03-66dc81119182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93131776, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.122186', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5eec7f8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '0fb09122f784eff7976eb4a1b64a27cc35fd607e59a6383f2e2d7e8114d5d1dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.122186', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5eed086-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '377d3bf286bfcb03507a91027aedeb8936cf0d1b7c49e7b3229dd56d5c01209b'}]}, 'timestamp': '2025-10-08 15:58:36.122727', '_unique_id': 'f4f262f584084e07a9b4e078985f1848'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.123 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9ff7f36-e37f-479a-afdc-f3434ac80aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004b-9a8d7e02-6802-40ae-8c7a-6f40179085f0-tapec4d6d14-21', 'timestamp': '2025-10-08T15:58:36.123842', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'tapec4d6d14-21', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e0:64:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapec4d6d14-21'}, 'message_id': 'a5ef0880-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.789203831, 'message_signature': '9f49f87b39fd82a4f90ac9ce72cfe0f66a31e5f8c4d16a9f47f885848aef894d'}]}, 'timestamp': '2025-10-08 15:58:36.124074', '_unique_id': 'e82cb830417b48cba2185dd6689b2542'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.125 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.125 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bcb64e4-de9b-40e8-84c3-a84f2c6d4d1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.125294', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5ef41b0-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '793278a41e4fe31360938dc29a26aaa780246834229f60529eea0c0964577002'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': 
None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.125294', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5ef4944-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '10d56012236158c59afa1f42b54b81c2fa7f366079b79b903e888f8555627e33'}]}, 'timestamp': '2025-10-08 15:58:36.125716', '_unique_id': 'db1a9cdb3bd84bce81fe38fb81dc74bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.126 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.latency volume: 321868165 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd24c788a-23c5-4bf3-a75e-e9d014fbeab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 321868165, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.126799', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5ef7b94-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '3d8c5a81769f6111cc61e1c87d8e5d32d2368a0987ef0b7e4be9eb48f6561be3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 
'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.126799', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5ef83dc-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '0f9662228ce574d55b95074d26f38467d5ece42d0fc0c3719c495830a9581c08'}]}, 'timestamp': '2025-10-08 15:58:36.127242', '_unique_id': 'a7e9a1d1438b4d659e3edda03612c796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.128 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.requests volume: 5688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.128 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a26d7513-64e2-4e11-81e7-7ac17bde71f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5688, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.128407', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5efbb04-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '9e5248e1c95383956587a8a6527c4b92d40bf40c36cbddbdee744ef623652723'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 
'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.128407', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5efc342-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '7cb5ead9c42baff53962b4194c08f66758e1d0436b389021fdf36f81a2f8b7dc'}]}, 'timestamp': '2025-10-08 15:58:36.128869', '_unique_id': '17e07b9616f849ec8ca189b9e0f7c6af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 DEBUG ceilometer.compute.pollsters [-] 9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4473d4d9-2319-4aea-a535-0452f0309f41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-vda', 'timestamp': '2025-10-08T15:58:36.130054', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'a5effbe6-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': 'd81f5b4febb78edade28720cd71471b080c7dcb1135c9e6a71b159d13c3a9669'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 
'project_name': None, 'resource_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0-sda', 'timestamp': '2025-10-08T15:58:36.130054', 'resource_metadata': {'display_name': 'tempest-server-test-1797033195', 'name': 'instance-0000004b', 'instance_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'a5f00514-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6019.818651872, 'message_signature': '107f046ecf8885eb97d8f35d89d2cd66b6144e304ab758050ce0639f872eb30e'}]}, 'timestamp': '2025-10-08 15:58:36.130527', '_unique_id': 'cf716c3103474ebc83626d405feec00d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 11:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 15:58:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 11:58:36 np0005476733 nova_compute[192580]: 2025-10-08 15:58:36.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:36 np0005476733 nova_compute[192580]: 2025-10-08 15:58:36.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:38 np0005476733 podman[246803]: 2025-10-08 15:58:38.235143672 +0000 UTC m=+0.064647007 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:58:38 np0005476733 podman[246802]: 2025-10-08 15:58:38.235307147 +0000 UTC m=+0.064894705 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:58:38 np0005476733 podman[246804]: 2025-10-08 15:58:38.272298449 +0000 UTC m=+0.094060417 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter)
Oct  8 11:58:41 np0005476733 nova_compute[192580]: 2025-10-08 15:58:41.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:41 np0005476733 nova_compute[192580]: 2025-10-08 15:58:41.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:45 np0005476733 podman[246874]: 2025-10-08 15:58:45.234350296 +0000 UTC m=+0.057840280 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 11:58:45 np0005476733 podman[246875]: 2025-10-08 15:58:45.257269858 +0000 UTC m=+0.063244872 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:58:46 np0005476733 nova_compute[192580]: 2025-10-08 15:58:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:46 np0005476733 nova_compute[192580]: 2025-10-08 15:58:46.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:49Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:64:d7 10.10.1.159
Oct  8 11:58:49 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:49Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:64:d7 10.10.1.159
Oct  8 11:58:51 np0005476733 nova_compute[192580]: 2025-10-08 15:58:51.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:51 np0005476733 nova_compute[192580]: 2025-10-08 15:58:51.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:53 np0005476733 podman[246916]: 2025-10-08 15:58:53.222030659 +0000 UTC m=+0.045688521 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent)
Oct  8 11:58:56 np0005476733 nova_compute[192580]: 2025-10-08 15:58:56.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:56 np0005476733 nova_compute[192580]: 2025-10-08 15:58:56.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:58:56 np0005476733 ovn_controller[94857]: 2025-10-08T15:58:56Z|00697|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 11:59:01 np0005476733 podman[246936]: 2025-10-08 15:59:01.237558531 +0000 UTC m=+0.055803774 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 11:59:01 np0005476733 nova_compute[192580]: 2025-10-08 15:59:01.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:01 np0005476733 podman[246935]: 2025-10-08 15:59:01.337397862 +0000 UTC m=+0.163025781 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:59:01 np0005476733 nova_compute[192580]: 2025-10-08 15:59:01.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:04 np0005476733 nova_compute[192580]: 2025-10-08 15:59:04.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:06 np0005476733 nova_compute[192580]: 2025-10-08 15:59:06.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:06 np0005476733 nova_compute[192580]: 2025-10-08 15:59:06.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:08 np0005476733 nova_compute[192580]: 2025-10-08 15:59:08.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:09 np0005476733 podman[246979]: 2025-10-08 15:59:09.264027394 +0000 UTC m=+0.084167241 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:59:09 np0005476733 podman[246980]: 2025-10-08 15:59:09.26516554 +0000 UTC m=+0.082408195 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 11:59:09 np0005476733 podman[246981]: 2025-10-08 15:59:09.27487461 +0000 UTC m=+0.091334929 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 11:59:11 np0005476733 nova_compute[192580]: 2025-10-08 15:59:11.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:11 np0005476733 nova_compute[192580]: 2025-10-08 15:59:11.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.819 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.819 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.820 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 11:59:13 np0005476733 nova_compute[192580]: 2025-10-08 15:59:13.820 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:59:15 np0005476733 nova_compute[192580]: 2025-10-08 15:59:15.594 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:15 np0005476733 nova_compute[192580]: 2025-10-08 15:59:15.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:59:15 np0005476733 nova_compute[192580]: 2025-10-08 15:59:15.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 11:59:15 np0005476733 nova_compute[192580]: 2025-10-08 15:59:15.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:16 np0005476733 podman[247078]: 2025-10-08 15:59:16.242819396 +0000 UTC m=+0.061906809 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 11:59:16 np0005476733 podman[247077]: 2025-10-08 15:59:16.24826036 +0000 UTC m=+0.069763740 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 11:59:16 np0005476733 nova_compute[192580]: 2025-10-08 15:59:16.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:16 np0005476733 nova_compute[192580]: 2025-10-08 15:59:16.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:17 np0005476733 nova_compute[192580]: 2025-10-08 15:59:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:17 np0005476733 nova_compute[192580]: 2025-10-08 15:59:17.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:17 np0005476733 nova_compute[192580]: 2025-10-08 15:59:17.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.655 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.656 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.656 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.657 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.753 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.822 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.823 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 11:59:19 np0005476733 nova_compute[192580]: 2025-10-08 15:59:19.883 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.037 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.039 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12974MB free_disk=111.18895721435547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.039 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.039 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.108 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 9a8d7e02-6802-40ae-8c7a-6f40179085f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.109 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.109 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.151 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.169 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.202 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.202 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:20Z|00698|pinctrl|WARN|Dropped 499 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 11:59:20 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:20Z|00699|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 11:59:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:20.989 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:59:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:20.990 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 11:59:20 np0005476733 nova_compute[192580]: 2025-10-08 15:59:20.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:21 np0005476733 nova_compute[192580]: 2025-10-08 15:59:21.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:21 np0005476733 nova_compute[192580]: 2025-10-08 15:59:21.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:22 np0005476733 nova_compute[192580]: 2025-10-08 15:59:22.201 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:24 np0005476733 podman[247130]: 2025-10-08 15:59:24.238277448 +0000 UTC m=+0.065493614 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 11:59:24 np0005476733 nova_compute[192580]: 2025-10-08 15:59:24.558 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "interface-9a8d7e02-6802-40ae-8c7a-6f40179085f0-c81b3f65-3846-40eb-a945-0b7613a2369e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:24 np0005476733 nova_compute[192580]: 2025-10-08 15:59:24.559 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "interface-9a8d7e02-6802-40ae-8c7a-6f40179085f0-c81b3f65-3846-40eb-a945-0b7613a2369e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:24 np0005476733 nova_compute[192580]: 2025-10-08 15:59:24.560 2 DEBUG nova.objects.instance [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'flavor' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:59:25 np0005476733 nova_compute[192580]: 2025-10-08 15:59:25.301 2 DEBUG nova.objects.instance [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'pci_requests' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:59:25 np0005476733 nova_compute[192580]: 2025-10-08 15:59:25.319 2 DEBUG nova.network.neutron [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 11:59:25 np0005476733 nova_compute[192580]: 2025-10-08 15:59:25.697 2 DEBUG nova.policy [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 11:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:26.348 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:26.349 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:26.349 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:26 np0005476733 nova_compute[192580]: 2025-10-08 15:59:26.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:26 np0005476733 nova_compute[192580]: 2025-10-08 15:59:26.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.642 2 DEBUG nova.network.neutron [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Successfully updated port: c81b3f65-3846-40eb-a945-0b7613a2369e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.664 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.665 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.665 2 DEBUG nova.network.neutron [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.776 2 DEBUG nova.compute.manager [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-changed-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.776 2 DEBUG nova.compute.manager [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing instance network info cache due to event network-changed-c81b3f65-3846-40eb-a945-0b7613a2369e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:59:28 np0005476733 nova_compute[192580]: 2025-10-08 15:59:28.776 2 DEBUG oslo_concurrency.lockutils [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:59:30 np0005476733 nova_compute[192580]: 2025-10-08 15:59:30.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 11:59:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:30.993 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:31 np0005476733 nova_compute[192580]: 2025-10-08 15:59:31.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:31 np0005476733 nova_compute[192580]: 2025-10-08 15:59:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.129 2 DEBUG nova.network.neutron [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.150 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.151 2 DEBUG oslo_concurrency.lockutils [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.152 2 DEBUG nova.network.neutron [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing network info cache for port c81b3f65-3846-40eb-a945-0b7613a2369e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.154 2 DEBUG nova.virt.libvirt.vif [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:58:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bu
s='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:58:23Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.154 2 DEBUG nova.network.os_vif_util [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.155 2 DEBUG nova.network.os_vif_util [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.155 2 DEBUG os_vif [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.156 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc81b3f65-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.160 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc81b3f65-38, col_values=(('external_ids', {'iface-id': 'c81b3f65-3846-40eb-a945-0b7613a2369e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:ad:7e', 'vm-uuid': '9a8d7e02-6802-40ae-8c7a-6f40179085f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.1624] manager: (tapc81b3f65-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.172 2 INFO os_vif [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38')#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.173 2 DEBUG nova.virt.libvirt.vif [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:58:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bu
s='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:58:23Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.173 2 DEBUG nova.network.os_vif_util [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.174 2 DEBUG nova.network.os_vif_util [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.175 2 DEBUG nova.virt.libvirt.guest [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] attach device xml: <interface type="ethernet">
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:8a:ad:7e"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <target dev="tapc81b3f65-38"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]: </interface>
Oct  8 11:59:32 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 11:59:32 np0005476733 kernel: tapc81b3f65-38: entered promiscuous mode
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.1952] manager: (tapc81b3f65-38): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00700|binding|INFO|Claiming lport c81b3f65-3846-40eb-a945-0b7613a2369e for this chassis.
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00701|binding|INFO|c81b3f65-3846-40eb-a945-0b7613a2369e: Claiming fa:16:3e:8a:ad:7e 10.100.0.4
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.208 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:ad:7e 10.100.0.4'], port_security=['fa:16:3e:8a:ad:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '495dd58b-359c-4273-9624-4fe93a315db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1358b3f2-f465-4779-a0e2-bc53cf16e2f0, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c81b3f65-3846-40eb-a945-0b7613a2369e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.210 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c81b3f65-3846-40eb-a945-0b7613a2369e in datapath 99ef19c5-07b6-4860-a1cf-a7e362446d16 bound to our chassis#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.211 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99ef19c5-07b6-4860-a1cf-a7e362446d16#033[00m
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00702|binding|INFO|Setting lport c81b3f65-3846-40eb-a945-0b7613a2369e ovn-installed in OVS
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00703|binding|INFO|Setting lport c81b3f65-3846-40eb-a945-0b7613a2369e up in Southbound
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 systemd-udevd[247189]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.225 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b143e557-9356-4068-a161-74675ca831d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.225 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99ef19c5-01 in ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.227 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99ef19c5-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.228 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d9633dfb-0665-430a-a871-4dd9dce43d58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.229 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1e0086-1d6d-431f-a931-c41152a4d052]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.2374] device (tapc81b3f65-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 11:59:32 np0005476733 podman[247152]: 2025-10-08 15:59:32.238069447 +0000 UTC m=+0.067369264 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.2399] device (tapc81b3f65-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.243 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[c88e176f-fd0a-4156-977e-e7b1df656f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.271 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0243d426-0725-4700-bb36-e7c9749a32de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 podman[247151]: 2025-10-08 15:59:32.279749929 +0000 UTC m=+0.110091369 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.288 2 DEBUG nova.virt.libvirt.driver [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.289 2 DEBUG nova.virt.libvirt.driver [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.289 2 DEBUG nova.virt.libvirt.driver [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No VIF found with MAC fa:16:3e:e0:64:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.290 2 DEBUG nova.virt.libvirt.driver [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No VIF found with MAC fa:16:3e:8a:ad:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.308 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[40f070a4-c01c-4b31-8461-17b5239d0fe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.315 2 DEBUG nova.virt.libvirt.guest [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:name>tempest-server-test-1797033195</nova:name>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 15:59:32</nova:creationTime>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:user uuid="bf4219ece8f54f268b2ece84f150d555">tempest-QosTestOvn-1026583770-project-member</nova:user>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:project uuid="13cdd2bb6c7648f5ab8709ff695b5cda">tempest-QosTestOvn-1026583770</nova:project>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:port uuid="ec4d6d14-21c2-4a40-a5d6-e96b314661ca">
Oct  8 11:59:32 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.10.1.159" ipVersion="4"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    <nova:port uuid="c81b3f65-3846-40eb-a945-0b7613a2369e">
Oct  8 11:59:32 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 11:59:32 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 11:59:32 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 11:59:32 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.315 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[824d5240-eabe-4ff4-a304-7598b8b6df1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.3166] manager: (tap99ef19c5-00): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:ad:7e 10.100.0.4
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:ad:7e 10.100.0.4
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.344 2 DEBUG oslo_concurrency.lockutils [None req-ea05601d-ad73-4469-95af-1c8380ecb468 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "interface-9a8d7e02-6802-40ae-8c7a-6f40179085f0-c81b3f65-3846-40eb-a945-0b7613a2369e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.347 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[aff1d449-9d59-48a2-9c23-23b242b28885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.350 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb8d89e-05d2-42a0-86cc-7c37ca1374d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.3726] device (tap99ef19c5-00): carrier: link connected
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.379 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f11625-fb0b-4edc-9ebe-368e850f4960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.404 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[54ef080c-a71e-49d4-b30c-eca8594d4a16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99ef19c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:1f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607603, 'reachable_time': 27615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247232, 'error': None, 'target': 'ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.423 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[116826de-b673-472c-9f9c-df955b42581e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:1fee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 607603, 'tstamp': 607603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247233, 'error': None, 'target': 'ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.438 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf66e44e-d0c7-4a64-bbd3-8feac93dbade]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99ef19c5-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:1f:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607603, 'reachable_time': 27615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247234, 'error': None, 'target': 'ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.468 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b567ceb-83ee-42c8-b651-53053b6462f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.514 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e026a247-69a9-410c-9700-9ce1aae5c6ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.516 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99ef19c5-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.516 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.516 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99ef19c5-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 kernel: tap99ef19c5-00: entered promiscuous mode
Oct  8 11:59:32 np0005476733 NetworkManager[51699]: <info>  [1759939172.5188] manager: (tap99ef19c5-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.522 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99ef19c5-00, col_values=(('external_ids', {'iface-id': '099f6914-271e-48ae-8ce4-8c9528d4e277'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:32 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:32Z|00704|binding|INFO|Releasing lport 099f6914-271e-48ae-8ce4-8c9528d4e277 from this chassis (sb_readonly=0)
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.524 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99ef19c5-07b6-4860-a1cf-a7e362446d16.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99ef19c5-07b6-4860-a1cf-a7e362446d16.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.526 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3bef63-ffce-4fe7-b369-fa2a7f1ddd72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.526 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-99ef19c5-07b6-4860-a1cf-a7e362446d16
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/99ef19c5-07b6-4860-a1cf-a7e362446d16.pid.haproxy
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 99ef19c5-07b6-4860-a1cf-a7e362446d16
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 11:59:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:32.527 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'env', 'PROCESS_TAG=haproxy-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99ef19c5-07b6-4860-a1cf-a7e362446d16.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:32 np0005476733 podman[247266]: 2025-10-08 15:59:32.872633135 +0000 UTC m=+0.050180264 container create cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 11:59:32 np0005476733 systemd[1]: Started libpod-conmon-cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e.scope.
Oct  8 11:59:32 np0005476733 podman[247266]: 2025-10-08 15:59:32.844006271 +0000 UTC m=+0.021553400 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 11:59:32 np0005476733 systemd[1]: Started libcrun container.
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.955 2 DEBUG nova.compute.manager [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.956 2 DEBUG oslo_concurrency.lockutils [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.956 2 DEBUG oslo_concurrency.lockutils [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.956 2 DEBUG oslo_concurrency.lockutils [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.956 2 DEBUG nova.compute.manager [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:32 np0005476733 nova_compute[192580]: 2025-10-08 15:59:32.957 2 WARNING nova.compute.manager [req-66f461fa-1184-42d6-b43c-046510194b12 req-a49a6ea0-9329-4e8e-a2d8-4f9cda348675 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received unexpected event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e for instance with vm_state active and task_state None.#033[00m
Oct  8 11:59:32 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/885d1b1d01f1e399dc2665d188a320a1b51d29399728ca8e2f6c0419b842c3db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 11:59:32 np0005476733 podman[247266]: 2025-10-08 15:59:32.978618183 +0000 UTC m=+0.156165312 container init cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:59:32 np0005476733 podman[247266]: 2025-10-08 15:59:32.989065977 +0000 UTC m=+0.166613086 container start cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:59:33 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [NOTICE]   (247286) : New worker (247288) forked
Oct  8 11:59:33 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [NOTICE]   (247286) : Loading success.
Oct  8 11:59:34 np0005476733 nova_compute[192580]: 2025-10-08 15:59:34.730 2 DEBUG nova.network.neutron [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated VIF entry in instance network info cache for port c81b3f65-3846-40eb-a945-0b7613a2369e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:59:34 np0005476733 nova_compute[192580]: 2025-10-08 15:59:34.731 2 DEBUG nova.network.neutron [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:34 np0005476733 nova_compute[192580]: 2025-10-08 15:59:34.799 2 DEBUG oslo_concurrency.lockutils [req-824fada7-521e-4736-8ffe-7c3d6c235d9e req-8cc418e5-40b1-43e8-9c67-725fff746f0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.162 2 DEBUG nova.compute.manager [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.162 2 DEBUG oslo_concurrency.lockutils [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.162 2 DEBUG oslo_concurrency.lockutils [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.162 2 DEBUG oslo_concurrency.lockutils [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.163 2 DEBUG nova.compute.manager [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.163 2 WARNING nova.compute.manager [req-faacf3c6-3bc9-45a7-93cf-b6d1448b94e2 req-7ceebc3a-517f-4a12-b041-9c596e0fbb10 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received unexpected event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e for instance with vm_state active and task_state None.#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.754 2 DEBUG nova.compute.manager [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-changed-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.755 2 DEBUG nova.compute.manager [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing instance network info cache due to event network-changed-c81b3f65-3846-40eb-a945-0b7613a2369e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.755 2 DEBUG oslo_concurrency.lockutils [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.756 2 DEBUG oslo_concurrency.lockutils [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:59:35 np0005476733 nova_compute[192580]: 2025-10-08 15:59:35.756 2 DEBUG nova.network.neutron [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing network info cache for port c81b3f65-3846-40eb-a945-0b7613a2369e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:59:36 np0005476733 nova_compute[192580]: 2025-10-08 15:59:36.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:37 np0005476733 nova_compute[192580]: 2025-10-08 15:59:37.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:37 np0005476733 nova_compute[192580]: 2025-10-08 15:59:37.223 2 DEBUG nova.network.neutron [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated VIF entry in instance network info cache for port c81b3f65-3846-40eb-a945-0b7613a2369e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:59:37 np0005476733 nova_compute[192580]: 2025-10-08 15:59:37.225 2 DEBUG nova.network.neutron [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:37 np0005476733 nova_compute[192580]: 2025-10-08 15:59:37.253 2 DEBUG oslo_concurrency.lockutils [req-babf11a6-6911-40f1-90ab-b12be2b56f02 req-60140edd-e483-440c-9573-79d99e2c631f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:59:40 np0005476733 podman[247297]: 2025-10-08 15:59:40.246183264 +0000 UTC m=+0.067589872 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Oct  8 11:59:40 np0005476733 podman[247299]: 2025-10-08 15:59:40.246296377 +0000 UTC m=+0.063347856 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 11:59:40 np0005476733 podman[247298]: 2025-10-08 15:59:40.266992818 +0000 UTC m=+0.087021302 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 11:59:41 np0005476733 nova_compute[192580]: 2025-10-08 15:59:41.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:42 np0005476733 nova_compute[192580]: 2025-10-08 15:59:42.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:46 np0005476733 nova_compute[192580]: 2025-10-08 15:59:46.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 podman[247361]: 2025-10-08 15:59:47.231328258 +0000 UTC m=+0.052035694 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 11:59:47 np0005476733 podman[247360]: 2025-10-08 15:59:47.24266228 +0000 UTC m=+0.062770777 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.670 2 DEBUG nova.compute.manager [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.670 2 DEBUG nova.compute.manager [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing instance network info cache due to event network-changed-ec4d6d14-21c2-4a40-a5d6-e96b314661ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.670 2 DEBUG oslo_concurrency.lockutils [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.671 2 DEBUG oslo_concurrency.lockutils [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.671 2 DEBUG nova.network.neutron [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Refreshing network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.756 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.757 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.757 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.758 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.758 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.759 2 INFO nova.compute.manager [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Terminating instance#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.760 2 DEBUG nova.compute.manager [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 11:59:47 np0005476733 kernel: tapec4d6d14-21 (unregistering): left promiscuous mode
Oct  8 11:59:47 np0005476733 NetworkManager[51699]: <info>  [1759939187.7863] device (tapec4d6d14-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00705|binding|INFO|Releasing lport ec4d6d14-21c2-4a40-a5d6-e96b314661ca from this chassis (sb_readonly=0)
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00706|binding|INFO|Setting lport ec4d6d14-21c2-4a40-a5d6-e96b314661ca down in Southbound
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00707|binding|INFO|Removing iface tapec4d6d14-21 ovn-installed in OVS
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.807 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:64:d7 10.10.1.159'], port_security=['fa:16:3e:e0:64:d7 10.10.1.159'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.10.1.159/24', 'neutron:device_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2445930c-6263-4f08-8a84-8d4597739544', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=704dd4df-5dd3-4687-bb14-6b18adad356e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=ec4d6d14-21c2-4a40-a5d6-e96b314661ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.809 103739 INFO neutron.agent.ovn.metadata.agent [-] Port ec4d6d14-21c2-4a40-a5d6-e96b314661ca in datapath 4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 unbound from our chassis#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.811 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.811 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3737bfa5-b901-44ec-8a26-2056ee930c98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.812 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 namespace which is not needed anymore#033[00m
Oct  8 11:59:47 np0005476733 kernel: tapc81b3f65-38 (unregistering): left promiscuous mode
Oct  8 11:59:47 np0005476733 NetworkManager[51699]: <info>  [1759939187.8295] device (tapc81b3f65-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00708|binding|INFO|Releasing lport c81b3f65-3846-40eb-a945-0b7613a2369e from this chassis (sb_readonly=0)
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00709|binding|INFO|Setting lport c81b3f65-3846-40eb-a945-0b7613a2369e down in Southbound
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 ovn_controller[94857]: 2025-10-08T15:59:47Z|00710|binding|INFO|Removing iface tapc81b3f65-38 ovn-installed in OVS
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 nova_compute[192580]: 2025-10-08 15:59:47.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:47.858 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:ad:7e 10.100.0.4'], port_security=['fa:16:3e:8a:ad:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9a8d7e02-6802-40ae-8c7a-6f40179085f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '495dd58b-359c-4273-9624-4fe93a315db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1358b3f2-f465-4779-a0e2-bc53cf16e2f0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c81b3f65-3846-40eb-a945-0b7613a2369e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 11:59:47 np0005476733 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Oct  8 11:59:47 np0005476733 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000004b.scope: Consumed 48.750s CPU time.
Oct  8 11:59:47 np0005476733 systemd-machined[152624]: Machine qemu-45-instance-0000004b terminated.
Oct  8 11:59:47 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [NOTICE]   (246743) : haproxy version is 2.8.14-c23fe91
Oct  8 11:59:47 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [NOTICE]   (246743) : path to executable is /usr/sbin/haproxy
Oct  8 11:59:47 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [WARNING]  (246743) : Exiting Master process...
Oct  8 11:59:47 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [ALERT]    (246743) : Current worker (246745) exited with code 143 (Terminated)
Oct  8 11:59:47 np0005476733 neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2[246730]: [WARNING]  (246743) : All workers exited. Exiting... (0)
Oct  8 11:59:47 np0005476733 systemd[1]: libpod-cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e.scope: Deactivated successfully.
Oct  8 11:59:47 np0005476733 podman[247431]: 2025-10-08 15:59:47.962263326 +0000 UTC m=+0.052856420 container died cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  8 11:59:47 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e-userdata-shm.mount: Deactivated successfully.
Oct  8 11:59:48 np0005476733 NetworkManager[51699]: <info>  [1759939188.0006] manager: (tapc81b3f65-38): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Oct  8 11:59:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-deeea52d269f0184888dc13369a841c405956c3941f430bc4995b9822ef08f96-merged.mount: Deactivated successfully.
Oct  8 11:59:48 np0005476733 podman[247431]: 2025-10-08 15:59:48.017501102 +0000 UTC m=+0.108094176 container cleanup cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 11:59:48 np0005476733 systemd[1]: libpod-conmon-cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e.scope: Deactivated successfully.
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.044 2 INFO nova.virt.libvirt.driver [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Instance destroyed successfully.#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.045 2 DEBUG nova.objects.instance [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'resources' on Instance uuid 9a8d7e02-6802-40ae-8c7a-6f40179085f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.058 2 DEBUG nova.virt.libvirt.vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:58:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:58:23Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.059 2 DEBUG nova.network.os_vif_util [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.059 2 DEBUG nova.network.os_vif_util [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.060 2 DEBUG os_vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.062 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4d6d14-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.071 2 INFO os_vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:64:d7,bridge_name='br-int',has_traffic_filtering=True,id=ec4d6d14-21c2-4a40-a5d6-e96b314661ca,network=Network(4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec4d6d14-21')#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.072 2 DEBUG nova.virt.libvirt.vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T15:58:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1797033195',display_name='tempest-server-test-1797033195',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1797033195',id=75,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=<?>,launch_index=0,launched_at=2025-10-08T15:58:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-fge78uux',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T15:58:23Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=9a8d7e02-6802-40ae-8c7a-6f40179085f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.072 2 DEBUG nova.network.os_vif_util [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.073 2 DEBUG nova.network.os_vif_util [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.074 2 DEBUG os_vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.076 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc81b3f65-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.081 2 INFO os_vif [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:ad:7e,bridge_name='br-int',has_traffic_filtering=True,id=c81b3f65-3846-40eb-a945-0b7613a2369e,network=Network(99ef19c5-07b6-4860-a1cf-a7e362446d16),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc81b3f65-38')#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.081 2 INFO nova.virt.libvirt.driver [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Deleting instance files /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0_del#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.082 2 INFO nova.virt.libvirt.driver [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Deletion of /var/lib/nova/instances/9a8d7e02-6802-40ae-8c7a-6f40179085f0_del complete#033[00m
Oct  8 11:59:48 np0005476733 podman[247487]: 2025-10-08 15:59:48.091814526 +0000 UTC m=+0.047962414 container remove cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.097 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d626ba07-e6e9-41c3-becb-cbf042b23763]: (4, ('Wed Oct  8 03:59:47 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 (cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e)\ncc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e\nWed Oct  8 03:59:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 (cc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e)\ncc05d11a82e4eca86a1471f685bea16267f65e3f566b5921e051da8a4ed1e78e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.099 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d585121a-ed91-47b9-aeeb-3264997dbae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.100 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ecfeba1-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 kernel: tap4ecfeba1-f0: left promiscuous mode
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.117 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b688c565-7696-4d70-b27e-a1ee998f2856]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.127 2 INFO nova.compute.manager [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.128 2 DEBUG oslo.service.loopingcall [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.128 2 DEBUG nova.compute.manager [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.128 2 DEBUG nova.network.neutron [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d66730d2-58cd-410b-b1a2-cd822cc06716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.148 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a18bcb-769d-424c-b791-82356eeb31ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.164 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ed198941-8721-4bf6-90c0-3969c312a0e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600572, 'reachable_time': 36074, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247505, 'error': None, 'target': 'ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 systemd[1]: run-netns-ovnmeta\x2d4ecfeba1\x2dfe9f\x2d47ec\x2d8d09\x2d6604f3ff95b2.mount: Deactivated successfully.
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.168 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.169 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[3041dce4-72d5-4e0d-a0b9-8a159e9a959e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.170 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c81b3f65-3846-40eb-a945-0b7613a2369e in datapath 99ef19c5-07b6-4860-a1cf-a7e362446d16 unbound from our chassis#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.171 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99ef19c5-07b6-4860-a1cf-a7e362446d16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.172 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0c9eda33-6945-433a-a3ab-2f18ccacb3d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.173 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16 namespace which is not needed anymore#033[00m
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [NOTICE]   (247286) : haproxy version is 2.8.14-c23fe91
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [NOTICE]   (247286) : path to executable is /usr/sbin/haproxy
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [WARNING]  (247286) : Exiting Master process...
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [WARNING]  (247286) : Exiting Master process...
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [ALERT]    (247286) : Current worker (247288) exited with code 143 (Terminated)
Oct  8 11:59:48 np0005476733 neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16[247281]: [WARNING]  (247286) : All workers exited. Exiting... (0)
Oct  8 11:59:48 np0005476733 systemd[1]: libpod-cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e.scope: Deactivated successfully.
Oct  8 11:59:48 np0005476733 podman[247522]: 2025-10-08 15:59:48.307896782 +0000 UTC m=+0.047265812 container died cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 11:59:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e-userdata-shm.mount: Deactivated successfully.
Oct  8 11:59:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-885d1b1d01f1e399dc2665d188a320a1b51d29399728ca8e2f6c0419b842c3db-merged.mount: Deactivated successfully.
Oct  8 11:59:48 np0005476733 podman[247522]: 2025-10-08 15:59:48.343914823 +0000 UTC m=+0.083283873 container cleanup cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 11:59:48 np0005476733 systemd[1]: libpod-conmon-cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e.scope: Deactivated successfully.
Oct  8 11:59:48 np0005476733 podman[247553]: 2025-10-08 15:59:48.411362038 +0000 UTC m=+0.042024524 container remove cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.417 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a0fab1-9cd3-46de-8e9e-cc7c97a499f3]: (4, ('Wed Oct  8 03:59:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16 (cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e)\ncd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e\nWed Oct  8 03:59:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16 (cd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e)\ncd03c0e2347b184489e725fbc0761362a161909900c2103fe0a5f7657948510e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.418 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e5a6ab-319e-4c73-9c0e-8b79dd23bdb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.419 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99ef19c5-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 kernel: tap99ef19c5-00: left promiscuous mode
Oct  8 11:59:48 np0005476733 nova_compute[192580]: 2025-10-08 15:59:48.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.437 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e89df1-47b6-44fc-b35d-16e06bbb85c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.465 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[050b0b75-6f56-4832-acb6-ed36cff62438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.467 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[474c4779-8400-44d0-8e6a-9a8c5413a197]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.486 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab322eb-0738-4836-a6e0-964f807452fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607596, 'reachable_time': 19595, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247568, 'error': None, 'target': 'ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.489 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99ef19c5-07b6-4860-a1cf-a7e362446d16 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 11:59:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 15:59:48.489 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[a44d90ef-af27-4177-8d3b-d1af489d5f13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 11:59:48 np0005476733 systemd[1]: run-netns-ovnmeta\x2d99ef19c5\x2d07b6\x2d4860\x2da1cf\x2da7e362446d16.mount: Deactivated successfully.
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.918 2 DEBUG nova.compute.manager [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-unplugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.919 2 DEBUG oslo_concurrency.lockutils [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.919 2 DEBUG oslo_concurrency.lockutils [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.920 2 DEBUG oslo_concurrency.lockutils [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.920 2 DEBUG nova.compute.manager [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-unplugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:49 np0005476733 nova_compute[192580]: 2025-10-08 15:59:49.921 2 DEBUG nova.compute.manager [req-d4b51ce2-317f-4782-a306-3e8135b6ca61 req-40186488-c33f-43b7-a261-72091134b6e3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-unplugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:59:50 np0005476733 nova_compute[192580]: 2025-10-08 15:59:50.327 2 DEBUG nova.network.neutron [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updated VIF entry in instance network info cache for port ec4d6d14-21c2-4a40-a5d6-e96b314661ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 11:59:50 np0005476733 nova_compute[192580]: 2025-10-08 15:59:50.327 2 DEBUG nova.network.neutron [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [{"id": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "address": "fa:16:3e:e0:64:d7", "network": {"id": "4ecfeba1-fe9f-47ec-8d09-6604f3ff95b2", "bridge": "br-int", "label": "tempest-test-network--2028459875", "subnets": [{"cidr": "10.10.1.0/24", "dns": [], "gateway": {"address": "10.10.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.1.159", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec4d6d14-21", "ovs_interfaceid": "ec4d6d14-21c2-4a40-a5d6-e96b314661ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c81b3f65-3846-40eb-a945-0b7613a2369e", "address": "fa:16:3e:8a:ad:7e", "network": {"id": "99ef19c5-07b6-4860-a1cf-a7e362446d16", "bridge": "br-int", "label": "tempest-tenant-ctl-network-1362306064", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc81b3f65-38", "ovs_interfaceid": "c81b3f65-3846-40eb-a945-0b7613a2369e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:50 np0005476733 nova_compute[192580]: 2025-10-08 15:59:50.363 2 DEBUG oslo_concurrency.lockutils [req-706bd50b-0a8f-4bb7-adaf-b497ca2d38fd req-5fe29d0d-b0fa-435a-a30b-404c06d8470d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9a8d7e02-6802-40ae-8c7a-6f40179085f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 11:59:51 np0005476733 nova_compute[192580]: 2025-10-08 15:59:51.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.010 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.011 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.012 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.012 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.013 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.013 2 WARNING nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received unexpected event network-vif-plugged-ec4d6d14-21c2-4a40-a5d6-e96b314661ca for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.014 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-unplugged-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.015 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.015 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.016 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.016 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-unplugged-c81b3f65-3846-40eb-a945-0b7613a2369e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.017 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-unplugged-c81b3f65-3846-40eb-a945-0b7613a2369e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.017 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.018 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.018 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.019 2 DEBUG oslo_concurrency.lockutils [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.019 2 DEBUG nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] No waiting events found dispatching network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 11:59:52 np0005476733 nova_compute[192580]: 2025-10-08 15:59:52.020 2 WARNING nova.compute.manager [req-90b3afd9-9ee2-4d66-9c18-63de3456ee5b req-c0fe525a-e448-4a74-83f7-9d5509817c4b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received unexpected event network-vif-plugged-c81b3f65-3846-40eb-a945-0b7613a2369e for instance with vm_state active and task_state deleting.#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.785 2 DEBUG nova.network.neutron [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.810 2 INFO nova.compute.manager [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Took 5.68 seconds to deallocate network for instance.#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.857 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.858 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.924 2 DEBUG nova.compute.provider_tree [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.940 2 DEBUG nova.scheduler.client.report [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 11:59:53 np0005476733 nova_compute[192580]: 2025-10-08 15:59:53.962 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:54 np0005476733 nova_compute[192580]: 2025-10-08 15:59:54.005 2 INFO nova.scheduler.client.report [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Deleted allocations for instance 9a8d7e02-6802-40ae-8c7a-6f40179085f0#033[00m
Oct  8 11:59:54 np0005476733 nova_compute[192580]: 2025-10-08 15:59:54.086 2 DEBUG oslo_concurrency.lockutils [None req-16f42609-edff-4bdf-ac29-ec2433e58275 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "9a8d7e02-6802-40ae-8c7a-6f40179085f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 11:59:54 np0005476733 nova_compute[192580]: 2025-10-08 15:59:54.095 2 DEBUG nova.compute.manager [req-d6a39bc2-5ef2-46c7-a332-9ea27e819d6f req-e5615df7-ed30-4610-b2b9-0e61b078cd69 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Received event network-vif-deleted-ec4d6d14-21c2-4a40-a5d6-e96b314661ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 11:59:55 np0005476733 podman[247569]: 2025-10-08 15:59:55.24466474 +0000 UTC m=+0.064542443 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 11:59:56 np0005476733 nova_compute[192580]: 2025-10-08 15:59:56.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 11:59:58 np0005476733 nova_compute[192580]: 2025-10-08 15:59:58.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:01 np0005476733 nova_compute[192580]: 2025-10-08 16:00:01.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:03 np0005476733 nova_compute[192580]: 2025-10-08 16:00:03.043 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759939188.0425255, 9a8d7e02-6802-40ae-8c7a-6f40179085f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:00:03 np0005476733 nova_compute[192580]: 2025-10-08 16:00:03.044 2 INFO nova.compute.manager [-] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:00:03 np0005476733 nova_compute[192580]: 2025-10-08 16:00:03.081 2 DEBUG nova.compute.manager [None req-d21d5b02-c0bf-43ef-a198-1e24d7adf59c - - - - - -] [instance: 9a8d7e02-6802-40ae-8c7a-6f40179085f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:03 np0005476733 nova_compute[192580]: 2025-10-08 16:00:03.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:03 np0005476733 podman[247590]: 2025-10-08 16:00:03.284758976 +0000 UTC m=+0.096952998 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 12:00:03 np0005476733 podman[247589]: 2025-10-08 16:00:03.327634297 +0000 UTC m=+0.150797890 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:00:06 np0005476733 nova_compute[192580]: 2025-10-08 16:00:06.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:06 np0005476733 nova_compute[192580]: 2025-10-08 16:00:06.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:08 np0005476733 nova_compute[192580]: 2025-10-08 16:00:08.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:09 np0005476733 nova_compute[192580]: 2025-10-08 16:00:09.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:11 np0005476733 podman[247636]: 2025-10-08 16:00:11.251230963 +0000 UTC m=+0.065018229 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:00:11 np0005476733 podman[247635]: 2025-10-08 16:00:11.273006648 +0000 UTC m=+0.091833174 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  8 12:00:11 np0005476733 podman[247637]: 2025-10-08 16:00:11.276078446 +0000 UTC m=+0.076193135 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:00:11 np0005476733 nova_compute[192580]: 2025-10-08 16:00:11.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:12 np0005476733 nova_compute[192580]: 2025-10-08 16:00:12.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:13 np0005476733 nova_compute[192580]: 2025-10-08 16:00:13.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:13 np0005476733 nova_compute[192580]: 2025-10-08 16:00:13.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:13 np0005476733 nova_compute[192580]: 2025-10-08 16:00:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:00:13 np0005476733 nova_compute[192580]: 2025-10-08 16:00:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:00:13 np0005476733 nova_compute[192580]: 2025-10-08 16:00:13.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:00:16 np0005476733 nova_compute[192580]: 2025-10-08 16:00:16.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:16 np0005476733 nova_compute[192580]: 2025-10-08 16:00:16.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:16 np0005476733 nova_compute[192580]: 2025-10-08 16:00:16.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.434 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.434 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.564 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.807 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.808 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.816 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:00:17 np0005476733 nova_compute[192580]: 2025-10-08 16:00:17.817 2 INFO nova.compute.claims [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.151 2 DEBUG nova.compute.provider_tree [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.218 2 DEBUG nova.scheduler.client.report [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:00:18 np0005476733 podman[247701]: 2025-10-08 16:00:18.239895218 +0000 UTC m=+0.058073856 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001)
Oct  8 12:00:18 np0005476733 podman[247702]: 2025-10-08 16:00:18.24181133 +0000 UTC m=+0.055917058 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.517 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.518 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.751 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.752 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.802 2 INFO nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.827 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.929 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.931 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.931 2 INFO nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Creating image(s)#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.932 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.932 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.932 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:18 np0005476733 nova_compute[192580]: 2025-10-08 16:00:18.949 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.013 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.015 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.015 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.031 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.090 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.091 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.134 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk 10737418240" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.135 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.135 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.203 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.204 2 DEBUG nova.objects.instance [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'migration_context' on Instance uuid 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.220 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.220 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Ensure instance console log exists: /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.221 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.221 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.222 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:00:19 np0005476733 nova_compute[192580]: 2025-10-08 16:00:19.793 2 DEBUG nova.policy [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.612 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.813 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.815 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13779MB free_disk=111.33169174194336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.815 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:21 np0005476733 nova_compute[192580]: 2025-10-08 16:00:21.815 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.026 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.027 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.027 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.070 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.107 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.240 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:00:22 np0005476733 nova_compute[192580]: 2025-10-08 16:00:22.240 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:23 np0005476733 nova_compute[192580]: 2025-10-08 16:00:23.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:23Z|00711|pinctrl|WARN|Dropped 1093 log messages in last 62 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 12:00:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:23Z|00712|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:00:23 np0005476733 nova_compute[192580]: 2025-10-08 16:00:23.241 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:23 np0005476733 nova_compute[192580]: 2025-10-08 16:00:23.271 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Successfully created port: f741863c-37e3-4d41-b47d-a825b21d9eac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:00:24 np0005476733 nova_compute[192580]: 2025-10-08 16:00:24.789 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Successfully updated port: f741863c-37e3-4d41-b47d-a825b21d9eac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.029 2 DEBUG nova.compute.manager [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.029 2 DEBUG nova.compute.manager [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing instance network info cache due to event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.030 2 DEBUG oslo_concurrency.lockutils [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.030 2 DEBUG oslo_concurrency.lockutils [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.030 2 DEBUG nova.network.neutron [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.044 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.402 2 DEBUG nova.network.neutron [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.749 2 DEBUG nova.network.neutron [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.783 2 DEBUG oslo_concurrency.lockutils [req-0a0e54b5-6a12-4653-a5a0-042e03ba95cf req-cfac7c67-d6f4-4737-9853-f9e9f6a6882c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.783 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquired lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:00:25 np0005476733 nova_compute[192580]: 2025-10-08 16:00:25.784 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:00:26 np0005476733 nova_compute[192580]: 2025-10-08 16:00:26.113 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:00:26 np0005476733 podman[247760]: 2025-10-08 16:00:26.233521412 +0000 UTC m=+0.064483282 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:26.349 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:26.350 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:26.350 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:26 np0005476733 nova_compute[192580]: 2025-10-08 16:00:26.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.807 2 DEBUG nova.network.neutron [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [{"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.885 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Releasing lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.886 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance network_info: |[{"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.890 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Start _get_guest_xml network_info=[{"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.896 2 WARNING nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.901 2 DEBUG nova.virt.libvirt.host [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.902 2 DEBUG nova.virt.libvirt.host [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.908 2 DEBUG nova.virt.libvirt.host [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.909 2 DEBUG nova.virt.libvirt.host [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.910 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.910 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.911 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.911 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.912 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.912 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.912 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.913 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.913 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.914 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.914 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.915 2 DEBUG nova.virt.hardware [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.921 2 DEBUG nova.virt.libvirt.vif [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:00:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2082575820',display_name='tempest-server-test-2082575820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2082575820',id=76,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-q49mk07u',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10'
,image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:00:18Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.921 2 DEBUG nova.network.os_vif_util [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.922 2 DEBUG nova.network.os_vif_util [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.927 2 DEBUG nova.objects.instance [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'pci_devices' on Instance uuid 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.954 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <uuid>60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f</uuid>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <name>instance-0000004c</name>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-2082575820</nova:name>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:00:27</nova:creationTime>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:user uuid="bf4219ece8f54f268b2ece84f150d555">tempest-QosTestOvn-1026583770-project-member</nova:user>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:project uuid="13cdd2bb6c7648f5ab8709ff695b5cda">tempest-QosTestOvn-1026583770</nova:project>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        <nova:port uuid="f741863c-37e3-4d41-b47d-a825b21d9eac">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="serial">60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="uuid">60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.config"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:b6:1d:5f"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <target dev="tapf741863c-37"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/console.log" append="off"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:00:27 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:00:27 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:00:27 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:00:27 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.956 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Preparing to wait for external event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.956 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.956 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.957 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.957 2 DEBUG nova.virt.libvirt.vif [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:00:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2082575820',display_name='tempest-server-test-2082575820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2082575820',id=76,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-q49mk07u',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min
_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:00:18Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.958 2 DEBUG nova.network.os_vif_util [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.958 2 DEBUG nova.network.os_vif_util [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.959 2 DEBUG os_vif [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.964 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf741863c-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.964 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf741863c-37, col_values=(('external_ids', {'iface-id': 'f741863c-37e3-4d41-b47d-a825b21d9eac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:1d:5f', 'vm-uuid': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:27 np0005476733 NetworkManager[51699]: <info>  [1759939227.9677] manager: (tapf741863c-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:27 np0005476733 nova_compute[192580]: 2025-10-08 16:00:27.974 2 INFO os_vif [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37')#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.057 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.058 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.059 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] No VIF found with MAC fa:16:3e:b6:1d:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.059 2 INFO nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Using config drive#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.676 2 INFO nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Creating config drive at /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.config#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.686 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8io4ubg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.821 2 DEBUG oslo_concurrency.processutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr8io4ubg" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:00:28 np0005476733 kernel: tapf741863c-37: entered promiscuous mode
Oct  8 12:00:28 np0005476733 NetworkManager[51699]: <info>  [1759939228.8944] manager: (tapf741863c-37): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:28Z|00713|binding|INFO|Claiming lport f741863c-37e3-4d41-b47d-a825b21d9eac for this chassis.
Oct  8 12:00:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:28Z|00714|binding|INFO|f741863c-37e3-4d41-b47d-a825b21d9eac: Claiming fa:16:3e:b6:1d:5f 10.100.0.28
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.909 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:1d:5f 10.100.0.28'], port_security=['fa:16:3e:b6:1d:5f 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '2', 'neutron:security_group_ids': '495dd58b-359c-4273-9624-4fe93a315db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13531c51-2495-4720-8dcb-3418781478cb, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f741863c-37e3-4d41-b47d-a825b21d9eac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.910 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f741863c-37e3-4d41-b47d-a825b21d9eac in datapath a5aa5041-0bfa-4eb8-8951-cc523b99ac9a bound to our chassis#033[00m
Oct  8 12:00:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:28Z|00715|binding|INFO|Setting lport f741863c-37e3-4d41-b47d-a825b21d9eac ovn-installed in OVS
Oct  8 12:00:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:28Z|00716|binding|INFO|Setting lport f741863c-37e3-4d41-b47d-a825b21d9eac up in Southbound
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.912 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a5aa5041-0bfa-4eb8-8951-cc523b99ac9a#033[00m
Oct  8 12:00:28 np0005476733 nova_compute[192580]: 2025-10-08 16:00:28.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:28 np0005476733 systemd-udevd[247797]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.926 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b58dbeac-fa2c-4515-b419-819def22d59e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.927 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa5aa5041-01 in ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.929 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa5aa5041-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.929 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7335ec75-524d-4713-84c6-c21df44c1603]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.930 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f580f9f8-b6c6-440e-95c3-20e06e3d45af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:28 np0005476733 NetworkManager[51699]: <info>  [1759939228.9383] device (tapf741863c-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:00:28 np0005476733 NetworkManager[51699]: <info>  [1759939228.9393] device (tapf741863c-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:00:28 np0005476733 systemd-machined[152624]: New machine qemu-46-instance-0000004c.
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.944 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6b8769-47dc-45a9-9c45-cf7a80fe9b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:28 np0005476733 systemd[1]: Started Virtual Machine qemu-46-instance-0000004c.
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.963 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aa90e08a-7eb4-43ad-b952-fdca110207b9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:28.996 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d98ad165-c7a3-4644-975e-a8fddc81c7dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 NetworkManager[51699]: <info>  [1759939229.0050] manager: (tapa5aa5041-00): new Veth device (/org/freedesktop/NetworkManager/Devices/242)
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.005 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7855d0c8-e655-494d-8a0b-86df7fbf08e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.039 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[676ab060-f375-4b1b-8278-09d6a8a0852c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.043 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2c434d-0e57-407d-bb44-0fa9f70ddac5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 NetworkManager[51699]: <info>  [1759939229.0727] device (tapa5aa5041-00): carrier: link connected
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.076 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1e20db81-c803-4364-a541-9330657d99fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.097 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0a2f5d-389e-4f94-8293-92bb92542aa9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa5aa5041-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:44:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613273, 'reachable_time': 23926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247832, 'error': None, 'target': 'ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.119 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c626ec45-1856-4f10-835f-ec906a3b9336]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:44c9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 613273, 'tstamp': 613273}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247838, 'error': None, 'target': 'ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.143 2 DEBUG nova.compute.manager [req-b566d978-2730-4c66-a075-1c4d8626b0a5 req-155cd256-79a8-4cb0-b7cb-1ea031b5f777 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.143 2 DEBUG oslo_concurrency.lockutils [req-b566d978-2730-4c66-a075-1c4d8626b0a5 req-155cd256-79a8-4cb0-b7cb-1ea031b5f777 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.144 2 DEBUG oslo_concurrency.lockutils [req-b566d978-2730-4c66-a075-1c4d8626b0a5 req-155cd256-79a8-4cb0-b7cb-1ea031b5f777 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.144 2 DEBUG oslo_concurrency.lockutils [req-b566d978-2730-4c66-a075-1c4d8626b0a5 req-155cd256-79a8-4cb0-b7cb-1ea031b5f777 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.145 2 DEBUG nova.compute.manager [req-b566d978-2730-4c66-a075-1c4d8626b0a5 req-155cd256-79a8-4cb0-b7cb-1ea031b5f777 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Processing event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.146 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7331ee27-f21d-42a4-a190-0cbc6dec6277]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa5aa5041-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:44:c9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613273, 'reachable_time': 23926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247839, 'error': None, 'target': 'ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.188 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[70a7eaac-3f58-4d6a-a468-34365407163e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.258 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[38e806e8-4296-40ff-be61-19daee4ceeb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.260 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5aa5041-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.261 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.262 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5aa5041-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:29 np0005476733 NetworkManager[51699]: <info>  [1759939229.2649] manager: (tapa5aa5041-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:29 np0005476733 kernel: tapa5aa5041-00: entered promiscuous mode
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.270 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa5aa5041-00, col_values=(('external_ids', {'iface-id': '3768e588-92b3-476c-93ce-9170fa6601f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:29 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:29Z|00717|binding|INFO|Releasing lport 3768e588-92b3-476c-93ce-9170fa6601f5 from this chassis (sb_readonly=0)
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.276 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a5aa5041-0bfa-4eb8-8951-cc523b99ac9a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a5aa5041-0bfa-4eb8-8951-cc523b99ac9a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.277 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0f792514-4861-446e-90c4-a824519bd75b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.278 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/a5aa5041-0bfa-4eb8-8951-cc523b99ac9a.pid.haproxy
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID a5aa5041-0bfa-4eb8-8951-cc523b99ac9a
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:00:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:29.279 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'env', 'PROCESS_TAG=haproxy-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a5aa5041-0bfa-4eb8-8951-cc523b99ac9a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.637 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.642 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939229.6370099, 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.644 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] VM Started (Lifecycle Event)#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.646 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.649 2 INFO nova.virt.libvirt.driver [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance spawned successfully.#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.649 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:00:29 np0005476733 podman[247872]: 2025-10-08 16:00:29.663857315 +0000 UTC m=+0.053590963 container create 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.669 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.674 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.677 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.678 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.678 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.678 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.679 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.679 2 DEBUG nova.virt.libvirt.driver [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.694 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.694 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939229.6381087, 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.694 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:00:29 np0005476733 systemd[1]: Started libpod-conmon-0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16.scope.
Oct  8 12:00:29 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:00:29 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06da9a7aa1dea972cda8093b2d09a3619f7424e282a51fb848f26c626c50b8f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.718 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.722 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939229.6438997, 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.723 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:00:29 np0005476733 podman[247872]: 2025-10-08 16:00:29.635689235 +0000 UTC m=+0.025422903 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:00:29 np0005476733 podman[247872]: 2025-10-08 16:00:29.729077549 +0000 UTC m=+0.118811207 container init 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:00:29 np0005476733 podman[247872]: 2025-10-08 16:00:29.741250298 +0000 UTC m=+0.130983946 container start 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.754 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.757 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.763 2 INFO nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Took 10.83 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.764 2 DEBUG nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:29 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [NOTICE]   (247892) : New worker (247894) forked
Oct  8 12:00:29 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [NOTICE]   (247892) : Loading success.
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.780 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.830 2 INFO nova.compute.manager [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Took 12.11 seconds to build instance.#033[00m
Oct  8 12:00:29 np0005476733 nova_compute[192580]: 2025-10-08 16:00:29.849 2 DEBUG oslo_concurrency.lockutils [None req-21d695ff-618d-4403-aa12-75d04d07b692 bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:30 np0005476733 nova_compute[192580]: 2025-10-08 16:00:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.306 2 DEBUG nova.compute.manager [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.308 2 DEBUG oslo_concurrency.lockutils [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.309 2 DEBUG oslo_concurrency.lockutils [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.310 2 DEBUG oslo_concurrency.lockutils [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.310 2 DEBUG nova.compute.manager [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] No waiting events found dispatching network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.311 2 WARNING nova.compute.manager [req-1fbe2398-2ef6-4065-81ef-99e8d14fa998 req-ffa9c093-6aa4-4827-a1b4-0345d476474f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received unexpected event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac for instance with vm_state active and task_state None.#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:31 np0005476733 nova_compute[192580]: 2025-10-08 16:00:31.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:32 np0005476733 nova_compute[192580]: 2025-10-08 16:00:32.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:32 np0005476733 nova_compute[192580]: 2025-10-08 16:00:32.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:00:32 np0005476733 nova_compute[192580]: 2025-10-08 16:00:32.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:34 np0005476733 podman[247905]: 2025-10-08 16:00:34.249594362 +0000 UTC m=+0.070915917 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 12:00:34 np0005476733 podman[247904]: 2025-10-08 16:00:34.275974164 +0000 UTC m=+0.092591909 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 12:00:34 np0005476733 nova_compute[192580]: 2025-10-08 16:00:34.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:00:35 np0005476733 nova_compute[192580]: 2025-10-08 16:00:35.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:35.048 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:00:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:35.050 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.051 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'name': 'tempest-server-test-2082575820', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'hostId': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.051 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.070 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/cpu volume: 6110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d8d2a65-27ce-440d-ab65-f94534e5a5c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6110000000, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'timestamp': '2025-10-08T16:00:36.052126', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'ed6d8218-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.793684421, 'message_signature': '877924d7783a9c4473f4f98830ed97ce585a36c09c4b4c754e85a41bbd96db24'}]}, 'timestamp': '2025-10-08 16:00:36.071263', '_unique_id': '4eabd7441d36429caa6871c798ccba00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.072 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.095 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.requests volume: 5688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.096 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '083079ac-116b-4664-b7f9-80b5ee513f0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 5688, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.074179', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed7149ca-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'ad3f3ababbf43047d647e5bb9e28d119219140552096f20db4de8bac26bc7fa9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 
'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.074179', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed7157c6-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'dcf43773712b64b841d708714deefbb58e992282341a17d26b725b0d733335a6'}]}, 'timestamp': '2025-10-08 16:00:36.096317', '_unique_id': '949eab67f5ec44e9b873eadf4a6a839a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.109 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.110 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b3e594c-934f-4c00-ac2b-91f688ee9bbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.098564', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed737a10-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': 'dff4ac19d5237901c37e812e34b28355696dd7a3de7237f5c3f8e92581227f4c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 
'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.098564', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed738708-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': '728dec2a146227a712ff436fbff3ab1b106d2c0425fb94befa59b296f8f94823'}]}, 'timestamp': '2025-10-08 16:00:36.110630', '_unique_id': '482e34fe6d924711b683efa91322d6c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.112 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '611f4134-0406-4b98-9773-eb3b426e48d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.112935', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed73ec8e-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': '6a878ea8e4206130e32dc82b3c07025ed9970790a7d44664e523d689ddac3399'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 
'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.112935', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed73f634-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': '4fda763b26820cee6c224ae41414a33e7027b8227f13f1996a087b8b6d1173e7'}]}, 'timestamp': '2025-10-08 16:00:36.113458', '_unique_id': '3f4224fcad7346e39e8e659073764f54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.117 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f / tapf741863c-37 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.118 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b406c4b-8459-45d8-89bc-d196560c2835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.114942', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed74bb50-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '95e89275699e2d09599179e2549afae0cadc5e6fce98b619f69fcd7685fea8d2'}]}, 'timestamp': '2025-10-08 16:00:36.118748', '_unique_id': 'e666f9db3fdf4dbe886dade6f69f90c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.121 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.121 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2082575820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2082575820>]
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.121 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.latency volume: 7918883 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.121 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b438771c-4d57-456d-aedc-b4907ef0a6d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7918883, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.121564', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed753ecc-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '8fe5cce48baa8d3ed053cf2f5ec498e8139a94cba73624e77b5a29809878b7d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.121564', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed754a98-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'b89ac4f6b1bb5af31cc7dce57bcd0bc3e5f341d88b2bbf2e86260c0d8b7e905f'}]}, 'timestamp': '2025-10-08 16:00:36.122210', '_unique_id': '21628cca78214b81bcd22be50769cc61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.123 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee4c9e14-289e-4ed2-81e5-fbe5cebbcc34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.123755', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed7592a0-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': 'c9131d7777ba99853ee13454ea55243d5c028365dfbde7b3d2fbac45d25823fa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.123755', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed759d2c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.821556742, 'message_signature': '6111e40c88a0c63ba34a4c8b2667be513f6a365b8302a94979dfc0c8a7e7d0cf'}]}, 'timestamp': '2025-10-08 16:00:36.124313', '_unique_id': 'd10d578c31424ebaaa2cb0599cf69385'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '047803f2-18b5-4073-9488-3746300b456e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.126029', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed75eeda-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': 'e55d7525d04accc09baba92164c669226193bfbd935f3882d1879e340a1d42bf'}]}, 'timestamp': '2025-10-08 16:00:36.126435', '_unique_id': 'b65e2aa4a8a74eb3947604cdd1dc7009'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.128 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.latency volume: 2531079202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.128 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.latency volume: 3904485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7be34d2a-9056-441e-914d-476ce2773a05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2531079202, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.128310', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed7646fa-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '5e0ab2b0991bfc03a0e1896a0cba94fb7da16d035f115d2f0cca96c1d9b40f25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3904485, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.128310', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed765302-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '807de22a2fcc364f8e5efcca7eafccee4f6afe6accbc31c8157194a43ad289fd'}]}, 'timestamp': '2025-10-08 16:00:36.128974', '_unique_id': '0e5cf47616004ef9a020874c43a6fac0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.130 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2458f81b-7fe7-4a6b-aec2-4b1f8f94988b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.130810', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed76a834-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '55eb9ba73c66a579689941f602453e8a62023cd00a63eca6810c5df0333dbf19'}]}, 'timestamp': '2025-10-08 16:00:36.131196', '_unique_id': '89a099003098464da686c4e19bec4c9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03cfaa73-0406-4beb-8f3f-123a2b3d9db0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.132991', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed76fc44-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '3af8d668919a535fb4b43cc5a02a666cf43b6a898f6008c98cb1ffeeeb3b7df0'}]}, 'timestamp': '2025-10-08 16:00:36.133286', '_unique_id': '69d9c2d4b9804d4185b534805aef4586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.134 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.bytes volume: 93131776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.135 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c6266ae-e91a-4b65-86ce-c82d320c8668', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93131776, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.134910', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed7748f2-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'c8a4b3c284b6742d32dc60c9223e77977d17e94e04622a942efe63a388cef2ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.134910', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed775572-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'cdaed69b4da16dd1da4e8564597f19b17fdf6eedb06efd7eed24d127b80a7a8c'}]}, 'timestamp': '2025-10-08 16:00:36.135616', '_unique_id': '9aabadd290de41a88bbdf0248f9f5369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.137 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.137 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2082575820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2082575820>]
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.137 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.138 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48751e16-4c0e-431d-99f5-40a8defde7a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.137870', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed77bb5c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': 'c746accc84d9428b83f37ea8cd7ebd63e6f6bbf529657e3f22744a55189ce7ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.137870', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed77c8ae-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '85a7e03c19fe2b3ec348e6fcf7e81f8d3a4049a37c26698e093079a508da0655'}]}, 'timestamp': '2025-10-08 16:00:36.138551', '_unique_id': 'd78b5c9f509849c5b11b1a1cc8db4996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.140 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd04e3bc8-1048-4528-846e-8d811a9f9ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.140327', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed781a66-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': 'b4f328bc54cc6995db186ef21271d682d1eb9c2f2aced090930c5f1908520f80'}]}, 'timestamp': '2025-10-08 16:00:36.140612', '_unique_id': 'abee7d441ca248219008f5fa835a9172'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.142 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.142 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f: ceilometer.compute.pollsters.NoVolumeException
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.142 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b127f1e2-096d-4dff-94aa-1543241a275f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.142599', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed7873f8-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '83e3293a5dacfd7ce7d90b61d9981564e5d2c9a44c22f1159fdf39b689ba2fa7'}]}, 'timestamp': '2025-10-08 16:00:36.142938', '_unique_id': '8e416dc3a7b34bbb941d11719bfc00ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.144 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b68ae8b4-9819-4e22-bf8e-f7bf6af22bc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.144815', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed78cb28-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '97841820d2ac67192187e2ed946d8c63cb6c497b56f45d68f74a9dd2e42db03a'}]}, 'timestamp': '2025-10-08 16:00:36.145165', '_unique_id': 'c348fd9a005942ea968cf465d33efe0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.146 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.146 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-2082575820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2082575820>]
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.147 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.147 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-2082575820>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2082575820>]
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.147 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50e23c93-0c20-4242-a2e8-22cc6819d387', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.147593', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed79363a-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': 'c0c03c583a9383fa709bdb6de8ed74e55d5bb6804b1e612968351c4b5ee8e2aa'}]}, 'timestamp': '2025-10-08 16:00:36.147882', '_unique_id': '170ed6dd5e504a0dacdae6b2b297349f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.149 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1cb5256-182c-45a5-9c07-09c413fe4c0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.149402', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed797c6c-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': '9d7268381d9154a2117a26720d95253740d1f7a3a27f281a51f4a61a94709d89'}]}, 'timestamp': '2025-10-08 16:00:36.149677', '_unique_id': 'a01f88209d204e7b9708bd689fea74f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.163 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.163 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76da0a06-c146-4b4d-9327-966356cb2a24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-vda', 'timestamp': '2025-10-08T16:00:36.163494', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ed7ba4b0-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '145376117fc85eae908a908c740be9ef8c184bcd47c21a454ed0514ae6e65578'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-sda', 'timestamp': '2025-10-08T16:00:36.163494', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'instance-0000004c', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ed7bafbe-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.797146902, 'message_signature': '9887eae2b8b62473869787f1ce5a69aeef61c12bf1f54605bbe93571bfc33c20'}]}, 'timestamp': '2025-10-08 16:00:36.164136', '_unique_id': 'b360a35a887c4026a7d9bb8883ad93e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 DEBUG ceilometer.compute.pollsters [-] 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dcb7d5c-c4d2-415d-997b-d0ea3ea48593', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'bf4219ece8f54f268b2ece84f150d555', 'user_name': None, 'project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'project_name': None, 'resource_id': 'instance-0000004c-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-tapf741863c-37', 'timestamp': '2025-10-08T16:00:36.165998', 'resource_metadata': {'display_name': 'tempest-server-test-2082575820', 'name': 'tapf741863c-37', 'instance_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'instance_type': 'custom_neutron_guest', 'host': '73e73bbdfd9d00232e1563894754e6dca27b688972afafa68ce6d5c3', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:b6:1d:5f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf741863c-37'}, 'message_id': 'ed7c0522-a45f-11f0-9274-fa163ef67048', 'monotonic_time': 6139.837929845, 'message_signature': 'b8e752396d7bf1379facad530e25c29575b6ab9b8d3b41a234106517894ff990'}]}, 'timestamp': '2025-10-08 16:00:36.166310', '_unique_id': '2109446d56954004946e60a8690189ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:00:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:00:36 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 12:00:36 np0005476733 nova_compute[192580]: 2025-10-08 16:00:36.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:37.052 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.433 2 DEBUG nova.compute.manager [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.434 2 DEBUG nova.compute.manager [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing instance network info cache due to event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.434 2 DEBUG oslo_concurrency.lockutils [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.435 2 DEBUG oslo_concurrency.lockutils [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.435 2 DEBUG nova.network.neutron [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:00:37 np0005476733 nova_compute[192580]: 2025-10-08 16:00:37.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:39 np0005476733 nova_compute[192580]: 2025-10-08 16:00:39.330 2 DEBUG nova.network.neutron [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updated VIF entry in instance network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:00:39 np0005476733 nova_compute[192580]: 2025-10-08 16:00:39.332 2 DEBUG nova.network.neutron [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [{"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:39 np0005476733 nova_compute[192580]: 2025-10-08 16:00:39.684 2 DEBUG oslo_concurrency.lockutils [req-c9543e1a-914c-4cac-b4a9-350bf2a27422 req-25a92dfd-d697-4ca0-a210-48a15dc1d1ee 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:00:40 np0005476733 nova_compute[192580]: 2025-10-08 16:00:40.442 2 DEBUG nova.compute.manager [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:40 np0005476733 nova_compute[192580]: 2025-10-08 16:00:40.443 2 DEBUG nova.compute.manager [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing instance network info cache due to event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:00:40 np0005476733 nova_compute[192580]: 2025-10-08 16:00:40.444 2 DEBUG oslo_concurrency.lockutils [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:00:40 np0005476733 nova_compute[192580]: 2025-10-08 16:00:40.444 2 DEBUG oslo_concurrency.lockutils [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:00:40 np0005476733 nova_compute[192580]: 2025-10-08 16:00:40.444 2 DEBUG nova.network.neutron [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.484 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.486 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.487 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.487 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.488 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.490 2 INFO nova.compute.manager [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Terminating instance#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.492 2 DEBUG nova.compute.manager [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:00:41 np0005476733 kernel: tapf741863c-37 (unregistering): left promiscuous mode
Oct  8 12:00:41 np0005476733 NetworkManager[51699]: <info>  [1759939241.5261] device (tapf741863c-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:00:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:41Z|00718|binding|INFO|Releasing lport f741863c-37e3-4d41-b47d-a825b21d9eac from this chassis (sb_readonly=0)
Oct  8 12:00:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:41Z|00719|binding|INFO|Setting lport f741863c-37e3-4d41-b47d-a825b21d9eac down in Southbound
Oct  8 12:00:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:00:41Z|00720|binding|INFO|Removing iface tapf741863c-37 ovn-installed in OVS
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.558 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:1d:5f 10.100.0.28'], port_security=['fa:16:3e:b6:1d:5f 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13cdd2bb6c7648f5ab8709ff695b5cda', 'neutron:revision_number': '4', 'neutron:security_group_ids': '495dd58b-359c-4273-9624-4fe93a315db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13531c51-2495-4720-8dcb-3418781478cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f741863c-37e3-4d41-b47d-a825b21d9eac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.560 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f741863c-37e3-4d41-b47d-a825b21d9eac in datapath a5aa5041-0bfa-4eb8-8951-cc523b99ac9a unbound from our chassis#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.562 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.564 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e16ef4d9-1969-4d9b-97c5-ce6a64b90028]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.564 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a namespace which is not needed anymore#033[00m
Oct  8 12:00:41 np0005476733 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Oct  8 12:00:41 np0005476733 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000004c.scope: Consumed 13.126s CPU time.
Oct  8 12:00:41 np0005476733 systemd-machined[152624]: Machine qemu-46-instance-0000004c terminated.
Oct  8 12:00:41 np0005476733 podman[247956]: 2025-10-08 16:00:41.624077509 +0000 UTC m=+0.062907162 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:00:41 np0005476733 podman[247957]: 2025-10-08 16:00:41.632440376 +0000 UTC m=+0.065766493 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:00:41 np0005476733 podman[247952]: 2025-10-08 16:00:41.637851649 +0000 UTC m=+0.081027240 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 12:00:41 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [NOTICE]   (247892) : haproxy version is 2.8.14-c23fe91
Oct  8 12:00:41 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [NOTICE]   (247892) : path to executable is /usr/sbin/haproxy
Oct  8 12:00:41 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [WARNING]  (247892) : Exiting Master process...
Oct  8 12:00:41 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [ALERT]    (247892) : Current worker (247894) exited with code 143 (Terminated)
Oct  8 12:00:41 np0005476733 neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a[247886]: [WARNING]  (247892) : All workers exited. Exiting... (0)
Oct  8 12:00:41 np0005476733 systemd[1]: libpod-0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16.scope: Deactivated successfully.
Oct  8 12:00:41 np0005476733 podman[248040]: 2025-10-08 16:00:41.708762745 +0000 UTC m=+0.054066308 container died 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 12:00:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16-userdata-shm.mount: Deactivated successfully.
Oct  8 12:00:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay-06da9a7aa1dea972cda8093b2d09a3619f7424e282a51fb848f26c626c50b8f2-merged.mount: Deactivated successfully.
Oct  8 12:00:41 np0005476733 podman[248040]: 2025-10-08 16:00:41.754697893 +0000 UTC m=+0.100001466 container cleanup 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:00:41 np0005476733 systemd[1]: libpod-conmon-0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16.scope: Deactivated successfully.
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.778 2 INFO nova.virt.libvirt.driver [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Instance destroyed successfully.#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.779 2 DEBUG nova.objects.instance [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lazy-loading 'resources' on Instance uuid 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.814 2 DEBUG nova.network.neutron [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updated VIF entry in instance network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.815 2 DEBUG nova.network.neutron [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [{"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:41 np0005476733 podman[248089]: 2025-10-08 16:00:41.834803033 +0000 UTC m=+0.052138227 container remove 0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.840 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[39d0964f-069d-437e-ad45-addeb845294a]: (4, ('Wed Oct  8 04:00:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a (0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16)\n0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16\nWed Oct  8 04:00:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a (0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16)\n0ff9344e6845259e09e5b1a351c894c223a84f0f0f5206a379a1d1b43e046c16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.842 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a96528d1-70fa-4b6c-ad49-8314084e9d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.843 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5aa5041-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.847 2 DEBUG nova.virt.libvirt.vif [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:00:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2082575820',display_name='tempest-server-test-2082575820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2082575820',id=76,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGmGuURdoBH7+8UKntqBm5AWKwSqVw41oQIfoqZW4juzRa+zLIDUZQk+8q96NsvV1QhNKcV4HhEHGQj7RYtO04Z0WfqqmlMfeVZDrcQlemJhjx+knV/dWY2Bcp0Y0lXzvQ==',key_name='tempest-keypair-test-1490490112',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:00:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='13cdd2bb6c7648f5ab8709ff695b5cda',ramdisk_id='',reservation_id='r-q49mk07u',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestOvn-1026583770',owner_user_name='tempest-QosTestOvn-1026583770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:00:29Z,user_data=None,user_id='bf4219ece8f54f268b2ece84f150d555',uuid=60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.847 2 DEBUG nova.network.os_vif_util [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converting VIF {"id": "f741863c-37e3-4d41-b47d-a825b21d9eac", "address": "fa:16:3e:b6:1d:5f", "network": {"id": "a5aa5041-0bfa-4eb8-8951-cc523b99ac9a", "bridge": "br-int", "label": "tempest-test-network--152303658", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "13cdd2bb6c7648f5ab8709ff695b5cda", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf741863c-37", "ovs_interfaceid": "f741863c-37e3-4d41-b47d-a825b21d9eac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.848 2 DEBUG nova.network.os_vif_util [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.848 2 DEBUG os_vif [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.851 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf741863c-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.855 2 DEBUG oslo_concurrency.lockutils [req-4ab26ed2-542c-43ee-aa66-3f6592a1969d req-564b1803-01fa-4122-ab0f-ff496f84d2ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 kernel: tapa5aa5041-00: left promiscuous mode
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.876 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0af85b98-f89e-4fb0-9de5-784d41604a37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.877 2 INFO os_vif [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:1d:5f,bridge_name='br-int',has_traffic_filtering=True,id=f741863c-37e3-4d41-b47d-a825b21d9eac,network=Network(a5aa5041-0bfa-4eb8-8951-cc523b99ac9a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf741863c-37')#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.878 2 INFO nova.virt.libvirt.driver [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Deleting instance files /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f_del#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.879 2 INFO nova.virt.libvirt.driver [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Deletion of /var/lib/nova/instances/60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f_del complete#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.885 2 DEBUG nova.compute.manager [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-unplugged-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.886 2 DEBUG oslo_concurrency.lockutils [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.886 2 DEBUG oslo_concurrency.lockutils [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.886 2 DEBUG oslo_concurrency.lockutils [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.886 2 DEBUG nova.compute.manager [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] No waiting events found dispatching network-vif-unplugged-f741863c-37e3-4d41-b47d-a825b21d9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.886 2 DEBUG nova.compute.manager [req-a63d6a51-e44e-4026-99f0-1e69edd48ac9 req-cd9a5bdf-3138-4bd7-9f9a-0ec83500221f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-unplugged-f741863c-37e3-4d41-b47d-a825b21d9eac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.921 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7a531624-861e-4b8c-8136-3e7c5dc64cca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.924 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[26dcba66-6398-4696-837c-6a1a894e71a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.942 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d44804-5bc5-4fac-b1e6-40be4b18a090]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 613265, 'reachable_time': 35044, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248104, 'error': None, 'target': 'ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 systemd[1]: run-netns-ovnmeta\x2da5aa5041\x2d0bfa\x2d4eb8\x2d8951\x2dcc523b99ac9a.mount: Deactivated successfully.
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.945 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a5aa5041-0bfa-4eb8-8951-cc523b99ac9a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:00:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:00:41.945 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[d9022b06-a8b8-4c10-b6e5-2d0896058f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.978 2 INFO nova.compute.manager [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.979 2 DEBUG oslo.service.loopingcall [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.979 2 DEBUG nova.compute.manager [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:00:41 np0005476733 nova_compute[192580]: 2025-10-08 16:00:41.980 2 DEBUG nova.network.neutron [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:00:42 np0005476733 nova_compute[192580]: 2025-10-08 16:00:42.620 2 DEBUG nova.compute.manager [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:42 np0005476733 nova_compute[192580]: 2025-10-08 16:00:42.621 2 DEBUG nova.compute.manager [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing instance network info cache due to event network-changed-f741863c-37e3-4d41-b47d-a825b21d9eac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:00:42 np0005476733 nova_compute[192580]: 2025-10-08 16:00:42.621 2 DEBUG oslo_concurrency.lockutils [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:00:42 np0005476733 nova_compute[192580]: 2025-10-08 16:00:42.622 2 DEBUG oslo_concurrency.lockutils [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:00:42 np0005476733 nova_compute[192580]: 2025-10-08 16:00:42.622 2 DEBUG nova.network.neutron [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Refreshing network info cache for port f741863c-37e3-4d41-b47d-a825b21d9eac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.121 2 INFO nova.network.neutron [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Port f741863c-37e3-4d41-b47d-a825b21d9eac from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.121 2 DEBUG nova.network.neutron [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.260 2 DEBUG oslo_concurrency.lockutils [req-236d0662-bfc7-4bb5-b0ed-94fbe4304044 req-1cdeadb4-fbb9-4647-9983-c0bb0e96e5d9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.630 2 DEBUG nova.network.neutron [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.678 2 INFO nova.compute.manager [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Took 1.70 seconds to deallocate network for instance.#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.923 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.924 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:43 np0005476733 nova_compute[192580]: 2025-10-08 16:00:43.993 2 DEBUG nova.compute.provider_tree [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.063 2 DEBUG nova.compute.manager [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.063 2 DEBUG oslo_concurrency.lockutils [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.064 2 DEBUG oslo_concurrency.lockutils [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.064 2 DEBUG oslo_concurrency.lockutils [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.064 2 DEBUG nova.compute.manager [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] No waiting events found dispatching network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.064 2 WARNING nova.compute.manager [req-9e8bdfaf-de9d-4cca-92a5-a38278250497 req-bd094959-5593-40dd-a420-674cf597d380 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received unexpected event network-vif-plugged-f741863c-37e3-4d41-b47d-a825b21d9eac for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.088 2 DEBUG nova.scheduler.client.report [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.265 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.488 2 INFO nova.scheduler.client.report [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Deleted allocations for instance 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.925 2 DEBUG oslo_concurrency.lockutils [None req-5b0c9172-578d-4616-90a1-9c810edd699d bf4219ece8f54f268b2ece84f150d555 13cdd2bb6c7648f5ab8709ff695b5cda - - default default] Lock "60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:00:44 np0005476733 nova_compute[192580]: 2025-10-08 16:00:44.943 2 DEBUG nova.compute.manager [req-1a30a15f-73f6-4cc5-9444-d499e08e54cd req-b3aa9404-ea52-4ad9-a8c2-f4f5d8c5c1be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Received event network-vif-deleted-f741863c-37e3-4d41-b47d-a825b21d9eac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:00:46 np0005476733 nova_compute[192580]: 2025-10-08 16:00:46.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:46 np0005476733 nova_compute[192580]: 2025-10-08 16:00:46.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:49 np0005476733 podman[248106]: 2025-10-08 16:00:49.219219618 +0000 UTC m=+0.048633755 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:00:49 np0005476733 podman[248105]: 2025-10-08 16:00:49.251995115 +0000 UTC m=+0.083680835 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:00:51 np0005476733 nova_compute[192580]: 2025-10-08 16:00:51.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:51 np0005476733 nova_compute[192580]: 2025-10-08 16:00:51.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:56 np0005476733 nova_compute[192580]: 2025-10-08 16:00:56.773 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759939241.7720585, 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:00:56 np0005476733 nova_compute[192580]: 2025-10-08 16:00:56.774 2 INFO nova.compute.manager [-] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:00:56 np0005476733 nova_compute[192580]: 2025-10-08 16:00:56.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:56 np0005476733 nova_compute[192580]: 2025-10-08 16:00:56.864 2 DEBUG nova.compute.manager [None req-2b39894d-715f-4ea2-b346-c26d2f4e5ea7 - - - - - -] [instance: 60bcebc4-ec32-4fd3-91e9-dc7e588c2d7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:00:56 np0005476733 nova_compute[192580]: 2025-10-08 16:00:56.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:00:57 np0005476733 podman[248151]: 2025-10-08 16:00:57.286399521 +0000 UTC m=+0.108160257 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:01:01 np0005476733 nova_compute[192580]: 2025-10-08 16:01:01.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:01 np0005476733 nova_compute[192580]: 2025-10-08 16:01:01.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:03 np0005476733 nova_compute[192580]: 2025-10-08 16:01:03.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:04 np0005476733 nova_compute[192580]: 2025-10-08 16:01:04.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:05 np0005476733 podman[248185]: 2025-10-08 16:01:05.235950255 +0000 UTC m=+0.064454620 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:01:05 np0005476733 podman[248184]: 2025-10-08 16:01:05.286989956 +0000 UTC m=+0.118327762 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:01:06 np0005476733 nova_compute[192580]: 2025-10-08 16:01:06.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:06 np0005476733 nova_compute[192580]: 2025-10-08 16:01:06.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:07 np0005476733 nova_compute[192580]: 2025-10-08 16:01:07.606 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:07 np0005476733 nova_compute[192580]: 2025-10-08 16:01:07.606 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:01:07 np0005476733 nova_compute[192580]: 2025-10-08 16:01:07.651 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:01:08 np0005476733 nova_compute[192580]: 2025-10-08 16:01:08.633 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:09 np0005476733 nova_compute[192580]: 2025-10-08 16:01:09.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:11 np0005476733 nova_compute[192580]: 2025-10-08 16:01:11.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:11 np0005476733 nova_compute[192580]: 2025-10-08 16:01:11.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:12 np0005476733 podman[248228]: 2025-10-08 16:01:12.267488603 +0000 UTC m=+0.075149163 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc.)
Oct  8 12:01:12 np0005476733 podman[248227]: 2025-10-08 16:01:12.297099818 +0000 UTC m=+0.105749410 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:01:12 np0005476733 podman[248226]: 2025-10-08 16:01:12.314412132 +0000 UTC m=+0.123334693 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 12:01:13 np0005476733 nova_compute[192580]: 2025-10-08 16:01:13.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:13 np0005476733 nova_compute[192580]: 2025-10-08 16:01:13.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:01:13 np0005476733 nova_compute[192580]: 2025-10-08 16:01:13.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:01:13 np0005476733 nova_compute[192580]: 2025-10-08 16:01:13.749 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:01:16 np0005476733 nova_compute[192580]: 2025-10-08 16:01:16.024 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:16 np0005476733 nova_compute[192580]: 2025-10-08 16:01:16.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:16 np0005476733 nova_compute[192580]: 2025-10-08 16:01:16.812 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:16 np0005476733 nova_compute[192580]: 2025-10-08 16:01:16.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:18 np0005476733 nova_compute[192580]: 2025-10-08 16:01:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:20 np0005476733 podman[248290]: 2025-10-08 16:01:20.260985521 +0000 UTC m=+0.079124470 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:01:20 np0005476733 podman[248289]: 2025-10-08 16:01:20.265405152 +0000 UTC m=+0.086174864 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 12:01:21 np0005476733 nova_compute[192580]: 2025-10-08 16:01:21.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:21 np0005476733 nova_compute[192580]: 2025-10-08 16:01:21.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:01:21 np0005476733 nova_compute[192580]: 2025-10-08 16:01:21.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:01:21Z|00721|pinctrl|WARN|Dropped 1211 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:01:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:01:21Z|00722|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:01:21 np0005476733 nova_compute[192580]: 2025-10-08 16:01:21.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.783 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.784 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13791MB free_disk=111.33189010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.784 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.785 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.857 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.858 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.876 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.911 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.911 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.930 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.958 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:01:23 np0005476733 nova_compute[192580]: 2025-10-08 16:01:23.985 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:01:24 np0005476733 nova_compute[192580]: 2025-10-08 16:01:24.026 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:01:24 np0005476733 nova_compute[192580]: 2025-10-08 16:01:24.082 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:01:24 np0005476733 nova_compute[192580]: 2025-10-08 16:01:24.082 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:01:25 np0005476733 nova_compute[192580]: 2025-10-08 16:01:25.083 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:01:26.350 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:01:26.351 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:01:26.352 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:01:26 np0005476733 nova_compute[192580]: 2025-10-08 16:01:26.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:26 np0005476733 nova_compute[192580]: 2025-10-08 16:01:26.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:28 np0005476733 podman[248334]: 2025-10-08 16:01:28.267813986 +0000 UTC m=+0.093533100 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Oct  8 12:01:31 np0005476733 nova_compute[192580]: 2025-10-08 16:01:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:01:31 np0005476733 nova_compute[192580]: 2025-10-08 16:01:31.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:31 np0005476733 nova_compute[192580]: 2025-10-08 16:01:31.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:36 np0005476733 podman[248355]: 2025-10-08 16:01:36.244952681 +0000 UTC m=+0.068480279 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:01:36 np0005476733 podman[248354]: 2025-10-08 16:01:36.310436315 +0000 UTC m=+0.130141911 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:01:36 np0005476733 nova_compute[192580]: 2025-10-08 16:01:36.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:36 np0005476733 nova_compute[192580]: 2025-10-08 16:01:36.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:41 np0005476733 nova_compute[192580]: 2025-10-08 16:01:41.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:41 np0005476733 nova_compute[192580]: 2025-10-08 16:01:41.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:43 np0005476733 podman[248402]: 2025-10-08 16:01:43.224916938 +0000 UTC m=+0.053135099 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:01:43 np0005476733 podman[248401]: 2025-10-08 16:01:43.243725789 +0000 UTC m=+0.075620868 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:01:43 np0005476733 podman[248403]: 2025-10-08 16:01:43.247160549 +0000 UTC m=+0.071487156 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 12:01:46 np0005476733 nova_compute[192580]: 2025-10-08 16:01:46.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:46 np0005476733 nova_compute[192580]: 2025-10-08 16:01:46.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:49 np0005476733 ovn_controller[94857]: 2025-10-08T16:01:49Z|00723|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Oct  8 12:01:51 np0005476733 podman[248468]: 2025-10-08 16:01:51.270016585 +0000 UTC m=+0.079953836 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:01:51 np0005476733 podman[248467]: 2025-10-08 16:01:51.269626512 +0000 UTC m=+0.082685723 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:01:51 np0005476733 nova_compute[192580]: 2025-10-08 16:01:51.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:51 np0005476733 nova_compute[192580]: 2025-10-08 16:01:51.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:56 np0005476733 nova_compute[192580]: 2025-10-08 16:01:56.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:56 np0005476733 nova_compute[192580]: 2025-10-08 16:01:56.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:01:59 np0005476733 podman[248509]: 2025-10-08 16:01:59.295713103 +0000 UTC m=+0.115239023 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:02:01 np0005476733 nova_compute[192580]: 2025-10-08 16:02:01.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:01 np0005476733 nova_compute[192580]: 2025-10-08 16:02:01.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:06 np0005476733 nova_compute[192580]: 2025-10-08 16:02:06.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:06 np0005476733 nova_compute[192580]: 2025-10-08 16:02:06.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:07 np0005476733 podman[248529]: 2025-10-08 16:02:07.283119697 +0000 UTC m=+0.094872143 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 12:02:07 np0005476733 podman[248528]: 2025-10-08 16:02:07.302388874 +0000 UTC m=+0.130386729 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 12:02:08 np0005476733 nova_compute[192580]: 2025-10-08 16:02:08.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:11 np0005476733 nova_compute[192580]: 2025-10-08 16:02:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:11 np0005476733 nova_compute[192580]: 2025-10-08 16:02:11.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:11 np0005476733 nova_compute[192580]: 2025-10-08 16:02:11.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:14 np0005476733 podman[248576]: 2025-10-08 16:02:14.266698188 +0000 UTC m=+0.075256186 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:02:14 np0005476733 podman[248575]: 2025-10-08 16:02:14.273592418 +0000 UTC m=+0.091737963 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:02:14 np0005476733 podman[248577]: 2025-10-08 16:02:14.296167779 +0000 UTC m=+0.102560718 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350)
Oct  8 12:02:15 np0005476733 nova_compute[192580]: 2025-10-08 16:02:15.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:15 np0005476733 nova_compute[192580]: 2025-10-08 16:02:15.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:02:15 np0005476733 nova_compute[192580]: 2025-10-08 16:02:15.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:02:15 np0005476733 nova_compute[192580]: 2025-10-08 16:02:15.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:02:16 np0005476733 nova_compute[192580]: 2025-10-08 16:02:16.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:16 np0005476733 nova_compute[192580]: 2025-10-08 16:02:16.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:16 np0005476733 nova_compute[192580]: 2025-10-08 16:02:16.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:20 np0005476733 nova_compute[192580]: 2025-10-08 16:02:20.593 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:21 np0005476733 nova_compute[192580]: 2025-10-08 16:02:21.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:21 np0005476733 nova_compute[192580]: 2025-10-08 16:02:21.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:02:21 np0005476733 nova_compute[192580]: 2025-10-08 16:02:21.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:21 np0005476733 nova_compute[192580]: 2025-10-08 16:02:21.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:22 np0005476733 podman[248639]: 2025-10-08 16:02:22.232893754 +0000 UTC m=+0.053863132 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:02:22 np0005476733 podman[248638]: 2025-10-08 16:02:22.258026077 +0000 UTC m=+0.085507984 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, config_id=iscsid, org.label-schema.schema-version=1.0)
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.647 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.647 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.648 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.648 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.777 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.778 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13800MB free_disk=111.33189010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.778 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.779 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.843 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.843 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.871 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.900 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.901 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:02:24 np0005476733 nova_compute[192580]: 2025-10-08 16:02:24.901 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:02:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:02:25Z|00724|pinctrl|WARN|Dropped 47 log messages in last 64 seconds (most recently, 19 seconds ago) due to excessive rate
Oct  8 12:02:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:02:25Z|00725|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:26.352 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:26.352 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:26.353 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:02:26 np0005476733 nova_compute[192580]: 2025-10-08 16:02:26.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:26 np0005476733 nova_compute[192580]: 2025-10-08 16:02:26.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:27.827 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:02:27 np0005476733 nova_compute[192580]: 2025-10-08 16:02:27.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:27.828 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:02:30 np0005476733 podman[248683]: 2025-10-08 16:02:30.217994234 +0000 UTC m=+0.045424453 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:02:31 np0005476733 nova_compute[192580]: 2025-10-08 16:02:31.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:31 np0005476733 nova_compute[192580]: 2025-10-08 16:02:31.898 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:31 np0005476733 nova_compute[192580]: 2025-10-08 16:02:31.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:32 np0005476733 nova_compute[192580]: 2025-10-08 16:02:32.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:02:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:02:33.831 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.050 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:02:36.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:02:36 np0005476733 nova_compute[192580]: 2025-10-08 16:02:36.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:36 np0005476733 nova_compute[192580]: 2025-10-08 16:02:36.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:36 np0005476733 systemd-logind[827]: New session 50 of user zuul.
Oct  8 12:02:36 np0005476733 systemd[1]: Started Session 50 of User zuul.
Oct  8 12:02:37 np0005476733 systemd[1]: session-50.scope: Deactivated successfully.
Oct  8 12:02:37 np0005476733 systemd-logind[827]: Session 50 logged out. Waiting for processes to exit.
Oct  8 12:02:37 np0005476733 systemd-logind[827]: Removed session 50.
Oct  8 12:02:38 np0005476733 podman[248731]: 2025-10-08 16:02:38.245499809 +0000 UTC m=+0.061108863 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 12:02:38 np0005476733 podman[248730]: 2025-10-08 16:02:38.281008154 +0000 UTC m=+0.100425000 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:02:41 np0005476733 nova_compute[192580]: 2025-10-08 16:02:41.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:41 np0005476733 nova_compute[192580]: 2025-10-08 16:02:41.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:45 np0005476733 podman[248777]: 2025-10-08 16:02:45.227718756 +0000 UTC m=+0.059226505 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:02:45 np0005476733 podman[248778]: 2025-10-08 16:02:45.237983183 +0000 UTC m=+0.063104917 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:02:45 np0005476733 podman[248779]: 2025-10-08 16:02:45.2635463 +0000 UTC m=+0.076077902 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 12:02:46 np0005476733 nova_compute[192580]: 2025-10-08 16:02:46.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:46 np0005476733 nova_compute[192580]: 2025-10-08 16:02:46.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:51 np0005476733 nova_compute[192580]: 2025-10-08 16:02:51.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:51 np0005476733 nova_compute[192580]: 2025-10-08 16:02:51.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:53 np0005476733 podman[248841]: 2025-10-08 16:02:53.223587281 +0000 UTC m=+0.049607737 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:02:53 np0005476733 podman[248840]: 2025-10-08 16:02:53.225618235 +0000 UTC m=+0.055637968 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 12:02:56 np0005476733 nova_compute[192580]: 2025-10-08 16:02:56.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:02:56 np0005476733 nova_compute[192580]: 2025-10-08 16:02:56.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:01 np0005476733 podman[248881]: 2025-10-08 16:03:01.224313481 +0000 UTC m=+0.053098267 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:03:01 np0005476733 nova_compute[192580]: 2025-10-08 16:03:01.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:01 np0005476733 nova_compute[192580]: 2025-10-08 16:03:01.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:06 np0005476733 nova_compute[192580]: 2025-10-08 16:03:06.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:06 np0005476733 nova_compute[192580]: 2025-10-08 16:03:06.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:09 np0005476733 podman[248900]: 2025-10-08 16:03:09.250263374 +0000 UTC m=+0.082272092 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 12:03:09 np0005476733 podman[248901]: 2025-10-08 16:03:09.26831522 +0000 UTC m=+0.091707952 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:03:10 np0005476733 nova_compute[192580]: 2025-10-08 16:03:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:11 np0005476733 nova_compute[192580]: 2025-10-08 16:03:11.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:11 np0005476733 nova_compute[192580]: 2025-10-08 16:03:11.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:13 np0005476733 nova_compute[192580]: 2025-10-08 16:03:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:16 np0005476733 podman[248943]: 2025-10-08 16:03:16.25097195 +0000 UTC m=+0.074306796 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:03:16 np0005476733 podman[248944]: 2025-10-08 16:03:16.254020497 +0000 UTC m=+0.068604783 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:03:16 np0005476733 podman[248945]: 2025-10-08 16:03:16.300222915 +0000 UTC m=+0.101262118 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.626 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:16 np0005476733 nova_compute[192580]: 2025-10-08 16:03:16.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:20 np0005476733 nova_compute[192580]: 2025-10-08 16:03:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:21 np0005476733 nova_compute[192580]: 2025-10-08 16:03:21.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:21 np0005476733 nova_compute[192580]: 2025-10-08 16:03:21.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:22 np0005476733 nova_compute[192580]: 2025-10-08 16:03:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:22 np0005476733 nova_compute[192580]: 2025-10-08 16:03:22.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:03:24 np0005476733 podman[249007]: 2025-10-08 16:03:24.240290005 +0000 UTC m=+0.057240190 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:03:24 np0005476733 podman[249006]: 2025-10-08 16:03:24.276372949 +0000 UTC m=+0.095103011 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:03:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:03:25Z|00726|pinctrl|WARN|Dropped 47 log messages in last 60 seconds (most recently, 17 seconds ago) due to excessive rate
Oct  8 12:03:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:03:25Z|00727|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:03:25 np0005476733 nova_compute[192580]: 2025-10-08 16:03:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.206 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.207 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.207 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.207 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:26.353 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:26.354 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:26.354 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.392 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.393 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13797MB free_disk=111.33189010620117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.393 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.394 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.791 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.791 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.817 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.963 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.966 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:03:26 np0005476733 nova_compute[192580]: 2025-10-08 16:03:26.967 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:03:27 np0005476733 nova_compute[192580]: 2025-10-08 16:03:27.968 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:31 np0005476733 nova_compute[192580]: 2025-10-08 16:03:31.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:31 np0005476733 nova_compute[192580]: 2025-10-08 16:03:31.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:32 np0005476733 podman[249047]: 2025-10-08 16:03:32.21581318 +0000 UTC m=+0.046301162 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:03:33 np0005476733 nova_compute[192580]: 2025-10-08 16:03:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:03:36 np0005476733 nova_compute[192580]: 2025-10-08 16:03:36.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:36 np0005476733 nova_compute[192580]: 2025-10-08 16:03:36.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:40 np0005476733 podman[249067]: 2025-10-08 16:03:40.228883374 +0000 UTC m=+0.052593021 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:03:40 np0005476733 podman[249066]: 2025-10-08 16:03:40.292290511 +0000 UTC m=+0.110966968 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:03:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:41.153 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:03:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:41.155 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:03:41 np0005476733 nova_compute[192580]: 2025-10-08 16:03:41.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:41 np0005476733 nova_compute[192580]: 2025-10-08 16:03:41.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:41 np0005476733 nova_compute[192580]: 2025-10-08 16:03:41.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:03:45.157 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:03:46 np0005476733 nova_compute[192580]: 2025-10-08 16:03:46.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:46 np0005476733 nova_compute[192580]: 2025-10-08 16:03:46.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:47 np0005476733 podman[249114]: 2025-10-08 16:03:47.223474775 +0000 UTC m=+0.049735890 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:03:47 np0005476733 podman[249115]: 2025-10-08 16:03:47.237891006 +0000 UTC m=+0.057508839 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Oct  8 12:03:47 np0005476733 podman[249113]: 2025-10-08 16:03:47.254016851 +0000 UTC m=+0.081757954 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  8 12:03:51 np0005476733 nova_compute[192580]: 2025-10-08 16:03:51.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:51 np0005476733 nova_compute[192580]: 2025-10-08 16:03:51.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:55 np0005476733 podman[249176]: 2025-10-08 16:03:55.232531972 +0000 UTC m=+0.056320421 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:03:55 np0005476733 podman[249175]: 2025-10-08 16:03:55.233461842 +0000 UTC m=+0.060017350 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 12:03:56 np0005476733 nova_compute[192580]: 2025-10-08 16:03:56.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:56 np0005476733 nova_compute[192580]: 2025-10-08 16:03:56.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:03:58 np0005476733 systemd-logind[827]: New session 51 of user zuul.
Oct  8 12:03:58 np0005476733 systemd[1]: Started Session 51 of User zuul.
Oct  8 12:03:58 np0005476733 systemd[1]: session-51.scope: Deactivated successfully.
Oct  8 12:03:58 np0005476733 systemd-logind[827]: Session 51 logged out. Waiting for processes to exit.
Oct  8 12:03:58 np0005476733 systemd-logind[827]: Removed session 51.
Oct  8 12:04:01 np0005476733 nova_compute[192580]: 2025-10-08 16:04:01.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:03 np0005476733 podman[249247]: 2025-10-08 16:04:03.242072544 +0000 UTC m=+0.064652196 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:04:06 np0005476733 nova_compute[192580]: 2025-10-08 16:04:06.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:10 np0005476733 nova_compute[192580]: 2025-10-08 16:04:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:11 np0005476733 podman[249267]: 2025-10-08 16:04:11.267502615 +0000 UTC m=+0.078667725 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct  8 12:04:11 np0005476733 podman[249266]: 2025-10-08 16:04:11.275899423 +0000 UTC m=+0.095416690 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 12:04:11 np0005476733 nova_compute[192580]: 2025-10-08 16:04:11.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:13 np0005476733 nova_compute[192580]: 2025-10-08 16:04:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.646 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:16 np0005476733 nova_compute[192580]: 2025-10-08 16:04:16.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:18 np0005476733 podman[249311]: 2025-10-08 16:04:18.253530712 +0000 UTC m=+0.076968540 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:04:18 np0005476733 podman[249313]: 2025-10-08 16:04:18.257371555 +0000 UTC m=+0.072105896 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:04:18 np0005476733 podman[249312]: 2025-10-08 16:04:18.264490772 +0000 UTC m=+0.076458264 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:04:18 np0005476733 nova_compute[192580]: 2025-10-08 16:04:18.638 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:21 np0005476733 nova_compute[192580]: 2025-10-08 16:04:21.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:22 np0005476733 nova_compute[192580]: 2025-10-08 16:04:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:23 np0005476733 nova_compute[192580]: 2025-10-08 16:04:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:23 np0005476733 nova_compute[192580]: 2025-10-08 16:04:23.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:04:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:04:23Z|00728|pinctrl|WARN|Dropped 191 log messages in last 58 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:04:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:04:23Z|00729|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:04:26 np0005476733 podman[249376]: 2025-10-08 16:04:26.239418959 +0000 UTC m=+0.059750831 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:04:26 np0005476733 podman[249375]: 2025-10-08 16:04:26.260903295 +0000 UTC m=+0.078712597 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:04:26.357 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:04:26.358 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:04:26.358 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:04:26 np0005476733 nova_compute[192580]: 2025-10-08 16:04:26.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:26 np0005476733 nova_compute[192580]: 2025-10-08 16:04:26.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.595 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.627 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.825 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.826 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13803MB free_disk=111.33250045776367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.826 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.827 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.902 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.936 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.956 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.958 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:04:27 np0005476733 nova_compute[192580]: 2025-10-08 16:04:27.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:04:31 np0005476733 nova_compute[192580]: 2025-10-08 16:04:31.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:34 np0005476733 podman[249419]: 2025-10-08 16:04:34.228418384 +0000 UTC m=+0.059971189 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 12:04:35 np0005476733 nova_compute[192580]: 2025-10-08 16:04:35.945 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:35 np0005476733 nova_compute[192580]: 2025-10-08 16:04:35.982 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.056 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.057 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.058 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.059 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:04:36.059 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:04:36 np0005476733 nova_compute[192580]: 2025-10-08 16:04:36.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:04:37 np0005476733 nova_compute[192580]: 2025-10-08 16:04:37.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:04:37 np0005476733 nova_compute[192580]: 2025-10-08 16:04:37.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:04:37 np0005476733 nova_compute[192580]: 2025-10-08 16:04:37.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:04:37 np0005476733 nova_compute[192580]: 2025-10-08 16:04:37.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:04:37 np0005476733 nova_compute[192580]: 2025-10-08 16:04:37.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:42 np0005476733 nova_compute[192580]: 2025-10-08 16:04:42.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:04:42 np0005476733 podman[249441]: 2025-10-08 16:04:42.26596317 +0000 UTC m=+0.084977958 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 12:04:42 np0005476733 podman[249440]: 2025-10-08 16:04:42.304590634 +0000 UTC m=+0.133422345 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  8 12:04:44 np0005476733 ovn_controller[94857]: 2025-10-08T16:04:44Z|00730|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct  8 12:04:47 np0005476733 nova_compute[192580]: 2025-10-08 16:04:47.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:49 np0005476733 podman[249488]: 2025-10-08 16:04:49.249947092 +0000 UTC m=+0.064310737 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:04:49 np0005476733 podman[249487]: 2025-10-08 16:04:49.249931311 +0000 UTC m=+0.068027744 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:04:49 np0005476733 podman[249489]: 2025-10-08 16:04:49.285708275 +0000 UTC m=+0.088887132 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:04:52 np0005476733 nova_compute[192580]: 2025-10-08 16:04:52.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:57 np0005476733 nova_compute[192580]: 2025-10-08 16:04:57.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:04:57 np0005476733 podman[249550]: 2025-10-08 16:04:57.23371672 +0000 UTC m=+0.060484364 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  8 12:04:57 np0005476733 podman[249551]: 2025-10-08 16:04:57.243322067 +0000 UTC m=+0.064673058 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:05:02 np0005476733 nova_compute[192580]: 2025-10-08 16:05:02.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:02 np0005476733 nova_compute[192580]: 2025-10-08 16:05:02.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:02.427 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:05:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:02.429 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:05:02 np0005476733 nova_compute[192580]: 2025-10-08 16:05:02.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:05 np0005476733 podman[249593]: 2025-10-08 16:05:05.229369039 +0000 UTC m=+0.056485535 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.216 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.217 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.238 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.314 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.315 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.323 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.324 2 INFO nova.compute.claims [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.428 2 DEBUG nova.compute.provider_tree [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.442 2 DEBUG nova.scheduler.client.report [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.461 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.462 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.504 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.504 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.527 2 INFO nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.547 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.632 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.635 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.635 2 INFO nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Creating image(s)#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.636 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.637 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.638 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.663 2 DEBUG nova.policy [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.667 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.747 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.749 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.750 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.766 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.828 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.829 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.870 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk 10737418240" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.871 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.872 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.967 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.969 2 DEBUG nova.objects.instance [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'migration_context' on Instance uuid 0bb99735-ce66-4e0e-9084-3ed659692146 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.992 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.993 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Ensure instance console log exists: /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.994 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.995 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:06 np0005476733 nova_compute[192580]: 2025-10-08 16:05:06.995 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:07 np0005476733 nova_compute[192580]: 2025-10-08 16:05:07.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:08.431 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:09 np0005476733 nova_compute[192580]: 2025-10-08 16:05:09.977 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Successfully created port: e01a0204-a0c3-4267-bec4-88b5e24e15bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:05:11 np0005476733 nova_compute[192580]: 2025-10-08 16:05:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5060 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:05:12 np0005476733 nova_compute[192580]: 2025-10-08 16:05:12.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:13 np0005476733 podman[249626]: 2025-10-08 16:05:13.253771716 +0000 UTC m=+0.076753044 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:05:13 np0005476733 podman[249625]: 2025-10-08 16:05:13.327593745 +0000 UTC m=+0.144310763 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.051 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Successfully updated port: e01a0204-a0c3-4267-bec4-88b5e24e15bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.088 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.089 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.089 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.213 2 DEBUG nova.compute.manager [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.214 2 DEBUG nova.compute.manager [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing instance network info cache due to event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.214 2 DEBUG oslo_concurrency.lockutils [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.319 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:05:14 np0005476733 nova_compute[192580]: 2025-10-08 16:05:14.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.271 2 DEBUG nova.network.neutron [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.300 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.300 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Instance network_info: |[{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.301 2 DEBUG oslo_concurrency.lockutils [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.301 2 DEBUG nova.network.neutron [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.304 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Start _get_guest_xml network_info=[{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.309 2 WARNING nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.314 2 DEBUG nova.virt.libvirt.host [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.315 2 DEBUG nova.virt.libvirt.host [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.318 2 DEBUG nova.virt.libvirt.host [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.318 2 DEBUG nova.virt.libvirt.host [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.318 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.319 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.319 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.319 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.320 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.320 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.320 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.320 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.320 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.321 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.321 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.321 2 DEBUG nova.virt.hardware [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.325 2 DEBUG nova.virt.libvirt.vif [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_basic-237145123',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-basic-237145123',id=78,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-a50k1nty',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:05:06Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=0bb99735-ce66-4e0e-9084-3ed659692146,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.325 2 DEBUG nova.network.os_vif_util [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.326 2 DEBUG nova.network.os_vif_util [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.327 2 DEBUG nova.objects.instance [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0bb99735-ce66-4e0e-9084-3ed659692146 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.346 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <uuid>0bb99735-ce66-4e0e-9084-3ed659692146</uuid>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <name>instance-0000004e</name>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dvr_vip_failover_basic-237145123</nova:name>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:05:15</nova:creationTime>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:user uuid="000a8d1cd17e4a4c8398ef814dd4db2d">tempest-OvnDvrAdvancedTest-1107478320-project-member</nova:user>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:project uuid="71bd615ba6694cba8794c8eb5dadbe81">tempest-OvnDvrAdvancedTest-1107478320</nova:project>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        <nova:port uuid="e01a0204-a0c3-4267-bec4-88b5e24e15bd">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.0.238" ipVersion="4"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="serial">0bb99735-ce66-4e0e-9084-3ed659692146</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="uuid">0bb99735-ce66-4e0e-9084-3ed659692146</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.config"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:98:9e:69"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <target dev="tape01a0204-a0"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/console.log" append="off"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:05:15 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:05:15 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:05:15 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:05:15 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.348 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Preparing to wait for external event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.348 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.348 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.349 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.349 2 DEBUG nova.virt.libvirt.vif [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_basic-237145123',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-basic-237145123',id=78,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-a50k1nty',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:05:06Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=0bb99735-ce66-4e0e-9084-3ed659692146,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.350 2 DEBUG nova.network.os_vif_util [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.350 2 DEBUG nova.network.os_vif_util [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.351 2 DEBUG os_vif [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.356 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape01a0204-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.357 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape01a0204-a0, col_values=(('external_ids', {'iface-id': 'e01a0204-a0c3-4267-bec4-88b5e24e15bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:9e:69', 'vm-uuid': '0bb99735-ce66-4e0e-9084-3ed659692146'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:15 np0005476733 NetworkManager[51699]: <info>  [1759939515.3604] manager: (tape01a0204-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.367 2 INFO os_vif [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0')#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.428 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.430 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.430 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No VIF found with MAC fa:16:3e:98:9e:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:05:15 np0005476733 nova_compute[192580]: 2025-10-08 16:05:15.431 2 INFO nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Using config drive#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.048 2 INFO nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Creating config drive at /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.config#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.058 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ol4ignn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.192 2 DEBUG oslo_concurrency.processutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6ol4ignn" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.2627] manager: (tape01a0204-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Oct  8 12:05:16 np0005476733 kernel: tape01a0204-a0: entered promiscuous mode
Oct  8 12:05:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:16Z|00731|binding|INFO|Claiming lport e01a0204-a0c3-4267-bec4-88b5e24e15bd for this chassis.
Oct  8 12:05:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:16Z|00732|binding|INFO|e01a0204-a0c3-4267-bec4-88b5e24e15bd: Claiming fa:16:3e:98:9e:69 192.168.0.238
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.2814] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.2824] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.290 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:9e:69 192.168.0.238'], port_security=['fa:16:3e:98:9e:69 192.168.0.238'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.238/24', 'neutron:device_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d072591a-0382-43cd-8966-59eb24fe6dd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528be15d-7e7e-49a4-9215-203e9b098cff, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e01a0204-a0c3-4267-bec4-88b5e24e15bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.291 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e01a0204-a0c3-4267-bec4-88b5e24e15bd in datapath d072591a-0382-43cd-8966-59eb24fe6dd1 bound to our chassis#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.293 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d072591a-0382-43cd-8966-59eb24fe6dd1#033[00m
Oct  8 12:05:16 np0005476733 systemd-udevd[249688]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.310 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4c51d23c-eaeb-43fb-ad25-d659b67143d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.311 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd072591a-01 in ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.314 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd072591a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.314 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6fbdd84b-cb37-4761-8808-23179bbb8953]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 systemd-machined[152624]: New machine qemu-47-instance-0000004e.
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.315 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e027b5b1-6fc3-4cbb-a15e-f3c7112b6373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.3264] device (tape01a0204-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.3275] device (tape01a0204-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.332 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[90257ab8-83d5-45aa-a9a6-24b6814c4e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.353 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7205207d-9971-4430-8672-45e8c0d69254]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 systemd[1]: Started Virtual Machine qemu-47-instance-0000004e.
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:16Z|00733|binding|INFO|Setting lport e01a0204-a0c3-4267-bec4-88b5e24e15bd ovn-installed in OVS
Oct  8 12:05:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:16Z|00734|binding|INFO|Setting lport e01a0204-a0c3-4267-bec4-88b5e24e15bd up in Southbound
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.392 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e8efd052-46b7-462d-8c9c-a537e01d34f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.4009] manager: (tapd072591a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/248)
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.400 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0392cc8e-88c5-444b-b0f4-5571e1a65765]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.432 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[03d6e362-2f43-4615-8535-cb731ef10563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.439 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[eadbcbc7-041c-4d11-ae60-773e72a8d048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.4624] device (tapd072591a-00): carrier: link connected
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.470 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8168ed-e207-4ae0-a2bb-2e256870602d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.492 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4de5ccac-900a-42ef-a954-5296996f0adb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd072591a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:a8:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642012, 'reachable_time': 29883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249720, 'error': None, 'target': 'ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.516 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d845ca96-80aa-4a2e-98d2-e533c19dec8e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:a8b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642012, 'tstamp': 642012}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249721, 'error': None, 'target': 'ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.544 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d4c73f-c68a-4dcd-bf46-e11434f93d3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd072591a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:a8:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642012, 'reachable_time': 29883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249722, 'error': None, 'target': 'ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.591 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0f075e-fb01-48c3-8da9-439a3d53e57f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.660 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3318080d-cbe2-4231-9452-ebe5b4c638f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.661 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd072591a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.661 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.662 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd072591a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 kernel: tapd072591a-00: entered promiscuous mode
Oct  8 12:05:16 np0005476733 NetworkManager[51699]: <info>  [1759939516.6651] manager: (tapd072591a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.668 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd072591a-00, col_values=(('external_ids', {'iface-id': '4f3788b4-25d9-495b-9ce4-e46caee21294'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:16Z|00735|binding|INFO|Releasing lport 4f3788b4-25d9-495b-9ce4-e46caee21294 from this chassis (sb_readonly=0)
Oct  8 12:05:16 np0005476733 nova_compute[192580]: 2025-10-08 16:05:16.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.695 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d072591a-0382-43cd-8966-59eb24fe6dd1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d072591a-0382-43cd-8966-59eb24fe6dd1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.696 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fae1bb-7c79-467a-b510-81e870e705c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.697 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-d072591a-0382-43cd-8966-59eb24fe6dd1
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/d072591a-0382-43cd-8966-59eb24fe6dd1.pid.haproxy
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID d072591a-0382-43cd-8966-59eb24fe6dd1
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:05:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:16.698 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1', 'env', 'PROCESS_TAG=haproxy-d072591a-0382-43cd-8966-59eb24fe6dd1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d072591a-0382-43cd-8966-59eb24fe6dd1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:05:17 np0005476733 podman[249761]: 2025-10-08 16:05:17.059298139 +0000 UTC m=+0.052014484 container create f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:05:17 np0005476733 systemd[1]: Started libpod-conmon-f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da.scope.
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.101 2 DEBUG nova.compute.manager [req-10ac76cc-8b14-4b2d-a44f-6fb69914f975 req-ef5b3e7a-86f8-4de7-ab62-948b9dd35340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.102 2 DEBUG oslo_concurrency.lockutils [req-10ac76cc-8b14-4b2d-a44f-6fb69914f975 req-ef5b3e7a-86f8-4de7-ab62-948b9dd35340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.103 2 DEBUG oslo_concurrency.lockutils [req-10ac76cc-8b14-4b2d-a44f-6fb69914f975 req-ef5b3e7a-86f8-4de7-ab62-948b9dd35340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.103 2 DEBUG oslo_concurrency.lockutils [req-10ac76cc-8b14-4b2d-a44f-6fb69914f975 req-ef5b3e7a-86f8-4de7-ab62-948b9dd35340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.103 2 DEBUG nova.compute.manager [req-10ac76cc-8b14-4b2d-a44f-6fb69914f975 req-ef5b3e7a-86f8-4de7-ab62-948b9dd35340 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Processing event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:17 np0005476733 podman[249761]: 2025-10-08 16:05:17.031842182 +0000 UTC m=+0.024558577 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:05:17 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:05:17 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6cafb77de68f7a45632b5660803d3e80fb0b7c63eb5a99627d8fe2246634f83/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:05:17 np0005476733 podman[249761]: 2025-10-08 16:05:17.150759932 +0000 UTC m=+0.143476287 container init f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:05:17 np0005476733 podman[249761]: 2025-10-08 16:05:17.157614222 +0000 UTC m=+0.150330567 container start f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:05:17 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [NOTICE]   (249780) : New worker (249782) forked
Oct  8 12:05:17 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [NOTICE]   (249780) : Loading success.
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.275 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939517.275144, 0bb99735-ce66-4e0e-9084-3ed659692146 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.277 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] VM Started (Lifecycle Event)#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.278 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.282 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.286 2 INFO nova.virt.libvirt.driver [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Instance spawned successfully.#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.286 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.308 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.314 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.318 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.319 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.319 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.319 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.320 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.320 2 DEBUG nova.virt.libvirt.driver [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.353 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.354 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939517.2752976, 0bb99735-ce66-4e0e-9084-3ed659692146 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.354 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.391 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.394 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939517.2813346, 0bb99735-ce66-4e0e-9084-3ed659692146 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.395 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.408 2 INFO nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Took 10.77 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.408 2 DEBUG nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.422 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.425 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.454 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.495 2 INFO nova.compute.manager [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Took 11.21 seconds to build instance.#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.517 2 DEBUG oslo_concurrency.lockutils [None req-60a40c3e-28ed-4511-8244-009e82982503 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.971 2 DEBUG nova.network.neutron [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updated VIF entry in instance network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.971 2 DEBUG nova.network.neutron [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:05:17 np0005476733 nova_compute[192580]: 2025-10-08 16:05:17.987 2 DEBUG oslo_concurrency.lockutils [req-888bd92e-9aeb-4b12-aca6-865611a92e2b req-437beeb7-411c-4ef7-83d3-60ced9795fb6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.802 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.803 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.803 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:05:18 np0005476733 nova_compute[192580]: 2025-10-08 16:05:18.803 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0bb99735-ce66-4e0e-9084-3ed659692146 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.277 2 DEBUG nova.compute.manager [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.277 2 DEBUG oslo_concurrency.lockutils [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.278 2 DEBUG oslo_concurrency.lockutils [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.278 2 DEBUG oslo_concurrency.lockutils [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.279 2 DEBUG nova.compute.manager [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] No waiting events found dispatching network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.279 2 WARNING nova.compute.manager [req-e512deff-7a8f-45fb-917f-be62464d340e req-6d26be83-73db-4b2f-9b29-b2f13cd8baaa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received unexpected event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd for instance with vm_state active and task_state None.#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.814 2 INFO nova.compute.manager [None req-33afa329-e595-4923-ad0a-cb5bb29b3f79 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:19 np0005476733 nova_compute[192580]: 2025-10-08 16:05:19.825 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:05:20 np0005476733 podman[249792]: 2025-10-08 16:05:20.245141855 +0000 UTC m=+0.068711937 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:05:20 np0005476733 podman[249791]: 2025-10-08 16:05:20.246222559 +0000 UTC m=+0.071909209 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:05:20 np0005476733 podman[249793]: 2025-10-08 16:05:20.255360072 +0000 UTC m=+0.075827615 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Oct  8 12:05:20 np0005476733 nova_compute[192580]: 2025-10-08 16:05:20.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:20 np0005476733 nova_compute[192580]: 2025-10-08 16:05:20.493 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:05:20 np0005476733 nova_compute[192580]: 2025-10-08 16:05:20.516 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:05:20 np0005476733 nova_compute[192580]: 2025-10-08 16:05:20.517 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:05:22 np0005476733 nova_compute[192580]: 2025-10-08 16:05:22.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:22 np0005476733 nova_compute[192580]: 2025-10-08 16:05:22.509 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:22 np0005476733 nova_compute[192580]: 2025-10-08 16:05:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:23 np0005476733 nova_compute[192580]: 2025-10-08 16:05:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:23 np0005476733 nova_compute[192580]: 2025-10-08 16:05:23.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:05:25 np0005476733 nova_compute[192580]: 2025-10-08 16:05:25.025 2 INFO nova.compute.manager [None req-976d6792-9e00-427b-a244-e4bd362c0e3a 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:25 np0005476733 nova_compute[192580]: 2025-10-08 16:05:25.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:25Z|00736|pinctrl|WARN|Dropped 257 log messages in last 62 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 12:05:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:25Z|00737|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:26.359 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:26.360 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:05:26.362 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:26 np0005476733 nova_compute[192580]: 2025-10-08 16:05:26.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:27 np0005476733 nova_compute[192580]: 2025-10-08 16:05:27.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:28 np0005476733 podman[249856]: 2025-10-08 16:05:28.236215206 +0000 UTC m=+0.061986491 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 12:05:28 np0005476733 podman[249857]: 2025-10-08 16:05:28.25287931 +0000 UTC m=+0.069346449 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.627 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.708 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.783 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.785 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:05:29 np0005476733 nova_compute[192580]: 2025-10-08 16:05:29.849 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.060 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.063 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13415MB free_disk=111.33043670654297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.064 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.065 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.191 2 INFO nova.compute.manager [None req-6821f051-fba4-4352-84e3-3c6a848415ef 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.214 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0bb99735-ce66-4e0e-9084-3ed659692146 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.216 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.217 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.350 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.370 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.403 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.403 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:05:30 np0005476733 nova_compute[192580]: 2025-10-08 16:05:30.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:32 np0005476733 nova_compute[192580]: 2025-10-08 16:05:32.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:35 np0005476733 nova_compute[192580]: 2025-10-08 16:05:35.411 2 INFO nova.compute.manager [None req-dbc5d9f0-c1ea-442d-83a2-50a2d12efbd0 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:35 np0005476733 nova_compute[192580]: 2025-10-08 16:05:35.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:35 np0005476733 nova_compute[192580]: 2025-10-08 16:05:35.419 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:05:36 np0005476733 podman[249917]: 2025-10-08 16:05:36.21858363 +0000 UTC m=+0.047205210 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:05:36 np0005476733 nova_compute[192580]: 2025-10-08 16:05:36.405 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:36 np0005476733 nova_compute[192580]: 2025-10-08 16:05:36.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:36 np0005476733 nova_compute[192580]: 2025-10-08 16:05:36.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:05:37 np0005476733 nova_compute[192580]: 2025-10-08 16:05:37.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:40 np0005476733 nova_compute[192580]: 2025-10-08 16:05:40.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:40 np0005476733 nova_compute[192580]: 2025-10-08 16:05:40.576 2 INFO nova.compute.manager [None req-57d99ddc-bea8-47c7-a526-654b69522060 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:40 np0005476733 nova_compute[192580]: 2025-10-08 16:05:40.581 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:05:40 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:40Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:9e:69 192.168.0.238
Oct  8 12:05:40 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:40Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:9e:69 192.168.0.238
Oct  8 12:05:42 np0005476733 nova_compute[192580]: 2025-10-08 16:05:42.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:44 np0005476733 podman[249937]: 2025-10-08 16:05:44.257905024 +0000 UTC m=+0.062287962 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:05:44 np0005476733 podman[249936]: 2025-10-08 16:05:44.302909393 +0000 UTC m=+0.113996316 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:05:44 np0005476733 nova_compute[192580]: 2025-10-08 16:05:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:05:45 np0005476733 nova_compute[192580]: 2025-10-08 16:05:45.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:45 np0005476733 nova_compute[192580]: 2025-10-08 16:05:45.858 2 INFO nova.compute.manager [None req-3dd9cde7-0979-4f19-a3f0-ca284b3da702 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:45 np0005476733 nova_compute[192580]: 2025-10-08 16:05:45.866 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:05:45 np0005476733 nova_compute[192580]: 2025-10-08 16:05:45.871 2 INFO nova.virt.libvirt.driver [None req-3dd9cde7-0979-4f19-a3f0-ca284b3da702 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Truncated console log returned, 2874 bytes ignored#033[00m
Oct  8 12:05:46 np0005476733 ovn_controller[94857]: 2025-10-08T16:05:46Z|00738|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  8 12:05:47 np0005476733 nova_compute[192580]: 2025-10-08 16:05:47.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:50 np0005476733 nova_compute[192580]: 2025-10-08 16:05:50.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:51 np0005476733 nova_compute[192580]: 2025-10-08 16:05:51.044 2 INFO nova.compute.manager [None req-38a41223-be22-4330-9818-f71218d2a221 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Get console output#033[00m
Oct  8 12:05:51 np0005476733 nova_compute[192580]: 2025-10-08 16:05:51.050 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:05:51 np0005476733 nova_compute[192580]: 2025-10-08 16:05:51.053 2 INFO nova.virt.libvirt.driver [None req-38a41223-be22-4330-9818-f71218d2a221 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Truncated console log returned, 3234 bytes ignored#033[00m
Oct  8 12:05:51 np0005476733 podman[249985]: 2025-10-08 16:05:51.242542149 +0000 UTC m=+0.059622507 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:05:51 np0005476733 podman[249984]: 2025-10-08 16:05:51.245330578 +0000 UTC m=+0.067042604 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Oct  8 12:05:51 np0005476733 podman[249987]: 2025-10-08 16:05:51.257820096 +0000 UTC m=+0.066820066 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Oct  8 12:05:52 np0005476733 nova_compute[192580]: 2025-10-08 16:05:52.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.502 2 DEBUG nova.compute.manager [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.502 2 DEBUG nova.compute.manager [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing instance network info cache due to event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.502 2 DEBUG oslo_concurrency.lockutils [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.503 2 DEBUG oslo_concurrency.lockutils [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:05:55 np0005476733 nova_compute[192580]: 2025-10-08 16:05:55.503 2 DEBUG nova.network.neutron [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:05:56 np0005476733 nova_compute[192580]: 2025-10-08 16:05:56.784 2 DEBUG nova.network.neutron [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updated VIF entry in instance network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:05:56 np0005476733 nova_compute[192580]: 2025-10-08 16:05:56.785 2 DEBUG nova.network.neutron [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:05:56 np0005476733 nova_compute[192580]: 2025-10-08 16:05:56.816 2 DEBUG oslo_concurrency.lockutils [req-9bd4c8ef-58a1-49d7-8ea7-a3e4d0d8cd25 req-680d5e4e-b241-4d0e-b632-d0ff0a1c4bab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:05:57 np0005476733 nova_compute[192580]: 2025-10-08 16:05:57.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:05:59 np0005476733 podman[250043]: 2025-10-08 16:05:59.223433635 +0000 UTC m=+0.052293193 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:05:59 np0005476733 podman[250044]: 2025-10-08 16:05:59.230848461 +0000 UTC m=+0.055991360 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:05:59 np0005476733 nova_compute[192580]: 2025-10-08 16:05:59.255 2 DEBUG nova.compute.manager [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:05:59 np0005476733 nova_compute[192580]: 2025-10-08 16:05:59.256 2 DEBUG nova.compute.manager [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing instance network info cache due to event network-changed-e01a0204-a0c3-4267-bec4-88b5e24e15bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:05:59 np0005476733 nova_compute[192580]: 2025-10-08 16:05:59.256 2 DEBUG oslo_concurrency.lockutils [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:05:59 np0005476733 nova_compute[192580]: 2025-10-08 16:05:59.256 2 DEBUG oslo_concurrency.lockutils [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:05:59 np0005476733 nova_compute[192580]: 2025-10-08 16:05:59.256 2 DEBUG nova.network.neutron [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Refreshing network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:06:00 np0005476733 nova_compute[192580]: 2025-10-08 16:06:00.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:00 np0005476733 nova_compute[192580]: 2025-10-08 16:06:00.472 2 DEBUG nova.network.neutron [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updated VIF entry in instance network info cache for port e01a0204-a0c3-4267-bec4-88b5e24e15bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:06:00 np0005476733 nova_compute[192580]: 2025-10-08 16:06:00.473 2 DEBUG nova.network.neutron [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:06:00 np0005476733 nova_compute[192580]: 2025-10-08 16:06:00.509 2 DEBUG oslo_concurrency.lockutils [req-6f586428-8cc0-480e-9545-80039974e3c1 req-05006445-696a-4d73-b561-be91b7906627 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:06:02 np0005476733 nova_compute[192580]: 2025-10-08 16:06:02.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:05 np0005476733 nova_compute[192580]: 2025-10-08 16:06:05.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:07 np0005476733 nova_compute[192580]: 2025-10-08 16:06:07.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:07 np0005476733 podman[250120]: 2025-10-08 16:06:07.274743022 +0000 UTC m=+0.084931246 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:06:08 np0005476733 nova_compute[192580]: 2025-10-08 16:06:08.622 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:08 np0005476733 nova_compute[192580]: 2025-10-08 16:06:08.623 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:06:08 np0005476733 nova_compute[192580]: 2025-10-08 16:06:08.651 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:06:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:10.179 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:06:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:10.180 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:06:10 np0005476733 nova_compute[192580]: 2025-10-08 16:06:10.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:10 np0005476733 nova_compute[192580]: 2025-10-08 16:06:10.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:11.183 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:06:12 np0005476733 nova_compute[192580]: 2025-10-08 16:06:12.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:13 np0005476733 nova_compute[192580]: 2025-10-08 16:06:13.617 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:15 np0005476733 podman[250140]: 2025-10-08 16:06:15.257934288 +0000 UTC m=+0.077867000 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct  8 12:06:15 np0005476733 podman[250139]: 2025-10-08 16:06:15.297163392 +0000 UTC m=+0.114161270 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  8 12:06:15 np0005476733 nova_compute[192580]: 2025-10-08 16:06:15.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:16 np0005476733 nova_compute[192580]: 2025-10-08 16:06:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:17 np0005476733 nova_compute[192580]: 2025-10-08 16:06:17.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.984 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.984 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.984 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:06:20 np0005476733 nova_compute[192580]: 2025-10-08 16:06:20.985 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0bb99735-ce66-4e0e-9084-3ed659692146 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:06:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:06:21Z|00739|pinctrl|WARN|Dropped 275 log messages in last 55 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:06:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:06:21Z|00740|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:06:22 np0005476733 podman[250186]: 2025-10-08 16:06:22.230059432 +0000 UTC m=+0.051662933 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:06:22 np0005476733 nova_compute[192580]: 2025-10-08 16:06:22.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:22 np0005476733 podman[250187]: 2025-10-08 16:06:22.25815876 +0000 UTC m=+0.072971683 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 12:06:22 np0005476733 podman[250185]: 2025-10-08 16:06:22.262957993 +0000 UTC m=+0.080480203 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, config_id=multipathd)
Oct  8 12:06:22 np0005476733 nova_compute[192580]: 2025-10-08 16:06:22.920 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [{"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:06:22 np0005476733 nova_compute[192580]: 2025-10-08 16:06:22.941 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-0bb99735-ce66-4e0e-9084-3ed659692146" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:06:22 np0005476733 nova_compute[192580]: 2025-10-08 16:06:22.941 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:06:22 np0005476733 nova_compute[192580]: 2025-10-08 16:06:22.942 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:24 np0005476733 systemd-logind[827]: New session 52 of user zuul.
Oct  8 12:06:24 np0005476733 systemd[1]: Started Session 52 of User zuul.
Oct  8 12:06:24 np0005476733 systemd-logind[827]: New session 53 of user zuul.
Oct  8 12:06:24 np0005476733 systemd[1]: Started Session 53 of User zuul.
Oct  8 12:06:24 np0005476733 systemd[1]: session-53.scope: Deactivated successfully.
Oct  8 12:06:24 np0005476733 systemd-logind[827]: Session 53 logged out. Waiting for processes to exit.
Oct  8 12:06:24 np0005476733 systemd-logind[827]: Removed session 53.
Oct  8 12:06:24 np0005476733 nova_compute[192580]: 2025-10-08 16:06:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:24 np0005476733 nova_compute[192580]: 2025-10-08 16:06:24.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:06:25 np0005476733 nova_compute[192580]: 2025-10-08 16:06:25.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:26.360 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:26.360 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:06:26.361 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:06:27 np0005476733 nova_compute[192580]: 2025-10-08 16:06:27.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:27 np0005476733 nova_compute[192580]: 2025-10-08 16:06:27.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:30 np0005476733 podman[250309]: 2025-10-08 16:06:30.219874653 +0000 UTC m=+0.049712400 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:06:30 np0005476733 podman[250308]: 2025-10-08 16:06:30.227214767 +0000 UTC m=+0.059888294 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.687 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.752 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.753 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.810 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.941 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.942 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12957MB free_disk=111.18882751464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.942 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:06:30 np0005476733 nova_compute[192580]: 2025-10-08 16:06:30.943 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.004 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0bb99735-ce66-4e0e-9084-3ed659692146 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.005 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.005 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.024 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.047 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.047 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.063 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.098 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.145 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.173 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.175 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:06:31 np0005476733 nova_compute[192580]: 2025-10-08 16:06:31.176 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:06:32 np0005476733 nova_compute[192580]: 2025-10-08 16:06:32.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:35 np0005476733 nova_compute[192580]: 2025-10-08 16:06:35.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.060 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'name': 'tempest-test_dvr_vip_failover_basic-237145123', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71bd615ba6694cba8794c8eb5dadbe81', 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'hostId': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.080 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.latency volume: 6557887352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.081 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.latency volume: 84429004 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c1c4e75-d94c-46b0-a22b-91065607568e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6557887352, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.062420', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c402be2e-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '3e96cd152da88678b3e9358d0f8061d15cd44651d430d4565700a93dd71ad144'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 84429004, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.062420', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c402cbda-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': 'b50f22ea56134d595ec490169c98ae36369f9c83efb553ff3bc2858e32be4d95'}]}, 'timestamp': '2025-10-08 16:06:36.082010', '_unique_id': 'dc364a98e9f842ba87675b4be544c954'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.083 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.084 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.bytes volume: 328459776 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.084 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6470a40c-b110-4f1d-b4eb-a27141504db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328459776, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.084648', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c4034092-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': 'b47b78f0154bd2038aaa6e346312c1ade8aea8af1ff5cf6662d636aa88e9f87c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.084648', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c4034e7a-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '1726e6b54a3627e9931f78d673a62aaa04eb9a7cddd7a9129734257b0c709612'}]}, 'timestamp': '2025-10-08 16:06:36.085351', '_unique_id': '7dc6f9b53c744f25956e4b3daa9fcd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.089 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0bb99735-ce66-4e0e-9084-3ed659692146 / tape01a0204-a0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.090 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.incoming.bytes volume: 5419 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01fbdc1a-72a7-4612-9967-da5e1593f540', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5419, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.087546', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c4041896-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': '12806463717b79335c6c9a26cf7b5646a11b493be24f1b1dcaef6284a8acaadd'}]}, 'timestamp': '2025-10-08 16:06:36.090524', '_unique_id': '7ff6dd05f8644909899943f193f80bbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.092 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.incoming.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b5fdd6d-6dfe-487d-a889-d7c1ce3461ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.092486', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c4047124-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': '680fddf8db6d1c855373116db7761c8eccc65e1b2dbfd0906f82248481199c7d'}]}, 'timestamp': '2025-10-08 16:06:36.092758', '_unique_id': '46995fe76c0c4edf856b3bade8f1da8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.094 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.requests volume: 711 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.094 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6c719c0-f6be-42f1-a9a8-79dd6ed95f3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 711, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.094436', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c404bd28-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '76d716a59e6694be0ac2f6d9bae9308c4839a0f58348ba7c17133e052557a113'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.094436', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c404c76e-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': 'f8e3d68a135c9c515657ff22486c6dca6698608844925200a9308e99d9045a26'}]}, 'timestamp': '2025-10-08 16:06:36.094975', '_unique_id': '67254206398a4f21b60f15c1a2aee128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.095 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.096 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.requests volume: 11648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.096 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54d453c3-5c4b-4b78-90f3-9bdc446534c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11648, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.096615', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c4051232-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '533f9c4fb5bdc9da45178260811f8c31a7e0197adc7fcc6bfbf07f9db85e23e7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.096615', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c4051b88-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '86e43af7ab9b890b2dc65fa50a5c1a3e0514bbcf830cfc7c49150e0e0af8b0bc'}]}, 'timestamp': '2025-10-08 16:06:36.097123', '_unique_id': '6d8fa62cbdc24d2698fb0fa102937a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.098 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.bytes volume: 135508992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.099 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae375ee0-1fda-4d61-8ce9-a0a9aa352e6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135508992, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.098857', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c4057060-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': 'e1a946bb8f6ef1ad002ccf1fc3710909cbeab4355e3f3282d9c1791c7cd555d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.098857', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c4057d1c-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': 'b362adea84c7b6a0cb25c24efb145192fe806ab7ad54fa8217b92d5d1e62e90d'}]}, 'timestamp': '2025-10-08 16:06:36.099622', '_unique_id': 'f5a17f76237046d48b7a01d464acdb37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.101 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.102 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>]
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.102 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.outgoing.bytes volume: 8017 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '672016bf-4af6-4451-a5ec-80148f251364', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8017, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.102371', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c405f36e-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': '07c66eccb1106151c8ca370a060bb85144cfc7f38ba975859c1faba228e007c8'}]}, 'timestamp': '2025-10-08 16:06:36.102648', '_unique_id': '0ea954903df24b3fadb8b5b342b066bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.113 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.114 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '235d67d7-ad11-4f80-a202-91f644b4dc67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.104150', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c407c2e8-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': '8011faba2a35e1774af3596e3065da929ec42e0a8b67e524d6f2c8728209a984'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 
'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.104150', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c407d300-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': '95e551d3379ecb87229b7585d5f1b07cfec5d7753604d41758e145b8e9704290'}]}, 'timestamp': '2025-10-08 16:06:36.114969', '_unique_id': '45f92674777449efab89c97cf1d35afb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.117 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e03a5d31-aa5a-43ce-b449-269891b2cdc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.117284', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c4083a98-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': 'ad4da358c9144f410f6b52153e6d3629720e21ff25c411865f0e20a13befadbc'}]}, 'timestamp': '2025-10-08 16:06:36.117577', '_unique_id': '08e24d0cf71a42c59e293bf7211ea7c7'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c295371-a354-404a-8fcb-2491fbb049db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.119085', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40883e0-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': '4ab0d232d964d6bc3044905d4bc08ad9a65de9bb08529368a840c770d696c337'}]}, 'timestamp': '2025-10-08 16:06:36.119468', '_unique_id': '4c39bd8091e945729ec607ca8d562197'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.121 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.121 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>]
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.134 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/memory.usage volume: 230.87109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023ae7ac-0412-45e0-a7ca-327c25b15559', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 230.87109375, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'timestamp': '2025-10-08T16:06:36.121460', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'c40ae252-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.857254741, 'message_signature': 'a29146498bf0d289a194a93f2a67f9b0e1aa3781a8b2a6d3ac6410815f6313b8'}]}, 'timestamp': '2025-10-08 16:06:36.135045', '_unique_id': '931761247f6548319bfde50027f3f839'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.137 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0edb3c00-1b81-44e8-a9b7-5561ad307d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.137330', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40b4c6a-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': 'f8c84456658ce19b19fed00b36e29010b30cdfb8cba2f9523c04c4bbdb8578af'}]}, 'timestamp': '2025-10-08 16:06:36.137766', '_unique_id': '7504a708f20e431fbf6c35eab9afdd0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.139 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39b873e9-eb49-4caa-9164-04b38a5972f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.139834', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40baca0-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': '0b98b7a103bbc490fc80a3cee34966807730ce9250f65710e8b6c3a966fb1206'}]}, 'timestamp': '2025-10-08 16:06:36.140208', '_unique_id': '4382bd0425a4463a9bc0eb3db09a742d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.142 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.142 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>]
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.142 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.usage volume: 152633344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.143 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b9fbad6-e2bf-4497-a90b-1e08034af8a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152633344, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.142966', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c40c28c4-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': 'e1d4c74dd3f89875335b9bf004fd50eca5747d2cee0389d1bd9bc8071661c0ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.142966', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c40c3792-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': 'af032acaa21d1ca1ce9e36b9fb9a7696e518487e87e84457bc4de01552c1e755'}]}, 'timestamp': '2025-10-08 16:06:36.143746', '_unique_id': 'af3a3b9179d544e99d0958186d4bad06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.146 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.146 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_basic-237145123>]
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.146 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/cpu volume: 39490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fa45399-b2ee-4c4e-881a-b91bcaf53fed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39490000000, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'timestamp': '2025-10-08T16:06:36.146485', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'c40cb096-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.857254741, 'message_signature': '35847f83e1e478e381d1fc183b1f71d0d262f6011cf7efce7f81020904928057'}]}, 'timestamp': '2025-10-08 16:06:36.146850', '_unique_id': 'cff09a23642441b69e58b5382b9e3fc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.148 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.outgoing.packets volume: 58 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70f5af55-0e9f-45f6-8437-ea842884d22c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 58, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.148689', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40d06ae-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': 'a87e7731c8fe79c3e21951b7bfcdbbaa0e10588c77e664d110ae969999f640cc'}]}, 'timestamp': '2025-10-08 16:06:36.149062', '_unique_id': 'a69127ce33dc4dbba3862b6f17b1fa08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.151 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.allocation volume: 153100288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.151 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e68a9c87-cd64-40ab-979b-8c08401a6066', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153100288, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.150998', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c40d618a-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': '6db9ad72acfee12ba8ffa326cf4a546952237978c32b7c3b4987a4b7cb81f71b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.150998', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c40d6f4a-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.827135719, 'message_signature': 'a3cf33182331ccd39e3a96690def5f905419ea942789a3d2dba02d3b313b17ae'}]}, 'timestamp': '2025-10-08 16:06:36.151751', '_unique_id': 'bcd49020817b4e5186ec28834b9f285f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.153 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb98608e-5025-4167-b5ae-2ce68212a4ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.153903', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40dd2aa-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': 'bbb8a0d7b35b3353c7aa0c6b164d54838c77ed69ed66b4975068be9ccc97919b'}]}, 'timestamp': '2025-10-08 16:06:36.154277', '_unique_id': 'b91d05a2293140e29aea612e9e401cc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.156 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e48684e-45bf-44a8-aee0-da12843ec5b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-0000004e-0bb99735-ce66-4e0e-9084-3ed659692146-tape01a0204-a0', 'timestamp': '2025-10-08T16:06:36.156272', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'tape01a0204-a0', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:98:9e:69', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape01a0204-a0'}, 'message_id': 'c40e2d9a-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.810543219, 'message_signature': 'a33a677422a567d135f7a05bdf3687427139c8a9bc791ca314861d1a39909964'}]}, 'timestamp': '2025-10-08 16:06:36.156592', '_unique_id': '6bf28f604ad84b7d96c3e7088fb17c7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.158 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.latency volume: 5299920470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.158 12 DEBUG ceilometer.compute.pollsters [-] 0bb99735-ce66-4e0e-9084-3ed659692146/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9506aaa8-48f7-4a8b-a17a-c1f6c45d4f1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5299920470, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-vda', 'timestamp': '2025-10-08T16:06:36.158432', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'c40e8542-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '1087423178765ceb9dbf34d4241016722ee9c1d3ffae9c3a2f245c2797b247b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '0bb99735-ce66-4e0e-9084-3ed659692146-sda', 'timestamp': '2025-10-08T16:06:36.158432', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_basic-237145123', 'name': 'instance-0000004e', 'instance_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'c40e91fe-a460-11f0-9274-fa163ef67048', 'monotonic_time': 6499.785422166, 'message_signature': '2a7e6b7498c5c39d5deb0aa1c785b0a79d898830d97cb8959ac08505c6b03ae1'}]}, 'timestamp': '2025-10-08 16:06:36.159188', '_unique_id': '5ebb6d1de5644713a68ff34d84fca2ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:06:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:06:37 np0005476733 nova_compute[192580]: 2025-10-08 16:06:37.178 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:37 np0005476733 nova_compute[192580]: 2025-10-08 16:06:37.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:37 np0005476733 nova_compute[192580]: 2025-10-08 16:06:37.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:06:38 np0005476733 podman[250359]: 2025-10-08 16:06:38.221064328 +0000 UTC m=+0.052029504 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 12:06:40 np0005476733 nova_compute[192580]: 2025-10-08 16:06:40.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:42 np0005476733 nova_compute[192580]: 2025-10-08 16:06:42.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:44 np0005476733 ovn_controller[94857]: 2025-10-08T16:06:44Z|00741|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Oct  8 12:06:45 np0005476733 nova_compute[192580]: 2025-10-08 16:06:45.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:46 np0005476733 podman[250380]: 2025-10-08 16:06:46.227931453 +0000 UTC m=+0.056925900 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:06:46 np0005476733 podman[250379]: 2025-10-08 16:06:46.250839936 +0000 UTC m=+0.078946384 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  8 12:06:47 np0005476733 nova_compute[192580]: 2025-10-08 16:06:47.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:50 np0005476733 nova_compute[192580]: 2025-10-08 16:06:50.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:52 np0005476733 nova_compute[192580]: 2025-10-08 16:06:52.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:53 np0005476733 podman[250426]: 2025-10-08 16:06:53.22265756 +0000 UTC m=+0.051001561 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:06:53 np0005476733 podman[250425]: 2025-10-08 16:06:53.222664991 +0000 UTC m=+0.055547527 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Oct  8 12:06:53 np0005476733 podman[250427]: 2025-10-08 16:06:53.226886035 +0000 UTC m=+0.051590610 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Oct  8 12:06:55 np0005476733 nova_compute[192580]: 2025-10-08 16:06:55.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:57 np0005476733 nova_compute[192580]: 2025-10-08 16:06:57.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:06:57 np0005476733 ovn_controller[94857]: 2025-10-08T16:06:57Z|00742|pinctrl|INFO|Claiming virtual lport d6d636d9-980d-4d5a-8195-7dbf9ae15260 for this chassis with the virtual parent e01a0204-a0c3-4267-bec4-88b5e24e15bd
Oct  8 12:06:57 np0005476733 ovn_controller[94857]: 2025-10-08T16:06:57Z|00743|binding|INFO|Setting lport d6d636d9-980d-4d5a-8195-7dbf9ae15260 up in Southbound
Oct  8 12:07:00 np0005476733 nova_compute[192580]: 2025-10-08 16:07:00.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:01 np0005476733 podman[250494]: 2025-10-08 16:07:01.226143339 +0000 UTC m=+0.047998966 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:07:01 np0005476733 podman[250493]: 2025-10-08 16:07:01.226551832 +0000 UTC m=+0.052528801 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid)
Oct  8 12:07:02 np0005476733 nova_compute[192580]: 2025-10-08 16:07:02.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:05Z|00744|binding|INFO|Releasing lport d6d636d9-980d-4d5a-8195-7dbf9ae15260
Oct  8 12:07:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:05Z|00745|binding|INFO|Setting lport d6d636d9-980d-4d5a-8195-7dbf9ae15260 down in Southbound
Oct  8 12:07:05 np0005476733 nova_compute[192580]: 2025-10-08 16:07:05.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:06 np0005476733 systemd-logind[827]: New session 54 of user zuul.
Oct  8 12:07:06 np0005476733 systemd[1]: Started Session 54 of User zuul.
Oct  8 12:07:07 np0005476733 nova_compute[192580]: 2025-10-08 16:07:07.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:09 np0005476733 podman[250567]: 2025-10-08 16:07:09.232972932 +0000 UTC m=+0.058757329 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  8 12:07:10 np0005476733 nova_compute[192580]: 2025-10-08 16:07:10.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:11 np0005476733 systemd-logind[827]: New session 55 of user zuul.
Oct  8 12:07:11 np0005476733 systemd[1]: Started Session 55 of User zuul.
Oct  8 12:07:12 np0005476733 systemd[1]: session-55.scope: Deactivated successfully.
Oct  8 12:07:12 np0005476733 systemd-logind[827]: Session 55 logged out. Waiting for processes to exit.
Oct  8 12:07:12 np0005476733 systemd-logind[827]: Removed session 55.
Oct  8 12:07:12 np0005476733 nova_compute[192580]: 2025-10-08 16:07:12.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:14 np0005476733 nova_compute[192580]: 2025-10-08 16:07:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:15.531 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:07:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:15.532 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:07:15 np0005476733 nova_compute[192580]: 2025-10-08 16:07:15.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:15 np0005476733 nova_compute[192580]: 2025-10-08 16:07:15.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.745 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.746 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.746 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.746 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.746 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.747 2 INFO nova.compute.manager [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Terminating instance#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.748 2 DEBUG nova.compute.manager [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:07:16 np0005476733 kernel: tape01a0204-a0 (unregistering): left promiscuous mode
Oct  8 12:07:16 np0005476733 NetworkManager[51699]: <info>  [1759939636.7713] device (tape01a0204-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:07:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:16Z|00746|binding|INFO|Releasing lport e01a0204-a0c3-4267-bec4-88b5e24e15bd from this chassis (sb_readonly=0)
Oct  8 12:07:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:16Z|00747|binding|INFO|Setting lport e01a0204-a0c3-4267-bec4-88b5e24e15bd down in Southbound
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:16Z|00748|binding|INFO|Removing iface tape01a0204-a0 ovn-installed in OVS
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:16.786 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:9e:69 192.168.0.238'], port_security=['fa:16:3e:98:9e:69 192.168.0.238 192.168.0.59'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.238/24', 'neutron:device_id': '0bb99735-ce66-4e0e-9084-3ed659692146', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d072591a-0382-43cd-8966-59eb24fe6dd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '5', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.210'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528be15d-7e7e-49a4-9215-203e9b098cff, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e01a0204-a0c3-4267-bec4-88b5e24e15bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:07:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:16.788 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e01a0204-a0c3-4267-bec4-88b5e24e15bd in datapath d072591a-0382-43cd-8966-59eb24fe6dd1 unbound from our chassis#033[00m
Oct  8 12:07:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:16.790 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d072591a-0382-43cd-8966-59eb24fe6dd1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:07:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:16.794 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73c41bba-fe0d-4bcd-8468-3f624366b238]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:16.796 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1 namespace which is not needed anymore#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Oct  8 12:07:16 np0005476733 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000004e.scope: Consumed 44.247s CPU time.
Oct  8 12:07:16 np0005476733 systemd-machined[152624]: Machine qemu-47-instance-0000004e terminated.
Oct  8 12:07:16 np0005476733 podman[250621]: 2025-10-08 16:07:16.884231201 +0000 UTC m=+0.065450093 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:07:16 np0005476733 podman[250617]: 2025-10-08 16:07:16.910907533 +0000 UTC m=+0.101428593 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:07:16 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [NOTICE]   (249780) : haproxy version is 2.8.14-c23fe91
Oct  8 12:07:16 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [NOTICE]   (249780) : path to executable is /usr/sbin/haproxy
Oct  8 12:07:16 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [WARNING]  (249780) : Exiting Master process...
Oct  8 12:07:16 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [ALERT]    (249780) : Current worker (249782) exited with code 143 (Terminated)
Oct  8 12:07:16 np0005476733 neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1[249776]: [WARNING]  (249780) : All workers exited. Exiting... (0)
Oct  8 12:07:16 np0005476733 systemd[1]: libpod-f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da.scope: Deactivated successfully.
Oct  8 12:07:16 np0005476733 podman[250684]: 2025-10-08 16:07:16.941676077 +0000 UTC m=+0.047927273 container died f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:07:16 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da-userdata-shm.mount: Deactivated successfully.
Oct  8 12:07:16 np0005476733 systemd[1]: var-lib-containers-storage-overlay-e6cafb77de68f7a45632b5660803d3e80fb0b7c63eb5a99627d8fe2246634f83-merged.mount: Deactivated successfully.
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 nova_compute[192580]: 2025-10-08 16:07:16.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:16 np0005476733 podman[250684]: 2025-10-08 16:07:16.982602296 +0000 UTC m=+0.088853492 container cleanup f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:07:16 np0005476733 systemd[1]: libpod-conmon-f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da.scope: Deactivated successfully.
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.013 2 INFO nova.virt.libvirt.driver [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Instance destroyed successfully.#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.014 2 DEBUG nova.objects.instance [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'resources' on Instance uuid 0bb99735-ce66-4e0e-9084-3ed659692146 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.033 2 DEBUG nova.virt.libvirt.vif [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:05:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_basic-237145123',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-basic-237145123',id=78,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:05:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-a50k1nty',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:05:17Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=0bb99735-ce66-4e0e-9084-3ed659692146,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.034 2 DEBUG nova.network.os_vif_util [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "address": "fa:16:3e:98:9e:69", "network": {"id": "d072591a-0382-43cd-8966-59eb24fe6dd1", "bridge": "br-int", "label": "tempest-test-network--231794910", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71bd615ba6694cba8794c8eb5dadbe81", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape01a0204-a0", "ovs_interfaceid": "e01a0204-a0c3-4267-bec4-88b5e24e15bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.034 2 DEBUG nova.network.os_vif_util [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.035 2 DEBUG os_vif [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape01a0204-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:07:17 np0005476733 podman[250731]: 2025-10-08 16:07:17.044316948 +0000 UTC m=+0.039037969 container remove f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.047 2 INFO os_vif [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:9e:69,bridge_name='br-int',has_traffic_filtering=True,id=e01a0204-a0c3-4267-bec4-88b5e24e15bd,network=Network(d072591a-0382-43cd-8966-59eb24fe6dd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape01a0204-a0')#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.048 2 INFO nova.virt.libvirt.driver [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Deleting instance files /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146_del#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.048 2 INFO nova.virt.libvirt.driver [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Deletion of /var/lib/nova/instances/0bb99735-ce66-4e0e-9084-3ed659692146_del complete#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.051 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[532bccab-22d7-40f3-9ee2-7e427fb4fbc5]: (4, ('Wed Oct  8 04:07:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1 (f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da)\nf043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da\nWed Oct  8 04:07:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1 (f043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da)\nf043f999fd47ce72b7d3089b56c6ca6b51b409df7371c9fa6fc861ae8f73f7da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.053 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6ab9d6-80a9-434d-ab88-6721d889c4ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.053 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd072591a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:17 np0005476733 kernel: tapd072591a-00: left promiscuous mode
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.069 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[abf8d988-f147-4aa2-af22-da3e63cd5dfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.101 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b21dc4-16b6-49b0-b104-8e00ad1d0592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.103 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d6718744-ab10-400a-85aa-b66aef774e32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.109 2 INFO nova.compute.manager [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.111 2 DEBUG oslo.service.loopingcall [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.111 2 DEBUG nova.compute.manager [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.111 2 DEBUG nova.network.neutron [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.120 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[64c9e49f-299c-4750-9077-9b67b388641c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642005, 'reachable_time': 30818, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250749, 'error': None, 'target': 'ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 systemd[1]: run-netns-ovnmeta\x2dd072591a\x2d0382\x2d43cd\x2d8966\x2d59eb24fe6dd1.mount: Deactivated successfully.
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.128 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d072591a-0382-43cd-8966-59eb24fe6dd1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:07:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:17.129 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f48a7f-2e0f-4c80-95ea-dfb733b69b46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.493 2 DEBUG nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-unplugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.494 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.494 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.494 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.494 2 DEBUG nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] No waiting events found dispatching network-vif-unplugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.494 2 DEBUG nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-unplugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 DEBUG nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 DEBUG oslo_concurrency.lockutils [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 DEBUG nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] No waiting events found dispatching network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:07:17 np0005476733 nova_compute[192580]: 2025-10-08 16:07:17.495 2 WARNING nova.compute.manager [req-6d4c3df8-ba97-46f5-a297-156ae9c33575 req-b50a9ee0-8909-4662-bae9-7df83a560991 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received unexpected event network-vif-plugged-e01a0204-a0c3-4267-bec4-88b5e24e15bd for instance with vm_state active and task_state deleting.#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.771 2 DEBUG nova.network.neutron [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.814 2 INFO nova.compute.manager [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Took 1.70 seconds to deallocate network for instance.#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.883 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.884 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.913 2 DEBUG nova.compute.manager [req-a85fbb4f-6ff4-4682-ba6d-015b290065bd req-33061059-8bfa-44fd-8b17-5f044fc263c5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Received event network-vif-deleted-e01a0204-a0c3-4267-bec4-88b5e24e15bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.946 2 DEBUG nova.compute.provider_tree [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.967 2 DEBUG nova.scheduler.client.report [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:07:18 np0005476733 nova_compute[192580]: 2025-10-08 16:07:18.991 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:19 np0005476733 nova_compute[192580]: 2025-10-08 16:07:19.038 2 INFO nova.scheduler.client.report [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Deleted allocations for instance 0bb99735-ce66-4e0e-9084-3ed659692146#033[00m
Oct  8 12:07:19 np0005476733 nova_compute[192580]: 2025-10-08 16:07:19.142 2 DEBUG oslo_concurrency.lockutils [None req-10565fd7-a179-48c1-8ca0-f37efa78991c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "0bb99735-ce66-4e0e-9084-3ed659692146" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:20Z|00749|pinctrl|WARN|Dropped 335 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:07:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:20Z|00750|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:07:20 np0005476733 nova_compute[192580]: 2025-10-08 16:07:20.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:21.534 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:21 np0005476733 nova_compute[192580]: 2025-10-08 16:07:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:21 np0005476733 nova_compute[192580]: 2025-10-08 16:07:21.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:07:21 np0005476733 nova_compute[192580]: 2025-10-08 16:07:21.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:07:21 np0005476733 nova_compute[192580]: 2025-10-08 16:07:21.613 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:07:22 np0005476733 nova_compute[192580]: 2025-10-08 16:07:22.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:22 np0005476733 nova_compute[192580]: 2025-10-08 16:07:22.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:23 np0005476733 nova_compute[192580]: 2025-10-08 16:07:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:24 np0005476733 podman[250752]: 2025-10-08 16:07:24.239621865 +0000 UTC m=+0.058198231 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter)
Oct  8 12:07:24 np0005476733 podman[250751]: 2025-10-08 16:07:24.240680259 +0000 UTC m=+0.058174710 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:07:24 np0005476733 podman[250750]: 2025-10-08 16:07:24.248058715 +0000 UTC m=+0.065134502 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:07:24 np0005476733 nova_compute[192580]: 2025-10-08 16:07:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:24 np0005476733 nova_compute[192580]: 2025-10-08 16:07:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:26.361 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:26.362 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:26.362 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.453 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.454 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.482 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.557 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.558 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.567 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.567 2 INFO nova.compute.claims [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.725 2 DEBUG nova.compute.provider_tree [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.761 2 DEBUG nova.scheduler.client.report [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.821 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.822 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.885 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.886 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.925 2 INFO nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:07:27 np0005476733 nova_compute[192580]: 2025-10-08 16:07:27.955 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.147 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.148 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.149 2 INFO nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Creating image(s)#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.149 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.149 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.150 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.164 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.260 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.261 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.262 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.276 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.338 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.339 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.638 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk 10737418240" returned: 0 in 0.299s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.639 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.640 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.716 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.717 2 DEBUG nova.objects.instance [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'migration_context' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.754 2 DEBUG nova.policy [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.814 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.814 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Ensure instance console log exists: /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.814 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.815 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:28 np0005476733 nova_compute[192580]: 2025-10-08 16:07:28.815 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.262 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Successfully created port: 43b9af24-0a3b-4a87-8883-35ff0783ea2c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.634 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:07:31 np0005476733 podman[250829]: 2025-10-08 16:07:31.742668868 +0000 UTC m=+0.061936621 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  8 12:07:31 np0005476733 podman[250830]: 2025-10-08 16:07:31.744392014 +0000 UTC m=+0.062452018 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.820 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.822 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13767MB free_disk=111.33185577392578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.822 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:31 np0005476733 nova_compute[192580]: 2025-10-08 16:07:31.822 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.012 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759939637.0111356, 0bb99735-ce66-4e0e-9084-3ed659692146 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.013 2 INFO nova.compute.manager [-] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.090 2 DEBUG nova.compute.manager [None req-098580e2-b443-4819-bac2-10f1d39969ee - - - - - -] [instance: 0bb99735-ce66-4e0e-9084-3ed659692146] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.750 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7187c34b-929f-4f25-a15b-f6294b5087bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.750 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:07:32 np0005476733 nova_compute[192580]: 2025-10-08 16:07:32.751 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.034 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Successfully updated port: 43b9af24-0a3b-4a87-8883-35ff0783ea2c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.077 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.078 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.078 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.162 2 DEBUG nova.compute.manager [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-changed-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.162 2 DEBUG nova.compute.manager [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Refreshing instance network info cache due to event network-changed-43b9af24-0a3b-4a87-8883-35ff0783ea2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.162 2 DEBUG oslo_concurrency.lockutils [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.241 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.585 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.621 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.665 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:07:33 np0005476733 nova_compute[192580]: 2025-10-08 16:07:33.666 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.376 2 DEBUG nova.network.neutron [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.457 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.457 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Instance network_info: |[{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.458 2 DEBUG oslo_concurrency.lockutils [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.458 2 DEBUG nova.network.neutron [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Refreshing network info cache for port 43b9af24-0a3b-4a87-8883-35ff0783ea2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.461 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Start _get_guest_xml network_info=[{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.464 2 WARNING nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.468 2 DEBUG nova.virt.libvirt.host [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.468 2 DEBUG nova.virt.libvirt.host [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.471 2 DEBUG nova.virt.libvirt.host [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.472 2 DEBUG nova.virt.libvirt.host [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.472 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.472 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.473 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.473 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.473 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.473 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.473 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.474 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.474 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.474 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.474 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.474 2 DEBUG nova.virt.hardware [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.478 2 DEBUG nova.virt.libvirt.vif [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:07:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_external_network-1149265317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-external-network-1149265317',id=80,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-wspb073o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:07:28Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=7187c34b-929f-4f25-a15b-f6294b5087bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.478 2 DEBUG nova.network.os_vif_util [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.479 2 DEBUG nova.network.os_vif_util [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.480 2 DEBUG nova.objects.instance [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.633 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <uuid>7187c34b-929f-4f25-a15b-f6294b5087bc</uuid>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <name>instance-00000050</name>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dvr_vip_failover_external_network-1149265317</nova:name>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:07:34</nova:creationTime>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:user uuid="000a8d1cd17e4a4c8398ef814dd4db2d">tempest-OvnDvrAdvancedTest-1107478320-project-member</nova:user>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:project uuid="71bd615ba6694cba8794c8eb5dadbe81">tempest-OvnDvrAdvancedTest-1107478320</nova:project>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        <nova:port uuid="43b9af24-0a3b-4a87-8883-35ff0783ea2c">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.250" ipVersion="4"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="serial">7187c34b-929f-4f25-a15b-f6294b5087bc</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="uuid">7187c34b-929f-4f25-a15b-f6294b5087bc</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.config"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:43:cd:2e"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <target dev="tap43b9af24-0a"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/console.log" append="off"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:07:34 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:07:34 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:07:34 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:07:34 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.635 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Preparing to wait for external event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.635 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.635 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.635 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.636 2 DEBUG nova.virt.libvirt.vif [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:07:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_external_network-1149265317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-external-network-1149265317',id=80,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-wspb073o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:07:28Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=7187c34b-929f-4f25-a15b-f6294b5087bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.636 2 DEBUG nova.network.os_vif_util [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.637 2 DEBUG nova.network.os_vif_util [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.637 2 DEBUG os_vif [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43b9af24-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43b9af24-0a, col_values=(('external_ids', {'iface-id': '43b9af24-0a3b-4a87-8883-35ff0783ea2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:cd:2e', 'vm-uuid': '7187c34b-929f-4f25-a15b-f6294b5087bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:34 np0005476733 NetworkManager[51699]: <info>  [1759939654.6437] manager: (tap43b9af24-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.648 2 INFO os_vif [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a')#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.867 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.868 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.868 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No VIF found with MAC fa:16:3e:43:cd:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:07:34 np0005476733 nova_compute[192580]: 2025-10-08 16:07:34.869 2 INFO nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Using config drive#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.235 2 INFO nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Creating config drive at /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.config#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.240 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppk30r_l2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.373 2 DEBUG oslo_concurrency.processutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppk30r_l2" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:07:35 np0005476733 kernel: tap43b9af24-0a: entered promiscuous mode
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.4381] manager: (tap43b9af24-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/251)
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:35Z|00751|binding|INFO|Claiming lport 43b9af24-0a3b-4a87-8883-35ff0783ea2c for this chassis.
Oct  8 12:07:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:35Z|00752|binding|INFO|43b9af24-0a3b-4a87-8883-35ff0783ea2c: Claiming fa:16:3e:43:cd:2e 192.168.122.250
Oct  8 12:07:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:35Z|00753|binding|INFO|Setting lport 43b9af24-0a3b-4a87-8883-35ff0783ea2c ovn-installed in OVS
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:35Z|00754|binding|INFO|Setting lport 43b9af24-0a3b-4a87-8883-35ff0783ea2c up in Southbound
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.460 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:cd:2e 192.168.122.250'], port_security=['fa:16:3e:43:cd:2e 192.168.122.250'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.250/24', 'neutron:device_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=43b9af24-0a3b-4a87-8883-35ff0783ea2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.461 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 43b9af24-0a3b-4a87-8883-35ff0783ea2c in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.462 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:07:35 np0005476733 systemd-udevd[250893]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:07:35 np0005476733 systemd-machined[152624]: New machine qemu-48-instance-00000050.
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.4788] device (tap43b9af24-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.480 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7d44e668-96ad-449e-999a-8ebd018b2eac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.482 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.4838] device (tap43b9af24-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:07:35 np0005476733 systemd[1]: Started Virtual Machine qemu-48-instance-00000050.
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.484 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.484 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c4199b15-0bd3-4a92-8d3e-f6740f4aa61a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.486 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a64f22-36a3-4034-80af-44777862e43e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.497 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[848ccda2-2526-4628-b4fa-ba823277fcb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.522 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55e484f4-0639-4d13-bf8f-d9aecee4327a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.554 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[19ded53e-75f2-4bef-b329-63e23512d57b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.559 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cc38bb1e-2ea3-4adb-9dfa-1e301c3006f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.5608] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/252)
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.593 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[47a7adf3-b5fb-45f6-875d-07dc7600524f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.596 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[043c2a3b-7640-403c-9783-3cb7fb00fd46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.6198] device (tap81c575b5-a0): carrier: link connected
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.626 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[aa02d2f9-b8ab-44f1-883b-0da67c63f335]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.645 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[55083e0b-91d2-4c00-bff5-cb300015f6b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655928, 'reachable_time': 32413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250927, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.662 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc9bc58-c324-4070-bcb6-4420d989680f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655928, 'tstamp': 655928}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250928, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.682 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff7df238-8ae1-4357-aa98-f59f23585d54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655928, 'reachable_time': 32413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250929, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.718 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a88426d2-124b-4cec-8f61-1f8dabca003b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.787 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fa0bf3-f25e-45f2-a21e-72058e9c6d25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.789 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.789 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.789 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 NetworkManager[51699]: <info>  [1759939655.7937] manager: (tap81c575b5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Oct  8 12:07:35 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.796 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:07:35Z|00755|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.798 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.799 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5494866d-bba6-4bd4-9eb3-726e74e38620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.800 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:07:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:07:35.802 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.973 2 DEBUG nova.compute.manager [req-b67112d8-9365-42a5-b45f-72e9899a0d38 req-15896785-70c5-464a-8fe8-da1f2ed2a111 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.973 2 DEBUG oslo_concurrency.lockutils [req-b67112d8-9365-42a5-b45f-72e9899a0d38 req-15896785-70c5-464a-8fe8-da1f2ed2a111 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.973 2 DEBUG oslo_concurrency.lockutils [req-b67112d8-9365-42a5-b45f-72e9899a0d38 req-15896785-70c5-464a-8fe8-da1f2ed2a111 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.973 2 DEBUG oslo_concurrency.lockutils [req-b67112d8-9365-42a5-b45f-72e9899a0d38 req-15896785-70c5-464a-8fe8-da1f2ed2a111 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:35 np0005476733 nova_compute[192580]: 2025-10-08 16:07:35.974 2 DEBUG nova.compute.manager [req-b67112d8-9365-42a5-b45f-72e9899a0d38 req-15896785-70c5-464a-8fe8-da1f2ed2a111 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Processing event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.109 2 DEBUG nova.network.neutron [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updated VIF entry in instance network info cache for port 43b9af24-0a3b-4a87-8883-35ff0783ea2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.111 2 DEBUG nova.network.neutron [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.166 2 DEBUG oslo_concurrency.lockutils [req-491488f0-09b5-4fcc-8203-f443e584b4f6 req-1f4187f6-f427-4c92-9f2b-a7660656e6cc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:07:36 np0005476733 podman[250966]: 2025-10-08 16:07:36.183942831 +0000 UTC m=+0.054211864 container create c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:07:36 np0005476733 systemd[1]: Started libpod-conmon-c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805.scope.
Oct  8 12:07:36 np0005476733 podman[250966]: 2025-10-08 16:07:36.152019551 +0000 UTC m=+0.022288564 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:07:36 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:07:36 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0eea9b1a3d86fd20e7c2d1d3c118ff2807a8746210e9c8189bafdd4c12e5f57e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:07:36 np0005476733 podman[250966]: 2025-10-08 16:07:36.291339384 +0000 UTC m=+0.161608437 container init c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:07:36 np0005476733 podman[250966]: 2025-10-08 16:07:36.299232996 +0000 UTC m=+0.169501999 container start c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 12:07:36 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [NOTICE]   (250985) : New worker (250987) forked
Oct  8 12:07:36 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [NOTICE]   (250985) : Loading success.
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.331 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.332 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939656.3305144, 7187c34b-929f-4f25-a15b-f6294b5087bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.332 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] VM Started (Lifecycle Event)#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.340 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.346 2 INFO nova.virt.libvirt.driver [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Instance spawned successfully.#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.346 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.363 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.372 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.375 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.375 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.376 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.376 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.377 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.377 2 DEBUG nova.virt.libvirt.driver [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.392 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.393 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939656.3308427, 7187c34b-929f-4f25-a15b-f6294b5087bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.393 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.418 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.421 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939656.3403594, 7187c34b-929f-4f25-a15b-f6294b5087bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.421 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.442 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.445 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.448 2 INFO nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Took 8.30 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.449 2 DEBUG nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.482 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.527 2 INFO nova.compute.manager [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Took 9.00 seconds to build instance.#033[00m
Oct  8 12:07:36 np0005476733 nova_compute[192580]: 2025-10-08 16:07:36.550 2 DEBUG oslo_concurrency.lockutils [None req-175dd8ee-0946-413c-8dfd-387182e0479e 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:37 np0005476733 nova_compute[192580]: 2025-10-08 16:07:37.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.079 2 DEBUG nova.compute.manager [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.079 2 DEBUG oslo_concurrency.lockutils [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.079 2 DEBUG oslo_concurrency.lockutils [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.079 2 DEBUG oslo_concurrency.lockutils [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.080 2 DEBUG nova.compute.manager [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] No waiting events found dispatching network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.080 2 WARNING nova.compute.manager [req-6949fab1-4d44-42f1-8c12-7021b6e46c3d req-5f6d1634-c30e-408a-b5e6-af6b63bd75a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received unexpected event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c for instance with vm_state active and task_state None.#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.665 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.851 2 INFO nova.compute.manager [None req-b1474fd2-6477-4d3d-8f6e-5045bb59a3a8 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:07:38 np0005476733 nova_compute[192580]: 2025-10-08 16:07:38.856 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:07:39 np0005476733 nova_compute[192580]: 2025-10-08 16:07:39.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:40 np0005476733 podman[250996]: 2025-10-08 16:07:40.219961701 +0000 UTC m=+0.050492265 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 12:07:42 np0005476733 nova_compute[192580]: 2025-10-08 16:07:42.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:44 np0005476733 nova_compute[192580]: 2025-10-08 16:07:44.216 2 INFO nova.compute.manager [None req-26beac32-e673-4f2f-b9c8-e3e53f4f0578 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:07:44 np0005476733 nova_compute[192580]: 2025-10-08 16:07:44.223 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:07:44 np0005476733 nova_compute[192580]: 2025-10-08 16:07:44.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:47 np0005476733 podman[251014]: 2025-10-08 16:07:47.24077178 +0000 UTC m=+0.063845151 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  8 12:07:47 np0005476733 podman[251013]: 2025-10-08 16:07:47.27799509 +0000 UTC m=+0.101669501 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:07:47 np0005476733 nova_compute[192580]: 2025-10-08 16:07:47.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:49 np0005476733 nova_compute[192580]: 2025-10-08 16:07:49.548 2 INFO nova.compute.manager [None req-c7d18399-5bbf-4aa7-8467-3b43b8e6bde7 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:07:49 np0005476733 nova_compute[192580]: 2025-10-08 16:07:49.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:52 np0005476733 nova_compute[192580]: 2025-10-08 16:07:52.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:54 np0005476733 nova_compute[192580]: 2025-10-08 16:07:54.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:54 np0005476733 nova_compute[192580]: 2025-10-08 16:07:54.783 2 INFO nova.compute.manager [None req-fa60c397-3d82-45b5-b2cf-88a795412498 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:07:54 np0005476733 nova_compute[192580]: 2025-10-08 16:07:54.790 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:07:55 np0005476733 podman[251067]: 2025-10-08 16:07:55.239906489 +0000 UTC m=+0.057634752 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:07:55 np0005476733 podman[251066]: 2025-10-08 16:07:55.252361338 +0000 UTC m=+0.071908860 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:07:55 np0005476733 podman[251068]: 2025-10-08 16:07:55.270917741 +0000 UTC m=+0.075691631 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Oct  8 12:07:57 np0005476733 nova_compute[192580]: 2025-10-08 16:07:57.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:07:59 np0005476733 nova_compute[192580]: 2025-10-08 16:07:59.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:00 np0005476733 nova_compute[192580]: 2025-10-08 16:08:00.428 2 INFO nova.compute.manager [None req-cd0b4cce-86b2-4eff-84b4-36adec8ece2c 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:08:00 np0005476733 nova_compute[192580]: 2025-10-08 16:08:00.433 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:08:02 np0005476733 podman[251138]: 2025-10-08 16:08:02.244043156 +0000 UTC m=+0.058900984 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:08:02 np0005476733 podman[251137]: 2025-10-08 16:08:02.247468565 +0000 UTC m=+0.072043773 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:08:02 np0005476733 nova_compute[192580]: 2025-10-08 16:08:02.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:04 np0005476733 nova_compute[192580]: 2025-10-08 16:08:04.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:05Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:cd:2e 192.168.122.250
Oct  8 12:08:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:05Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:cd:2e 192.168.122.250
Oct  8 12:08:05 np0005476733 nova_compute[192580]: 2025-10-08 16:08:05.600 2 INFO nova.compute.manager [None req-1e3dc116-f9e3-4b0b-a589-8636a2024603 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:08:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:05Z|00756|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Oct  8 12:08:07 np0005476733 nova_compute[192580]: 2025-10-08 16:08:07.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:09 np0005476733 nova_compute[192580]: 2025-10-08 16:08:09.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:10 np0005476733 nova_compute[192580]: 2025-10-08 16:08:10.817 2 INFO nova.compute.manager [None req-925d3319-dfdb-437c-af68-902e859d580d 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:08:10 np0005476733 nova_compute[192580]: 2025-10-08 16:08:10.829 2 INFO nova.virt.libvirt.driver [None req-925d3319-dfdb-437c-af68-902e859d580d 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Truncated console log returned, 3094 bytes ignored#033[00m
Oct  8 12:08:11 np0005476733 podman[251181]: 2025-10-08 16:08:11.255192871 +0000 UTC m=+0.076870598 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:08:12 np0005476733 nova_compute[192580]: 2025-10-08 16:08:12.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:14 np0005476733 nova_compute[192580]: 2025-10-08 16:08:14.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:15 np0005476733 nova_compute[192580]: 2025-10-08 16:08:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:16 np0005476733 nova_compute[192580]: 2025-10-08 16:08:16.028 2 INFO nova.compute.manager [None req-c5cc5455-54d3-4820-9538-aca83b3e5e19 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Get console output#033[00m
Oct  8 12:08:16 np0005476733 nova_compute[192580]: 2025-10-08 16:08:16.035 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:08:16 np0005476733 nova_compute[192580]: 2025-10-08 16:08:16.041 2 INFO nova.virt.libvirt.driver [None req-c5cc5455-54d3-4820-9538-aca83b3e5e19 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Truncated console log returned, 3387 bytes ignored#033[00m
Oct  8 12:08:17 np0005476733 nova_compute[192580]: 2025-10-08 16:08:17.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:18 np0005476733 podman[251224]: 2025-10-08 16:08:18.238235172 +0000 UTC m=+0.061759655 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 12:08:18 np0005476733 podman[251223]: 2025-10-08 16:08:18.293840149 +0000 UTC m=+0.111007128 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 12:08:18 np0005476733 nova_compute[192580]: 2025-10-08 16:08:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:19 np0005476733 nova_compute[192580]: 2025-10-08 16:08:19.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:22 np0005476733 nova_compute[192580]: 2025-10-08 16:08:22.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:22 np0005476733 nova_compute[192580]: 2025-10-08 16:08:22.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:23 np0005476733 nova_compute[192580]: 2025-10-08 16:08:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:23 np0005476733 nova_compute[192580]: 2025-10-08 16:08:23.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:08:23 np0005476733 nova_compute[192580]: 2025-10-08 16:08:23.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.047 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.048 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.049 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.049 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:24Z|00757|pinctrl|WARN|Dropped 233 log messages in last 64 seconds (most recently, 7 seconds ago) due to excessive rate
Oct  8 12:08:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:24Z|00758|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:08:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:24.877 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:08:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:24.878 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:08:24 np0005476733 nova_compute[192580]: 2025-10-08 16:08:24.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.067 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.098 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.099 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.099 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.100 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:26 np0005476733 nova_compute[192580]: 2025-10-08 16:08:26.100 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:08:26 np0005476733 podman[251270]: 2025-10-08 16:08:26.250515272 +0000 UTC m=+0.071198136 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:08:26 np0005476733 podman[251269]: 2025-10-08 16:08:26.252179946 +0000 UTC m=+0.075650379 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:08:26 np0005476733 podman[251271]: 2025-10-08 16:08:26.271193273 +0000 UTC m=+0.082632553 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Oct  8 12:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:26.361 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:26.362 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:26.362 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:08:27 np0005476733 nova_compute[192580]: 2025-10-08 16:08:27.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:29 np0005476733 nova_compute[192580]: 2025-10-08 16:08:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:29 np0005476733 nova_compute[192580]: 2025-10-08 16:08:29.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.624 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:08:32 np0005476733 podman[251329]: 2025-10-08 16:08:32.739149173 +0000 UTC m=+0.063788811 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.741 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:08:32 np0005476733 podman[251331]: 2025-10-08 16:08:32.756575649 +0000 UTC m=+0.068231801 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.812 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.813 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:08:32 np0005476733 nova_compute[192580]: 2025-10-08 16:08:32.873 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.015 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.016 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12976MB free_disk=111.1888427734375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.016 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.016 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.137 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7187c34b-929f-4f25-a15b-f6294b5087bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.138 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.138 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.229 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.261 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.318 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:08:33 np0005476733 nova_compute[192580]: 2025-10-08 16:08:33.319 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:08:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:08:33.880 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:08:34 np0005476733 nova_compute[192580]: 2025-10-08 16:08:34.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.059 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000050', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71bd615ba6694cba8794c8eb5dadbe81', 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'hostId': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.076 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.bytes volume: 327759360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.076 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba403852-f8e8-4539-9a85-d5654a75ed84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 327759360, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.060603', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b888cec-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '163c505f57faf200b722000178b6793f3992dc61ca495a1be3f5081c099b367d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.060603', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b889a2a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '8f4d0d0edd2da647481f0bae45f82116ed79c6fd1c46751d92017930a0a331b5'}]}, 'timestamp': '2025-10-08 16:08:36.077151', '_unique_id': 'cfa4b56da4bc465a8859710d25ce707d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.079 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.bytes volume: 135545856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.079 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3179e9c-88e6-41f0-a2d8-2a8999363500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135545856, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.079578', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b890780-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': 'f608e1b816c0a81195b2b01793d2c9c071ef020bfe17bafddc5ada018480a6df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.079578', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b891540-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '474de4a4787a54c3e676aa10983e78441063e8bc42c08375fe2ee317fdc4b34e'}]}, 'timestamp': '2025-10-08 16:08:36.080326', '_unique_id': 'd1b3872361fc44599202a286fc8c73f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.081 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>]
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.086 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7187c34b-929f-4f25-a15b-f6294b5087bc / tap43b9af24-0a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.086 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa6bdeb-f877-4f8e-8f31-08335f4fb5f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.082988', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8a16a2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '78f66f44a961605e563d01322f21a8169f08854efce75c2ff020c6fe7710c037'}]}, 'timestamp': '2025-10-08 16:08:36.086922', '_unique_id': 'c8d91d386c9a4923b3c108af43d60b3a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4469c024-94d2-4116-83c8-820bec50aff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.089138', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8a7ab6-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': 'bedbddec04f6a5e6d3af8b1a4d64f1709a6ca4a4bcc8320643b1915833ce4143'}]}, 'timestamp': '2025-10-08 16:08:36.089424', '_unique_id': '4c2058ea67044ec68c4ee626eb50aab2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.090 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.requests volume: 707 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02bc90da-fa26-48ac-b626-beb3a6bc840a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 707, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.090922', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b8ac12e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': 'e9e72280bb9026ffbd0046348d9622be7805788998614a1899fb1b3fee0d5307'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.090922', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b8acbe2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '148e7e8f71f5af98a572c24e21d8cb37fd3f9ba065e7bc760ddb21ff09e2e972'}]}, 'timestamp': '2025-10-08 16:08:36.091487', '_unique_id': '70d86b601cb64a35974c81f0463bd559'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.092 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60605cc7-a40d-4a06-9fe4-409f135f8404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.092970', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8b10a2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '2c5414b728e5d06530f3c12b5e416345f71f901eee52c859cb11b461d7e68ab5'}]}, 'timestamp': '2025-10-08 16:08:36.093256', '_unique_id': '6df90cca1bba47cf9796e35593d64c70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.093 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.103 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.usage volume: 152567808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.104 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ed35d6d-6753-4c5d-8e62-c0acafcc56f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152567808, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.094676', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b8cc3ac-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': '4f4e4f19e1bb7a5106fbf311a0424ca0c4bde7d94b49a5dc57b91a311fbfdbbd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.094676', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b8ccf14-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': 'eed442922e66e30961151d995e68869a1d66e602f48e498040a0369f5625dd8f'}]}, 'timestamp': '2025-10-08 16:08:36.104689', '_unique_id': '740f946e0e01426f9a2ad846592b0a83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.106 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.bytes volume: 3722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b7f59a3-b138-48b5-91ab-4ab0941bde5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3722, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.106710', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8d28d8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '2b0d415654494084b41af3211887b248777b02a70635af7311f4771e4bbf285f'}]}, 'timestamp': '2025-10-08 16:08:36.106983', '_unique_id': '7a43310be7cd4ea7aa9422950f81dfe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.108 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.108 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '831dfb26-c254-4241-82e3-f3d518944725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.108512', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b8d6ee2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': 'a40ad2a7cf96a4ce982828dfa8ddeba9aaca5f33bce4a786d20e62afcf3b29d3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.108512', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b8d7ffe-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': '749d0db7df8baa7a3993fb922b23dbaf87dc77ed469f1bd47841145564bf6d39'}]}, 'timestamp': '2025-10-08 16:08:36.109208', '_unique_id': '9d3c6db920a94f55bb3b351cb320dbd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>]
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e853b949-ccbe-4c68-9e29-4433905ea57e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.111116', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8dd4e0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': 'b6de25b3ab585e880ea49a0b8a52c09f93294946c0ce0fffdac480f85c90fbce'}]}, 'timestamp': '2025-10-08 16:08:36.111388', '_unique_id': '6f9b8c228775402892ea831d4a4f7c7b'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.112 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.112 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>]
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56647252-54e3-4551-97a2-61d2ecca3316', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.113230', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8e2710-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '743862785bd0f6382b306bd1077f15be485947cf7f97012e4e3965b9be66f76a'}]}, 'timestamp': '2025-10-08 16:08:36.113491', '_unique_id': '5883601f4b2e4288a3e5152bc70d1b81'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.114 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.latency volume: 16179075266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.latency volume: 72777917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54ecbba7-88e5-48eb-a2ca-566f381d0151', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16179075266, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.114908', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b8e689c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '8f3ed3fc62cf508cf7aec156840df377363af405e0c7d5e25653bda150b3fb85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 72777917, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.114908', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b8e7620-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '699b621c3b92f5e4c3b3dedabd120103f61a468bee891d180eb63f0e0be9e9c4'}]}, 'timestamp': '2025-10-08 16:08:36.115503', '_unique_id': '28f5a48a691f4f9784687731fb2b3161'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.116 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6427781c-cdba-4464-aa18-2b556a7fd75a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.116974', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b8eba72-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '4048724fadc815c282db71a0433cbaa3f9a016af4a705f8b29fd9855ac09f808'}]}, 'timestamp': '2025-10-08 16:08:36.117263', '_unique_id': '15872ddd91374973aebf810c0e9c517a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.118 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.latency volume: 54594221220 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.118 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37d3582a-1977-4a03-9a41-f6af8bba2693', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54594221220, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.118670', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b8efba4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': 'b858824ca3750277155bdbbe6723f1d1ab8fcdd469b6e7e497886aae6ffa8bea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.118670', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b8f04be-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': 'c569aee02b7bb0140f9a5e24838e80db8e4a3cea65b4fc125a5cb117d39d51d5'}]}, 'timestamp': '2025-10-08 16:08:36.119169', '_unique_id': 'd18c91fd69244afb9e6a99b274447b5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.134 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/cpu volume: 41960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ba549e4-d0e5-43bc-abac-7892d12fa50c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41960000000, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'timestamp': '2025-10-08T16:08:36.120714', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '0b917190-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.857110806, 'message_signature': 'fd1735f4ef5a80e778813991c83e89e2fcf4ca397d8799956397adad4453ef33'}]}, 'timestamp': '2025-10-08 16:08:36.135202', '_unique_id': 'd484ed186fea4352ba0275dd332dfd73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.139 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '807b1430-b332-44e7-a403-7660b5911323', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.139280', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b922806-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '920058f164f4994a6103bb19028c2b3595b557d54f6374fc6854edb6348f0080'}]}, 'timestamp': '2025-10-08 16:08:36.139864', '_unique_id': '93f6f61c81a34dc9b9ac03d3b5c0f2ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.142 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.requests volume: 11551 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.143 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b5d6f72-15e3-4ca3-9f2c-bbc6308154f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11551, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.142919', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b92b1cc-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': 'ed88b9c86310cc7cea8393fcae965a950d9f1d8a72f2fe471436175c9d21028d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.142919', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b92bd8e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.783568825, 'message_signature': '297289134c4e2532608268b4e49a9bb79e894d549a7b496019b6f55fcc7a36cb'}]}, 'timestamp': '2025-10-08 16:08:36.143558', '_unique_id': '20ec946f75184929be5988272010a4f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.145 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.bytes volume: 2420 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c605dc1-71df-445f-94c7-7ca5977ed3dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2420, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.145880', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b9324b8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': 'e49dd5e3f0027c3d6edfb17da0957e0d251e9e641c8fa1140503ea372877e612'}]}, 'timestamp': '2025-10-08 16:08:36.146223', '_unique_id': '6d94accf40fb47dcba0cd507111ae556'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.148 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.148 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dvr_vip_failover_external_network-1149265317>]
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.148 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/memory.usage volume: 226.4921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e401188e-15ad-4a74-93be-240edcfa0ac3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 226.4921875, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'timestamp': '2025-10-08T16:08:36.148728', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '0b9394ac-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.857110806, 'message_signature': '0f61f809693138444959b965dae2cdceb284c50276d1f29ad46c82353d823fe3'}]}, 'timestamp': '2025-10-08 16:08:36.149157', '_unique_id': '3376d5f017f24434b64e6dca12255183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.151 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.151 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa2132a3-3d4d-4ceb-b32d-794912408277', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:08:36.151599', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '0b9403b0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': '536e197106f79dc80f6867a1bac8497d8cae405d132dd1021881f4b9147c19ae'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:08:36.151599', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '0b940e96-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.817656964, 'message_signature': '0b41b9c4544dde3535366ffa7283a4501176e0f79bff588a705abecab2ca09ad'}]}, 'timestamp': '2025-10-08 16:08:36.152244', '_unique_id': '941d8fd52de14566af52d53a7ac49cf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.154 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba8113ed-67c1-46a7-92ec-ff3988b32268', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:08:36.154452', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '0b947444-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6619.805973811, 'message_signature': '490259bdebafdc083d0f1d22463e3310058433b954cc0c37bc758296c191d1ce'}]}, 'timestamp': '2025-10-08 16:08:36.154838', '_unique_id': '047a735a3a4e403f87a048bb2a070db6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:08:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:08:37 np0005476733 nova_compute[192580]: 2025-10-08 16:08:37.320 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:37 np0005476733 nova_compute[192580]: 2025-10-08 16:08:37.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:37 np0005476733 nova_compute[192580]: 2025-10-08 16:08:37.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:08:39 np0005476733 nova_compute[192580]: 2025-10-08 16:08:39.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:42 np0005476733 podman[251378]: 2025-10-08 16:08:42.213762501 +0000 UTC m=+0.046504767 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:08:42 np0005476733 nova_compute[192580]: 2025-10-08 16:08:42.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:44 np0005476733 nova_compute[192580]: 2025-10-08 16:08:44.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:47 np0005476733 nova_compute[192580]: 2025-10-08 16:08:47.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:49 np0005476733 podman[251398]: 2025-10-08 16:08:49.232440081 +0000 UTC m=+0.054485122 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:08:49 np0005476733 podman[251397]: 2025-10-08 16:08:49.25586421 +0000 UTC m=+0.086027670 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:08:49 np0005476733 nova_compute[192580]: 2025-10-08 16:08:49.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:52 np0005476733 nova_compute[192580]: 2025-10-08 16:08:52.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:54 np0005476733 nova_compute[192580]: 2025-10-08 16:08:54.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:54 np0005476733 ovn_controller[94857]: 2025-10-08T16:08:54Z|00759|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 12:08:57 np0005476733 podman[251442]: 2025-10-08 16:08:57.244942338 +0000 UTC m=+0.070159503 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 12:08:57 np0005476733 podman[251444]: 2025-10-08 16:08:57.25125228 +0000 UTC m=+0.063546692 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Oct  8 12:08:57 np0005476733 podman[251443]: 2025-10-08 16:08:57.258587454 +0000 UTC m=+0.071011870 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:08:57 np0005476733 nova_compute[192580]: 2025-10-08 16:08:57.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:08:59 np0005476733 nova_compute[192580]: 2025-10-08 16:08:59.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:02 np0005476733 nova_compute[192580]: 2025-10-08 16:09:02.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:03 np0005476733 podman[251503]: 2025-10-08 16:09:03.250380194 +0000 UTC m=+0.066015881 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:09:03 np0005476733 podman[251502]: 2025-10-08 16:09:03.254892957 +0000 UTC m=+0.083915403 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001)
Oct  8 12:09:04 np0005476733 nova_compute[192580]: 2025-10-08 16:09:04.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:07 np0005476733 nova_compute[192580]: 2025-10-08 16:09:07.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:09 np0005476733 nova_compute[192580]: 2025-10-08 16:09:09.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:12 np0005476733 nova_compute[192580]: 2025-10-08 16:09:12.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:13 np0005476733 podman[251548]: 2025-10-08 16:09:13.225140277 +0000 UTC m=+0.056551938 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 12:09:14 np0005476733 nova_compute[192580]: 2025-10-08 16:09:14.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:16 np0005476733 nova_compute[192580]: 2025-10-08 16:09:16.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:17 np0005476733 nova_compute[192580]: 2025-10-08 16:09:17.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:18 np0005476733 nova_compute[192580]: 2025-10-08 16:09:18.103 2 DEBUG nova.compute.manager [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-changed-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:09:18 np0005476733 nova_compute[192580]: 2025-10-08 16:09:18.104 2 DEBUG nova.compute.manager [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Refreshing instance network info cache due to event network-changed-43b9af24-0a3b-4a87-8883-35ff0783ea2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:09:18 np0005476733 nova_compute[192580]: 2025-10-08 16:09:18.105 2 DEBUG oslo_concurrency.lockutils [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:09:18 np0005476733 nova_compute[192580]: 2025-10-08 16:09:18.106 2 DEBUG oslo_concurrency.lockutils [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:09:18 np0005476733 nova_compute[192580]: 2025-10-08 16:09:18.106 2 DEBUG nova.network.neutron [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Refreshing network info cache for port 43b9af24-0a3b-4a87-8883-35ff0783ea2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:09:19 np0005476733 nova_compute[192580]: 2025-10-08 16:09:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:19 np0005476733 nova_compute[192580]: 2025-10-08 16:09:19.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:20 np0005476733 podman[251568]: 2025-10-08 16:09:20.270448366 +0000 UTC m=+0.082489617 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2)
Oct  8 12:09:20 np0005476733 podman[251567]: 2025-10-08 16:09:20.308140421 +0000 UTC m=+0.118750846 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 12:09:21 np0005476733 nova_compute[192580]: 2025-10-08 16:09:21.022 2 DEBUG nova.network.neutron [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updated VIF entry in instance network info cache for port 43b9af24-0a3b-4a87-8883-35ff0783ea2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:09:21 np0005476733 nova_compute[192580]: 2025-10-08 16:09:21.023 2 DEBUG nova.network.neutron [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:09:21 np0005476733 nova_compute[192580]: 2025-10-08 16:09:21.049 2 DEBUG oslo_concurrency.lockutils [req-7a0a8c38-959e-4fb8-802f-5a3908e5f4e2 req-304e70c5-21a5-40c9-a15c-ce5cbcf4b5e4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:09:22 np0005476733 nova_compute[192580]: 2025-10-08 16:09:22.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:23 np0005476733 nova_compute[192580]: 2025-10-08 16:09:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:24 np0005476733 nova_compute[192580]: 2025-10-08 16:09:24.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:24 np0005476733 nova_compute[192580]: 2025-10-08 16:09:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:24 np0005476733 nova_compute[192580]: 2025-10-08 16:09:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:09:24 np0005476733 nova_compute[192580]: 2025-10-08 16:09:24.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:24Z|00760|pinctrl|WARN|Dropped 161 log messages in last 60 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:09:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:24Z|00761|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:09:25 np0005476733 nova_compute[192580]: 2025-10-08 16:09:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:25 np0005476733 nova_compute[192580]: 2025-10-08 16:09:25.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:09:25 np0005476733 nova_compute[192580]: 2025-10-08 16:09:25.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:09:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:25Z|00762|pinctrl|INFO|Claiming virtual lport 47874a34-92ab-40af-8541-e7b04d3f603b for this chassis with the virtual parent 43b9af24-0a3b-4a87-8883-35ff0783ea2c
Oct  8 12:09:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:25Z|00763|binding|INFO|Setting lport 47874a34-92ab-40af-8541-e7b04d3f603b up in Southbound
Oct  8 12:09:26 np0005476733 nova_compute[192580]: 2025-10-08 16:09:26.092 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:09:26 np0005476733 nova_compute[192580]: 2025-10-08 16:09:26.093 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:09:26 np0005476733 nova_compute[192580]: 2025-10-08 16:09:26.093 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:09:26 np0005476733 nova_compute[192580]: 2025-10-08 16:09:26.093 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:26.363 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:26.363 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:26.364 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:27 np0005476733 nova_compute[192580]: 2025-10-08 16:09:27.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:27 np0005476733 nova_compute[192580]: 2025-10-08 16:09:27.509 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:09:27 np0005476733 nova_compute[192580]: 2025-10-08 16:09:27.532 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:09:27 np0005476733 nova_compute[192580]: 2025-10-08 16:09:27.533 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:09:28 np0005476733 podman[251613]: 2025-10-08 16:09:28.257647724 +0000 UTC m=+0.081223717 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:09:28 np0005476733 podman[251612]: 2025-10-08 16:09:28.26940437 +0000 UTC m=+0.091067591 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:09:28 np0005476733 podman[251614]: 2025-10-08 16:09:28.270661521 +0000 UTC m=+0.082199499 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc.)
Oct  8 12:09:28 np0005476733 nova_compute[192580]: 2025-10-08 16:09:28.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:29 np0005476733 nova_compute[192580]: 2025-10-08 16:09:29.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.624 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.625 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.654 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.757 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.758 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.767 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.767 2 INFO nova.compute.claims [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.911 2 DEBUG nova.compute.provider_tree [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.932 2 DEBUG nova.scheduler.client.report [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.961 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:30 np0005476733 nova_compute[192580]: 2025-10-08 16:09:30.962 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.037 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.038 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.064 2 INFO nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.107 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.251 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.253 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.254 2 INFO nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Creating image(s)#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.255 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.256 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.258 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.284 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.382 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.384 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.385 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.413 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.474 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.475 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.522 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk 10737418240" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.523 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.524 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.581 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.582 2 DEBUG nova.objects.instance [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'migration_context' on Instance uuid 46a486b9-8873-425b-913e-b2931570477e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.603 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.604 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Ensure instance console log exists: /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.605 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.605 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:31 np0005476733 nova_compute[192580]: 2025-10-08 16:09:31.605 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:32 np0005476733 nova_compute[192580]: 2025-10-08 16:09:32.120 2 DEBUG nova.policy [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:09:32 np0005476733 nova_compute[192580]: 2025-10-08 16:09:32.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.622 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.795 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.868 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.870 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:33 np0005476733 nova_compute[192580]: 2025-10-08 16:09:33.964 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.154 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.156 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12972MB free_disk=111.1887435913086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.156 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.156 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:34 np0005476733 podman[251702]: 2025-10-08 16:09:34.21736333 +0000 UTC m=+0.048143370 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:09:34 np0005476733 podman[251703]: 2025-10-08 16:09:34.230952424 +0000 UTC m=+0.054687120 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.302 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7187c34b-929f-4f25-a15b-f6294b5087bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.302 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 46a486b9-8873-425b-913e-b2931570477e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.302 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.302 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.367 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.485 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.764 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.765 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:34.987 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:09:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:34.988 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:09:34 np0005476733 nova_compute[192580]: 2025-10-08 16:09:34.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:35 np0005476733 nova_compute[192580]: 2025-10-08 16:09:35.148 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Successfully created port: 2731ffa0-8eae-47d8-a816-32f3afa5b445 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.517 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Successfully updated port: 2731ffa0-8eae-47d8-a816-32f3afa5b445 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.643 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.643 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquired lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.643 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.667 2 DEBUG nova.compute.manager [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-changed-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.667 2 DEBUG nova.compute.manager [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Refreshing instance network info cache due to event network-changed-2731ffa0-8eae-47d8-a816-32f3afa5b445. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:09:36 np0005476733 nova_compute[192580]: 2025-10-08 16:09:36.667 2 DEBUG oslo_concurrency.lockutils [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:09:37 np0005476733 nova_compute[192580]: 2025-10-08 16:09:37.097 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:09:37 np0005476733 nova_compute[192580]: 2025-10-08 16:09:37.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.122 2 DEBUG nova.network.neutron [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Updating instance_info_cache with network_info: [{"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.203 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Releasing lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.204 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Instance network_info: |[{"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.204 2 DEBUG oslo_concurrency.lockutils [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.204 2 DEBUG nova.network.neutron [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Refreshing network info cache for port 2731ffa0-8eae-47d8-a816-32f3afa5b445 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.208 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Start _get_guest_xml network_info=[{"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.211 2 WARNING nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.217 2 DEBUG nova.virt.libvirt.host [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.217 2 DEBUG nova.virt.libvirt.host [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.221 2 DEBUG nova.virt.libvirt.host [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.222 2 DEBUG nova.virt.libvirt.host [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.223 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.223 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.223 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.224 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.224 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.224 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.224 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.224 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.225 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.225 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.225 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.226 2 DEBUG nova.virt.hardware [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.229 2 DEBUG nova.virt.libvirt.vif [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1357248001',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1357248001',id=82,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-b58i6gm8',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_g
lance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:09:31Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=46a486b9-8873-425b-913e-b2931570477e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.230 2 DEBUG nova.network.os_vif_util [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.230 2 DEBUG nova.network.os_vif_util [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.232 2 DEBUG nova.objects.instance [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46a486b9-8873-425b-913e-b2931570477e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.275 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <uuid>46a486b9-8873-425b-913e-b2931570477e</uuid>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <name>instance-00000052</name>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1357248001</nova:name>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:09:38</nova:creationTime>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:user uuid="000a8d1cd17e4a4c8398ef814dd4db2d">tempest-OvnDvrAdvancedTest-1107478320-project-member</nova:user>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:project uuid="71bd615ba6694cba8794c8eb5dadbe81">tempest-OvnDvrAdvancedTest-1107478320</nova:project>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        <nova:port uuid="2731ffa0-8eae-47d8-a816-32f3afa5b445">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.235" ipVersion="4"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="serial">46a486b9-8873-425b-913e-b2931570477e</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="uuid">46a486b9-8873-425b-913e-b2931570477e</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.config"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:aa:8e:58"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <target dev="tap2731ffa0-8e"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/console.log" append="off"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:09:38 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:09:38 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:09:38 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:09:38 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.276 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Preparing to wait for external event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.277 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.277 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.277 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.278 2 DEBUG nova.virt.libvirt.vif [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1357248001',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1357248001',id=82,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-b58i6gm8',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',
image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:09:31Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=46a486b9-8873-425b-913e-b2931570477e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.278 2 DEBUG nova.network.os_vif_util [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.279 2 DEBUG nova.network.os_vif_util [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.280 2 DEBUG os_vif [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.283 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2731ffa0-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2731ffa0-8e, col_values=(('external_ids', {'iface-id': '2731ffa0-8eae-47d8-a816-32f3afa5b445', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:8e:58', 'vm-uuid': '46a486b9-8873-425b-913e-b2931570477e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:38 np0005476733 NetworkManager[51699]: <info>  [1759939778.2869] manager: (tap2731ffa0-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.293 2 INFO os_vif [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e')#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.439 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.440 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.440 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] No VIF found with MAC fa:16:3e:aa:8e:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:09:38 np0005476733 nova_compute[192580]: 2025-10-08 16:09:38.440 2 INFO nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Using config drive#033[00m
Oct  8 12:09:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:38.990 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.181 2 INFO nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Creating config drive at /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.config#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.186 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwc6sj93r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.315 2 DEBUG oslo_concurrency.processutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwc6sj93r" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:09:39 np0005476733 kernel: tap2731ffa0-8e: entered promiscuous mode
Oct  8 12:09:39 np0005476733 NetworkManager[51699]: <info>  [1759939779.3898] manager: (tap2731ffa0-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Oct  8 12:09:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:39Z|00764|binding|INFO|Claiming lport 2731ffa0-8eae-47d8-a816-32f3afa5b445 for this chassis.
Oct  8 12:09:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:39Z|00765|binding|INFO|2731ffa0-8eae-47d8-a816-32f3afa5b445: Claiming fa:16:3e:aa:8e:58 192.168.122.235
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:39Z|00766|binding|INFO|Setting lport 2731ffa0-8eae-47d8-a816-32f3afa5b445 ovn-installed in OVS
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:39 np0005476733 systemd-udevd[251764]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:09:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:09:39Z|00767|binding|INFO|Setting lport 2731ffa0-8eae-47d8-a816-32f3afa5b445 up in Southbound
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.422 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:8e:58 192.168.122.235'], port_security=['fa:16:3e:aa:8e:58 192.168.122.235'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.235/24', 'neutron:device_id': '46a486b9-8873-425b-913e-b2931570477e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=2731ffa0-8eae-47d8-a816-32f3afa5b445) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.424 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 2731ffa0-8eae-47d8-a816-32f3afa5b445 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.426 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:09:39 np0005476733 systemd-machined[152624]: New machine qemu-49-instance-00000052.
Oct  8 12:09:39 np0005476733 NetworkManager[51699]: <info>  [1759939779.4424] device (tap2731ffa0-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:09:39 np0005476733 NetworkManager[51699]: <info>  [1759939779.4437] device (tap2731ffa0-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:09:39 np0005476733 systemd[1]: Started Virtual Machine qemu-49-instance-00000052.
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.444 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[20cafb9b-2b7b-431a-950f-107b4f0d8778]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.477 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b8231b5e-dab8-4eb1-b885-63a27582f7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.479 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5042879c-2216-4930-8e1f-0b070df49d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.508 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[191291b7-1f0c-4cc2-b4bb-349831631f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.530 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2542b1-9d46-4091-aa6a-1440b0f04f14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 29, 'tx_packets': 6, 'rx_bytes': 1714, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 29, 'tx_packets': 6, 'rx_bytes': 1714, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655928, 'reachable_time': 32413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251779, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.548 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6c148a-65e6-445d-bec3-ed88101207a7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655941, 'tstamp': 655941}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251780, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.122.171'], ['IFA_LOCAL', '192.168.122.171'], ['IFA_BROADCAST', '192.168.122.255'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655944, 'tstamp': 655944}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251780, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.550 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.552 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.553 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.553 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:09:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:09:39.553 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.707 2 DEBUG nova.compute.manager [req-7fea9e9f-a773-465f-8c93-25bd93f1a54a req-36768a60-31f3-4484-8ef9-f3d4b7c23b7f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.708 2 DEBUG oslo_concurrency.lockutils [req-7fea9e9f-a773-465f-8c93-25bd93f1a54a req-36768a60-31f3-4484-8ef9-f3d4b7c23b7f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.708 2 DEBUG oslo_concurrency.lockutils [req-7fea9e9f-a773-465f-8c93-25bd93f1a54a req-36768a60-31f3-4484-8ef9-f3d4b7c23b7f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.709 2 DEBUG oslo_concurrency.lockutils [req-7fea9e9f-a773-465f-8c93-25bd93f1a54a req-36768a60-31f3-4484-8ef9-f3d4b7c23b7f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.709 2 DEBUG nova.compute.manager [req-7fea9e9f-a773-465f-8c93-25bd93f1a54a req-36768a60-31f3-4484-8ef9-f3d4b7c23b7f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Processing event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.765 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.836 2 DEBUG nova.network.neutron [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Updated VIF entry in instance network info cache for port 2731ffa0-8eae-47d8-a816-32f3afa5b445. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.837 2 DEBUG nova.network.neutron [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Updating instance_info_cache with network_info: [{"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:09:39 np0005476733 nova_compute[192580]: 2025-10-08 16:09:39.871 2 DEBUG oslo_concurrency.lockutils [req-8ce8fc7d-4016-4100-9410-073831bbe41e req-5645c665-7bc5-44b0-9881-4efc929663d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-46a486b9-8873-425b-913e-b2931570477e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.409 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939780.409004, 46a486b9-8873-425b-913e-b2931570477e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.410 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] VM Started (Lifecycle Event)#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.413 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.416 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.420 2 INFO nova.virt.libvirt.driver [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] Instance spawned successfully.#033[00m
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.420 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.441 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.447 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.450 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.450 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.451 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.451 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.452 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.452 2 DEBUG nova.virt.libvirt.driver [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.493 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.493 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939780.4093068, 46a486b9-8873-425b-913e-b2931570477e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.494 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] VM Paused (Lifecycle Event)
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.532 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.536 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939780.4163294, 46a486b9-8873-425b-913e-b2931570477e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.536 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] VM Resumed (Lifecycle Event)
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.555 2 INFO nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Took 9.30 seconds to spawn the instance on the hypervisor.
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.555 2 DEBUG nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.568 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.571 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.604 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.636 2 INFO nova.compute.manager [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Took 9.92 seconds to build instance.
Oct  8 12:09:40 np0005476733 nova_compute[192580]: 2025-10-08 16:09:40.656 2 DEBUG oslo_concurrency.lockutils [None req-c32bce86-27c7-487a-9765-1ac61128d169 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.800 2 DEBUG nova.compute.manager [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.801 2 DEBUG oslo_concurrency.lockutils [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.802 2 DEBUG oslo_concurrency.lockutils [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.802 2 DEBUG oslo_concurrency.lockutils [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.802 2 DEBUG nova.compute.manager [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] No waiting events found dispatching network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 12:09:41 np0005476733 nova_compute[192580]: 2025-10-08 16:09:41.802 2 WARNING nova.compute.manager [req-50243a97-7bd9-490a-97d5-60cc69587280 req-e8d8028e-6406-4c90-ae5f-5234d9071bde 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received unexpected event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 for instance with vm_state active and task_state None.
Oct  8 12:09:42 np0005476733 nova_compute[192580]: 2025-10-08 16:09:42.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:43 np0005476733 nova_compute[192580]: 2025-10-08 16:09:43.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:44 np0005476733 podman[251788]: 2025-10-08 16:09:44.247635959 +0000 UTC m=+0.064461592 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 12:09:45 np0005476733 systemd-logind[827]: New session 56 of user zuul.
Oct  8 12:09:45 np0005476733 systemd[1]: Started Session 56 of User zuul.
Oct  8 12:09:45 np0005476733 systemd-logind[827]: New session 57 of user zuul.
Oct  8 12:09:45 np0005476733 systemd[1]: Started Session 57 of User zuul.
Oct  8 12:09:45 np0005476733 systemd[1]: session-57.scope: Deactivated successfully.
Oct  8 12:09:45 np0005476733 systemd-logind[827]: Session 57 logged out. Waiting for processes to exit.
Oct  8 12:09:45 np0005476733 systemd-logind[827]: Removed session 57.
Oct  8 12:09:47 np0005476733 nova_compute[192580]: 2025-10-08 16:09:47.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:48 np0005476733 nova_compute[192580]: 2025-10-08 16:09:48.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:51 np0005476733 systemd[1]: Starting dnf makecache...
Oct  8 12:09:51 np0005476733 podman[251868]: 2025-10-08 16:09:51.265843195 +0000 UTC m=+0.085743912 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:09:51 np0005476733 podman[251869]: 2025-10-08 16:09:51.283466648 +0000 UTC m=+0.091188976 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:09:51 np0005476733 dnf[251870]: Repository 'gating-repo' is missing name in configuration, using id.
Oct  8 12:09:51 np0005476733 dnf[251870]: Metadata cache refreshed recently.
Oct  8 12:09:51 np0005476733 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  8 12:09:51 np0005476733 systemd[1]: Finished dnf makecache.
Oct  8 12:09:52 np0005476733 nova_compute[192580]: 2025-10-08 16:09:52.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:53 np0005476733 nova_compute[192580]: 2025-10-08 16:09:53.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:57 np0005476733 nova_compute[192580]: 2025-10-08 16:09:57.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:58 np0005476733 nova_compute[192580]: 2025-10-08 16:09:58.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:09:59 np0005476733 nova_compute[192580]: 2025-10-08 16:09:59.055 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:09:59 np0005476733 podman[251920]: 2025-10-08 16:09:59.251896647 +0000 UTC m=+0.068655675 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 12:09:59 np0005476733 podman[251921]: 2025-10-08 16:09:59.260367777 +0000 UTC m=+0.063534961 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:09:59 np0005476733 podman[251922]: 2025-10-08 16:09:59.272882018 +0000 UTC m=+0.076324861 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, release=1755695350)
Oct  8 12:10:02 np0005476733 nova_compute[192580]: 2025-10-08 16:10:02.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:03 np0005476733 nova_compute[192580]: 2025-10-08 16:10:03.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:05Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:aa:8e:58 192.168.122.235
Oct  8 12:10:05 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:05Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:8e:58 192.168.122.235
Oct  8 12:10:05 np0005476733 podman[251981]: 2025-10-08 16:10:05.236053943 +0000 UTC m=+0.059230544 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:10:05 np0005476733 podman[251980]: 2025-10-08 16:10:05.251803096 +0000 UTC m=+0.071511006 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:10:07 np0005476733 nova_compute[192580]: 2025-10-08 16:10:07.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:08 np0005476733 nova_compute[192580]: 2025-10-08 16:10:08.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:09 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:09Z|00768|memory_trim|INFO|Detected inactivity (last active 30021 ms ago): trimming memory
Oct  8 12:10:12 np0005476733 nova_compute[192580]: 2025-10-08 16:10:12.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:13 np0005476733 nova_compute[192580]: 2025-10-08 16:10:13.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:15 np0005476733 podman[252027]: 2025-10-08 16:10:15.256169871 +0000 UTC m=+0.076754085 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:10:16 np0005476733 nova_compute[192580]: 2025-10-08 16:10:16.615 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:10:17 np0005476733 nova_compute[192580]: 2025-10-08 16:10:17.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:18 np0005476733 nova_compute[192580]: 2025-10-08 16:10:18.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:10:19 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:19Z|00769|binding|INFO|Releasing lport 47874a34-92ab-40af-8541-e7b04d3f603b
Oct  8 12:10:19 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:19Z|00770|if_status|INFO|Dropped 4 log messages in last 2556 seconds (most recently, 2556 seconds ago) due to excessive rate
Oct  8 12:10:19 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:19Z|00771|if_status|INFO|Not setting lport 47874a34-92ab-40af-8541-e7b04d3f603b down as sb is readonly
Oct  8 12:10:19 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:19Z|00772|binding|INFO|Setting lport 47874a34-92ab-40af-8541-e7b04d3f603b down in Southbound
Oct  8 12:10:19 np0005476733 nova_compute[192580]: 2025-10-08 16:10:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:22Z|00773|pinctrl|WARN|Dropped 215 log messages in last 57 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:10:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:22Z|00774|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:10:22 np0005476733 podman[252055]: 2025-10-08 16:10:22.254949116 +0000 UTC m=+0.081716634 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2)
Oct  8 12:10:22 np0005476733 podman[252054]: 2025-10-08 16:10:22.279902633 +0000 UTC m=+0.111128593 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct  8 12:10:22 np0005476733 nova_compute[192580]: 2025-10-08 16:10:22.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:23 np0005476733 nova_compute[192580]: 2025-10-08 16:10:23.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:23 np0005476733 nova_compute[192580]: 2025-10-08 16:10:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:24 np0005476733 nova_compute[192580]: 2025-10-08 16:10:24.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:26.364 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:26.365 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:26.366 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:26 np0005476733 nova_compute[192580]: 2025-10-08 16:10:26.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:26 np0005476733 nova_compute[192580]: 2025-10-08 16:10:26.587 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:10:26 np0005476733 nova_compute[192580]: 2025-10-08 16:10:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:10:26 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:26Z|00775|pinctrl|INFO|Claiming virtual lport 47874a34-92ab-40af-8541-e7b04d3f603b for this chassis with the virtual parent 43b9af24-0a3b-4a87-8883-35ff0783ea2c
Oct  8 12:10:26 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:26Z|00776|binding|INFO|Setting lport 47874a34-92ab-40af-8541-e7b04d3f603b up in Southbound
Oct  8 12:10:27 np0005476733 nova_compute[192580]: 2025-10-08 16:10:27.217 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:10:27 np0005476733 nova_compute[192580]: 2025-10-08 16:10:27.218 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:10:27 np0005476733 nova_compute[192580]: 2025-10-08 16:10:27.218 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:10:27 np0005476733 nova_compute[192580]: 2025-10-08 16:10:27.218 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:10:27 np0005476733 nova_compute[192580]: 2025-10-08 16:10:27.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:28 np0005476733 systemd-logind[827]: New session 58 of user zuul.
Oct  8 12:10:28 np0005476733 systemd[1]: Started Session 58 of User zuul.
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.510 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [{"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.533 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-7187c34b-929f-4f25-a15b-f6294b5087bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.534 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.534 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:28 np0005476733 nova_compute[192580]: 2025-10-08 16:10:28.534 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:10:30 np0005476733 podman[252182]: 2025-10-08 16:10:30.234899797 +0000 UTC m=+0.058431659 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:10:30 np0005476733 podman[252183]: 2025-10-08 16:10:30.249829584 +0000 UTC m=+0.072342953 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Oct  8 12:10:30 np0005476733 podman[252181]: 2025-10-08 16:10:30.250195536 +0000 UTC m=+0.069776371 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:10:30 np0005476733 nova_compute[192580]: 2025-10-08 16:10:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:32 np0005476733 nova_compute[192580]: 2025-10-08 16:10:32.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:33 np0005476733 nova_compute[192580]: 2025-10-08 16:10:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:33 np0005476733 systemd-logind[827]: New session 59 of user zuul.
Oct  8 12:10:33 np0005476733 systemd[1]: Started Session 59 of User zuul.
Oct  8 12:10:33 np0005476733 systemd[1]: session-59.scope: Deactivated successfully.
Oct  8 12:10:33 np0005476733 systemd-logind[827]: Session 59 logged out. Waiting for processes to exit.
Oct  8 12:10:33 np0005476733 systemd-logind[827]: Removed session 59.
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.635 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.635 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.635 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.826 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.947 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:10:34 np0005476733 nova_compute[192580]: 2025-10-08 16:10:34.949 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.031 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.042 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.140 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.142 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.219 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.436 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.438 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12159MB free_disk=111.04621124267578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.438 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.438 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.567 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 7187c34b-929f-4f25-a15b-f6294b5087bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.567 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 46a486b9-8873-425b-913e-b2931570477e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.569 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.569 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.632 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.653 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.880 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 12:10:35 np0005476733 nova_compute[192580]: 2025-10-08 16:10:35.880 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.059 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000050', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71bd615ba6694cba8794c8eb5dadbe81', 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'hostId': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.062 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '46a486b9-8873-425b-913e-b2931570477e', 'name': 'tempest-server-test-1357248001', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000052', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71bd615ba6694cba8794c8eb5dadbe81', 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'hostId': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.062 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.062 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1357248001>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1357248001>]
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.078 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.latency volume: 55263562382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.078 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.095 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.latency volume: 7881722922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.096 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4e929bf-0d60-4b63-ac76-dfb68f633dee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55263562382, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.063170', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '530f657c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '05bebb702077526babc84a90d51c6606e3d7d3134a484bfd51f1e1f7229be346'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.063170', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '530f729c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': 'a81500a8becb3ce6aa4e813e80427e089a729e44ed3cff350e9fd35b1744d480'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7881722922, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.063170', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531213a8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '91aa3f1233909ebcae28a828d4b3393da0affed8ee7cdfbaf229128510a2c580'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.063170', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '53121dbc-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '61dab4a47d08bb5ad7b71b09e81c6f1eb77e7c7383d690519a3a9569fdc4eefa'}]}, 'timestamp': '2025-10-08 16:10:36.096580', '_unique_id': 
'b6e38806afe64c9b85e699b70ed40b85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.097 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.100 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets volume: 202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.102 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 46a486b9-8873-425b-913e-b2931570477e / tap2731ffa0-8e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.102 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.incoming.packets volume: 112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f033357-c35c-4cd5-b28c-e2f117a3c02c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 202, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.098530', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '5312ca3c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '295924678cf7eaae2a00b84e142f2ae32554880ad1b41f06b71aa0e97f038e33'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 112, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.098530', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53131d20-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': 'f760e443393edf78e97103bc6ebba843ce2192efc588153af08330a3bfc94b56'}]}, 'timestamp': '2025-10-08 16:10:36.103122', '_unique_id': '95c57365b99d47809c3e66affb77df34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.103 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.104 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.bytes volume: 31122 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.104 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.incoming.bytes volume: 21888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e756616-6f37-43bb-a51a-a636e4ce662d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31122, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.104622', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '531361b8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': 'bba8d0417bf8330d482fc59b21c7e0b9b604b0d8b40b3890e4f56aa0ff51c53d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 21888, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.104622', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53136aa0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': 'c84a2a66cc1ccd76605bc337f95a16e3a21f8306da201947c286aa4ca313793e'}]}, 'timestamp': '2025-10-08 16:10:36.105095', '_unique_id': '970d6b1e24f34a66b49d8947cf326242'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.106 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.requests volume: 11699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.106 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.106 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.requests volume: 11662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.106 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a2c37b-6d85-405d-a2ce-a3672f4e9dba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11699, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.106314', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5313a394-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '3ce62eaa26125e2cbff04a4f9b29b8a76e9a794f1cd169f474a0536e691788fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.106314', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5313ab8c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '64f009062a8d4dfb08e646664f7d02733e7bbe9a90663d55625553e1b12a49ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11662, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.106314', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5313b32a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '03e1b76d826cbe592688e580aee4f79e6572d4acd8123d57d3a6eb4daa0921f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.106314', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5313bbd6-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '15ba8d7df4e4c22fc00325c883f719e461553e3d4834055d1da11e19400b35f1'}]}, 'timestamp': '2025-10-08 16:10:36.107166', '_unique_id': 
'efb1915bf13b4b24b4cf6e14f1a43a25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.107 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.108 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.requests volume: 814 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.108 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.108 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.requests volume: 712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef148e12-9653-43ee-a8a9-6eecea47832c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 814, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.108505', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5313fa88-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '2c5cae04843cb32673066f321ba95403d8537aa9fcd74e00e683ac01a4b73793'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.108505', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531402b2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '4b84b41e9a6b30cf2e12a3d8c4933b048dafa27ae0bc5c232b80bb62517a9006'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 712, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.108505', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '53140a50-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': 'b757193da0f6cba2843e1cc04b6d7addb728708eefa9543b24ded545a85673ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.108505', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531413ce-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '0660dc4c85aae9e8d8da8e6803db121c70217dc23b38b0f4cf6dbd65caae1a3e'}]}, 'timestamp': '2025-10-08 16:10:36.109404', '_unique_id': '2b5a891a8f9e4a0d855a48c2d62f6669'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1357248001>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1357248001>]
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.111 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.111 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1357248001>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1357248001>]
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.124 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.124 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.137 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.137 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c19a04f6-7cda-49a8-b268-7bff2e39c6ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.111468', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '53165f26-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': '38865694332e9adcdc7bb779d4bf23b95859d422825769956afb7e3a4e4dd694'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.111468', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531668b8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': 'b00fd30ee9a51160e463e5ea9a71b4f9f6f6ffee82225d829063eb68e37f1d77'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.111468', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5318615e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '55e738c9f1bcf4d5be46d63f32a40d820f039abc9ec8585fbe5984bf70307dc4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.111468', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '53186cf8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '779a6604d53f4f84c5995fa8925159410e88d0b703bb3cf8b66ccbf30c890d2f'}]}, 'timestamp': '2025-10-08 16:10:36.137949', '_unique_id': '9d294a8b659a470d805874fd645a8c42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b5db3f9-c87c-40fb-a714-282c0fbd1aec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.140043', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '5318ca4a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '1829eda6f829ae491ac79adf91058cab6c54fd96ac874ecf1886de8a5f9a0a47'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.140043', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '5318d292-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '0abae046404aee268321ccded3177d82ca3aaba8704f10595d623a39b00524a4'}]}, 'timestamp': '2025-10-08 16:10:36.140504', '_unique_id': '2be9d66ba52c42b3ad3717175d7a6acd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.141 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.141 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '101f2e84-f871-4c3d-82d2-37031b56af7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.141628', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531906f4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': '36393f35093a300ece96587d20bcfb2c14eb8ec8ad009260526c225c6a7c4785'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.141628', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '53190ec4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': 'e02d11524e42978efb4ddee697bb15ebe3cb657111ad808c2eb04bfd61ba4609'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.141628', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '53191716-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '1db8eb7f445ca953b0ade87c9da1470316fe19acd7db94e72da6eb674be10f15'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.141628', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '53191f90-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '18e9013b15adf6a932e9d55d7336e9055a0c0d87283267593dc2d0263ec9c61f'}]}, 'timestamp': '2025-10-08 16:10:36.142467', '_unique_id': '15fba4ad8778449c9c41135c3a1c4951'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.143 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.143 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f879f4d9-118a-4916-95c5-998d7c1fefd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.143593', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '531953c0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '02369b39bf2fd61d2d2e1fab552fcb640a5068120eecab51a32b5e9bf3733dd0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.143593', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53195ba4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': 'cdc17e60465a6550dd32dee0ad9960cd385350c4bcf06ce18b5b9fb508c04616'}]}, 'timestamp': '2025-10-08 16:10:36.144017', '_unique_id': '8645a46e57e449e5af111e890e3887f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.145 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.bytes volume: 44866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.145 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.outgoing.bytes volume: 24358 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c21896cb-31ec-403f-9d3b-e214d412da4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 44866, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.145168', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '531992c2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': 'f8ef3e34c584cae1e3ed9125f175d2681175dffbb4eac6da50b3358b0466e026'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 24358, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.145168', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53199b0a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '6739dc550267989f5bf6aa01137f322cfa2bbf9c55107e3b8674d8e6ee426b56'}]}, 'timestamp': '2025-10-08 16:10:36.145636', '_unique_id': '98e3bc80d5264a5bb07546dd16dcce32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.146 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1357248001>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1357248001>]
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.147 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.147 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba86b13a-b2e8-4c72-b99e-6248071ba8b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.147061', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '5319dc6e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': 'bde861180bb718fc80728a13c00f6fba7d3bc98b7e9e288e49751b3f0ed61358'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.147061', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '5319e542-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '58e2d4a3ff45dec2c698480218d9979b88f72583c05069dcfe0ae656a28a38e5'}]}, 'timestamp': '2025-10-08 16:10:36.147563', '_unique_id': 'd3362c9a5b154838b155797882dd10c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.bytes volume: 137117184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.148 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.149 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.bytes volume: 135802880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.149 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18924978-9492-4d19-a063-bb825cb61588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 137117184, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.148771', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531a1df0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': 'ac569196393ae5b381b67571d291cee5601aef319ee9406d71e076448a3675fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.148771', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531a2688-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '73b210132258276bbe91843db3276fe15cfe203e97a6b547c609ed7926f9b545'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135802880, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.148771', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531a3056-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '10218631902d10ddf81dcdb1ab72c056aa71b64cc7c7313dde5fb8d6ce623267'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.148771', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531a3830-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '6abe2cfdad57b7169557c1120064b69f3f899083efa1339accaf1f1e45d1d313'}]}, 'timestamp': '2025-10-08 16:10:36.149650', '_unique_id': 
'0dbc5f3f137e4c008a3e3b210997792c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.150 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.usage volume: 152633344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.151 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.151 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.usage volume: 152829952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.151 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c003a487-ee54-4295-ba9f-28d9a28a3685', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152633344, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.150850', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531a6efe-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': 'e770c468fca81235868e9aa2af7ecc770e96f4ccb1375a830c4aca72d30eeb49'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 
'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.150850', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531a780e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.834435268, 'message_signature': '304f13781b5a8beec2afb34ff56dbd531cd2bd2a0abe3a198426f1597afcbce3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152829952, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.150850', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '531a7fde-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '8d70b40119e42ccdbb067af3e8881c31eca4cbd2290c8533429200c47b084925'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.150850', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '531a86dc-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.8476316, 'message_signature': '72b413e75b686528fdcaf09c17d26685e49d998122800c7b43d6a55d7fcb20f3'}]}, 'timestamp': '2025-10-08 16:10:36.151663', '_unique_id': '6c9ba55aff7f4f8c8d93249bbf132e0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.152 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets volume: 257 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.outgoing.packets volume: 125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03081d32-8223-4923-a7a9-7bd7981db2f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 257, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.152808', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '531abbca-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '5f2b03c0a85e4b53b1446dfb4f6b43b2f66ca15c12ebbd736350ea3d2bc49ed6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 125, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.152808', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '531ac55c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '679133b278a8298688b98d70af3479c431c865f11468c87fb8f697db59698911'}]}, 'timestamp': '2025-10-08 16:10:36.153294', '_unique_id': '462a497d318e45d5b60b7f43a3bc7597'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.154 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.181 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/cpu volume: 44370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.208 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/cpu volume: 39870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8a28a9e-1012-4bf4-9eb5-250b52279efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44370000000, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'timestamp': '2025-10-08T16:10:36.154601', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '531f140e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.903906458, 'message_signature': '09f8b8431cf6ea3ff9f0aa8c52da92e36164bf00c35c2c1f3d92ef04822bcc36'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39870000000, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': 
None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e', 'timestamp': '2025-10-08T16:10:36.154601', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '5323484e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.931547363, 'message_signature': '900a995e5098decd9b0d2c489362cacf3e3ebfda6cbc5c835f6d531a5c8fc9bf'}]}, 'timestamp': '2025-10-08 16:10:36.209145', '_unique_id': '9ec13ab45dca4841bb62fec1bc6a4418'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.210 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.latency volume: 16273315195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.211 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.latency volume: 72777917 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.211 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.latency volume: 8031919609 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.211 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.latency volume: 101620868 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99062e67-cd3d-4165-a464-d6d47741fc99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16273315195, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.210927', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '53239a4c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '5ade0de4bbd04b39e9e05521e42ded9f20f730a2687625c942be4cfbeb61f8e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 72777917, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.210927', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5323a2e4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '0139cdc0eff37e87970586a15014223cb76b4231520cbe503d087831e95ee7bd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8031919609, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.210927', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 
10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5323aa6e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '1cde04b3cf4b78476dc7d8bc12beaa8656341003d0cd98da876b8fed41d1423a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101620868, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.210927', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5323b1c6-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '55105446247621e1f789700b5af1df0485cf3876090613c40da0c6d8264dc1d3'}]}, 'timestamp': '2025-10-08 16:10:36.211747', '_unique_id': 
'87ff3d398bbb430990ee90ef2a510680'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.213 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.bytes volume: 330614272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.213 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.213 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.bytes volume: 329500160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.213 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2306a49-8279-4e1a-923d-9d92f9f872be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330614272, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-vda', 'timestamp': '2025-10-08T16:10:36.212977', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5323ead8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '2017c97bb27e04ae226988d7c76072a6d9ecb035818940fa7ee21af4bb326fc2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': 
'71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc-sda', 'timestamp': '2025-10-08T16:10:36.212977', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5323f2b2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.786129263, 'message_signature': '87f06a7bc0245d6889abbee5c06470b3055f397b31f0d4265008040e2a164647'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329500160, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-vda', 'timestamp': '2025-10-08T16:10:36.212977', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5323fa3c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': '8b13cf2a464e97b372add5b8650352691996f76711e2ac15e30b0472851bc626'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '46a486b9-8873-425b-913e-b2931570477e-sda', 'timestamp': '2025-10-08T16:10:36.212977', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5324019e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.802086544, 'message_signature': 'b0a545f716a8c28e0e0c7ffb40113013dcf72753ab31ef884645d643dd2ae086'}]}, 'timestamp': '2025-10-08 16:10:36.213791', '_unique_id': 
'5e3d0ff0b1a049478d526f06be320d01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.214 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.215 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/memory.usage volume: 251.71875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.215 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/memory.usage volume: 240.30078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76e087be-6b32-423f-9f71-9ac1b8c0b2c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 251.71875, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'timestamp': '2025-10-08T16:10:36.215032', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'instance-00000050', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '53243c18-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.903906458, 'message_signature': '417b4a74c76b86284beb426986b9e03d3da23b64ae925cd0e5673e4cade38824'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 240.30078125, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 
'resource_id': '46a486b9-8873-425b-913e-b2931570477e', 'timestamp': '2025-10-08T16:10:36.215032', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'instance-00000052', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '53244442-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.931547363, 'message_signature': '400d699574850893555a28fe3a0c2d73bd37c6abc3ef2a4a2bc537f67515995a'}]}, 'timestamp': '2025-10-08 16:10:36.215499', '_unique_id': '211723b74ad34eddb1dd9d9c1d05907e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.incoming.bytes.delta volume: 28702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.216 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b287b83e-af3b-4f71-b247-52ff2c22217f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 28702, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.216711', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '53247c3c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '71850f9675a6fc1563e113bc83ed7410145872a53cc7d3aa6f4d9738301d8681'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 
'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.216711', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53248470-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '3bf8fe1eaa2e404e6d4c5ed02c71b441984b12b76a37069f8d54dd0744fc417f'}]}, 'timestamp': '2025-10-08 16:10:36.217170', '_unique_id': 'fb1ee692b07d4bbebf881dab97c8e48d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.218 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.218 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e591b10a-e380-414f-81f1-f342207b779c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.218315', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '5324ba9e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': '407a5dee8db68c9a36d581d6b11a51f1d2c94b6ea6dfdd813466cbc270260711'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.218315', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '5324c296-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '16c42bbf68748d028925dcec50261af038c92eff88818dbd0c56224f1df057fd'}]}, 'timestamp': '2025-10-08 16:10:36.218768', '_unique_id': '453b4f9cb65a4e8699c504d5d9cd109c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.219 12 DEBUG ceilometer.compute.pollsters [-] 7187c34b-929f-4f25-a15b-f6294b5087bc/network.outgoing.bytes.delta volume: 41144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 DEBUG ceilometer.compute.pollsters [-] 46a486b9-8873-425b-913e-b2931570477e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd9f6fe9-dc98-4616-80ec-ca6cb12b7bf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 41144, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000050-7187c34b-929f-4f25-a15b-f6294b5087bc-tap43b9af24-0a', 'timestamp': '2025-10-08T16:10:36.219893', 'resource_metadata': {'display_name': 'tempest-test_dvr_vip_failover_external_network-1149265317', 'name': 'tap43b9af24-0a', 'instance_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:43:cd:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap43b9af24-0a'}, 'message_id': '5324f860-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.821510245, 'message_signature': 'fa331bde98df2732923180eff916925348112ec770bc0b357125483c1407818b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 
'B', 'counter_volume': 0, 'user_id': '000a8d1cd17e4a4c8398ef814dd4db2d', 'user_name': None, 'project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'project_name': None, 'resource_id': 'instance-00000052-46a486b9-8873-425b-913e-b2931570477e-tap2731ffa0-8e', 'timestamp': '2025-10-08T16:10:36.219893', 'resource_metadata': {'display_name': 'tempest-server-test-1357248001', 'name': 'tap2731ffa0-8e', 'instance_id': '46a486b9-8873-425b-913e-b2931570477e', 'instance_type': 'custom_neutron_guest', 'host': 'd7f529dc3d48da14c602311aeb5fc63b811345c9d245390f890318cc', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:aa:8e:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2731ffa0-8e'}, 'message_id': '53250134-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6739.823962703, 'message_signature': '15a5db7ad73ac62926fe49e351df78a675996523820e1450fc80d5dec3fcd3d3'}]}, 'timestamp': '2025-10-08 16:10:36.220339', '_unique_id': '9a4144098edf41b18856f5a0c7e87fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:10:36 np0005476733 podman[252290]: 2025-10-08 16:10:36.246377929 +0000 UTC m=+0.066386013 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:10:36 np0005476733 podman[252289]: 2025-10-08 16:10:36.250990376 +0000 UTC m=+0.075526365 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.822 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.823 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.823 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.823 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.823 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.825 2 INFO nova.compute.manager [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Terminating instance#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.826 2 DEBUG nova.compute.manager [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:10:36 np0005476733 kernel: tap2731ffa0-8e (unregistering): left promiscuous mode
Oct  8 12:10:36 np0005476733 NetworkManager[51699]: <info>  [1759939836.9088] device (tap2731ffa0-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:36 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:36Z|00777|binding|INFO|Releasing lport 2731ffa0-8eae-47d8-a816-32f3afa5b445 from this chassis (sb_readonly=0)
Oct  8 12:10:36 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:36Z|00778|binding|INFO|Setting lport 2731ffa0-8eae-47d8-a816-32f3afa5b445 down in Southbound
Oct  8 12:10:36 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:36Z|00779|binding|INFO|Removing iface tap2731ffa0-8e ovn-installed in OVS
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:36 np0005476733 nova_compute[192580]: 2025-10-08 16:10:36.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:36.964 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:8e:58 192.168.122.235'], port_security=['fa:16:3e:aa:8e:58 192.168.122.235'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.235/24', 'neutron:device_id': '46a486b9-8873-425b-913e-b2931570477e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=2731ffa0-8eae-47d8-a816-32f3afa5b445) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:10:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:36.966 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 2731ffa0-8eae-47d8-a816-32f3afa5b445 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:10:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:36.967 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:10:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:36.983 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85f5c9cd-b639-4ba0-a374-ff6d8e672c0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:36 np0005476733 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000052.scope: Deactivated successfully.
Oct  8 12:10:37 np0005476733 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000052.scope: Consumed 43.768s CPU time.
Oct  8 12:10:37 np0005476733 systemd-machined[152624]: Machine qemu-49-instance-00000052 terminated.
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.017 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4f288a33-08f0-4f0d-a28d-ec06721bccca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.021 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7256fa33-33b9-4139-ac07-196d6747d109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.051 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[49ef07f9-274b-48c2-877f-16fc6d96e4b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.072 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c890b5ef-3b96-4d37-924d-43da481d381f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 49, 'tx_packets': 8, 'rx_bytes': 2582, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 49, 'tx_packets': 8, 'rx_bytes': 2582, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655928, 'reachable_time': 32413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 9, 'inoctets': 776, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 9, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 776, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 9, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252348, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.090 2 INFO nova.virt.libvirt.driver [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] Instance destroyed successfully.#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.091 2 DEBUG nova.objects.instance [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'resources' on Instance uuid 46a486b9-8873-425b-913e-b2931570477e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.091 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3a6254-59b5-4e0e-958b-06ac979fbbee]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655941, 'tstamp': 655941}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252358, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.122.171'], ['IFA_LOCAL', '192.168.122.171'], ['IFA_BROADCAST', '192.168.122.255'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655944, 'tstamp': 655944}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252358, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.092 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.132 2 DEBUG nova.virt.libvirt.vif [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:09:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1357248001',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1357248001',id=82,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:09:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-b58i6gm8',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_h
w_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:09:40Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=46a486b9-8873-425b-913e-b2931570477e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.132 2 DEBUG nova.network.os_vif_util [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "address": "fa:16:3e:aa:8e:58", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2731ffa0-8e", "ovs_interfaceid": "2731ffa0-8eae-47d8-a816-32f3afa5b445", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.133 2 DEBUG nova.network.os_vif_util [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.133 2 DEBUG os_vif [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.135 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2731ffa0-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.138 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.138 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.139 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.139 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.140 2 INFO os_vif [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:8e:58,bridge_name='br-int',has_traffic_filtering=True,id=2731ffa0-8eae-47d8-a816-32f3afa5b445,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2731ffa0-8e')#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.141 2 INFO nova.virt.libvirt.driver [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Deleting instance files /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e_del#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.141 2 INFO nova.virt.libvirt.driver [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Deletion of /var/lib/nova/instances/46a486b9-8873-425b-913e-b2931570477e_del complete#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.231 2 INFO nova.compute.manager [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.231 2 DEBUG oslo.service.loopingcall [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.231 2 DEBUG nova.compute.manager [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.231 2 DEBUG nova.network.neutron [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.658 2 DEBUG nova.compute.manager [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-unplugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.659 2 DEBUG oslo_concurrency.lockutils [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.659 2 DEBUG oslo_concurrency.lockutils [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.659 2 DEBUG oslo_concurrency.lockutils [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.659 2 DEBUG nova.compute.manager [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] No waiting events found dispatching network-vif-unplugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.659 2 DEBUG nova.compute.manager [req-84d2ba9c-3223-4996-b284-7d89dd2fc33f req-6d638be9-65e4-4d0c-b523-42e41c8438cb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-unplugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.828 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:10:37 np0005476733 nova_compute[192580]: 2025-10-08 16:10:37.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:37.831 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.221 2 DEBUG nova.network.neutron [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.254 2 INFO nova.compute.manager [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] Took 1.02 seconds to deallocate network for instance.#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.298 2 DEBUG nova.compute.manager [req-b1241c28-a32b-4995-860e-a49aa658e2b3 req-99d073ae-774d-47aa-ad69-777686d3e1fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-deleted-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.331 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.332 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.411 2 DEBUG nova.compute.provider_tree [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.488 2 DEBUG nova.scheduler.client.report [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.561 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.629 2 INFO nova.scheduler.client.report [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Deleted allocations for instance 46a486b9-8873-425b-913e-b2931570477e#033[00m
Oct  8 12:10:38 np0005476733 nova_compute[192580]: 2025-10-08 16:10:38.810 2 DEBUG oslo_concurrency.lockutils [None req-8d5fac48-faa1-4308-bed3-6e36bdc224a2 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.817 2 DEBUG nova.compute.manager [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.817 2 DEBUG oslo_concurrency.lockutils [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "46a486b9-8873-425b-913e-b2931570477e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.818 2 DEBUG oslo_concurrency.lockutils [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.818 2 DEBUG oslo_concurrency.lockutils [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "46a486b9-8873-425b-913e-b2931570477e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.818 2 DEBUG nova.compute.manager [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] No waiting events found dispatching network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:10:39 np0005476733 nova_compute[192580]: 2025-10-08 16:10:39.819 2 WARNING nova.compute.manager [req-0ed92343-30c2-4cfa-b766-d833b112cd69 req-996b6df4-7cdc-468a-83fc-6b0b3699a8be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 46a486b9-8873-425b-913e-b2931570477e] Received unexpected event network-vif-plugged-2731ffa0-8eae-47d8-a816-32f3afa5b445 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:10:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:39.833 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:40 np0005476733 nova_compute[192580]: 2025-10-08 16:10:40.873 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:40 np0005476733 nova_compute[192580]: 2025-10-08 16:10:40.916 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.010 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.012 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.012 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.013 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.013 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.016 2 INFO nova.compute.manager [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Terminating instance#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.018 2 DEBUG nova.compute.manager [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:10:41 np0005476733 kernel: tap43b9af24-0a (unregistering): left promiscuous mode
Oct  8 12:10:41 np0005476733 NetworkManager[51699]: <info>  [1759939841.0567] device (tap43b9af24-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:10:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:41Z|00780|binding|INFO|Releasing lport 43b9af24-0a3b-4a87-8883-35ff0783ea2c from this chassis (sb_readonly=0)
Oct  8 12:10:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:41Z|00781|binding|INFO|Setting lport 43b9af24-0a3b-4a87-8883-35ff0783ea2c down in Southbound
Oct  8 12:10:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:41Z|00782|binding|INFO|Releasing lport 47874a34-92ab-40af-8541-e7b04d3f603b from this chassis (sb_readonly=0)
Oct  8 12:10:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:41Z|00783|binding|INFO|Setting lport 47874a34-92ab-40af-8541-e7b04d3f603b down in Southbound
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:10:41Z|00784|binding|INFO|Removing iface tap43b9af24-0a ovn-installed in OVS
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.085 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:cd:2e 192.168.122.250'], port_security=['fa:16:3e:43:cd:2e 192.168.122.250 192.168.122.179'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.250/24', 'neutron:device_id': '7187c34b-929f-4f25-a15b-f6294b5087bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71bd615ba6694cba8794c8eb5dadbe81', 'neutron:revision_number': '5', 'neutron:security_group_ids': '24bffacf-e176-4693-befb-0f2fe8062d38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=43b9af24-0a3b-4a87-8883-35ff0783ea2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.089 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 43b9af24-0a3b-4a87-8883-35ff0783ea2c in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.091 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.093 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fad2921a-4da9-45a6-928d-ea0969223124]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.094 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 12:10:41 np0005476733 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000050.scope: Deactivated successfully.
Oct  8 12:10:41 np0005476733 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000050.scope: Consumed 48.460s CPU time.
Oct  8 12:10:41 np0005476733 systemd-machined[152624]: Machine qemu-48-instance-00000050 terminated.
Oct  8 12:10:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [NOTICE]   (250985) : haproxy version is 2.8.14-c23fe91
Oct  8 12:10:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [NOTICE]   (250985) : path to executable is /usr/sbin/haproxy
Oct  8 12:10:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [WARNING]  (250985) : Exiting Master process...
Oct  8 12:10:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [ALERT]    (250985) : Current worker (250987) exited with code 143 (Terminated)
Oct  8 12:10:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[250981]: [WARNING]  (250985) : All workers exited. Exiting... (0)
Oct  8 12:10:41 np0005476733 systemd[1]: libpod-c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805.scope: Deactivated successfully.
Oct  8 12:10:41 np0005476733 podman[252388]: 2025-10-08 16:10:41.276350653 +0000 UTC m=+0.060947940 container died c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.290 2 INFO nova.virt.libvirt.driver [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Instance destroyed successfully.#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.291 2 DEBUG nova.objects.instance [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lazy-loading 'resources' on Instance uuid 7187c34b-929f-4f25-a15b-f6294b5087bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:10:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805-userdata-shm.mount: Deactivated successfully.
Oct  8 12:10:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay-0eea9b1a3d86fd20e7c2d1d3c118ff2807a8746210e9c8189bafdd4c12e5f57e-merged.mount: Deactivated successfully.
Oct  8 12:10:41 np0005476733 podman[252388]: 2025-10-08 16:10:41.314537074 +0000 UTC m=+0.099134361 container cleanup c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:10:41 np0005476733 systemd[1]: libpod-conmon-c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805.scope: Deactivated successfully.
Oct  8 12:10:41 np0005476733 podman[252428]: 2025-10-08 16:10:41.37450266 +0000 UTC m=+0.039062919 container remove c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.380 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c70eb86-7c98-4355-a280-150f3d777a33]: (4, ('Wed Oct  8 04:10:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805)\nc92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805\nWed Oct  8 04:10:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (c92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805)\nc92cf677d1d11a6cba2d5c459c82c3cfa884f531ed0472419519c035eaf50805\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.381 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6a696e70-74b3-476b-aeeb-b7955d329ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.382 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.394 2 DEBUG nova.virt.libvirt.vif [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:07:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-test_dvr_vip_failover_external_network-1149265317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dvr-vip-failover-external-network-1149265317',id=80,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdA0tsd4prmAtaaiPDbr0srwjwa73lUEwXPQ7487oxI1AHjPlOjgV1xIPQKf206OdbyLFsmy0ZYOIXSGyym/svb66IKtgGKZCbQTa1vbaeLuI4LvJfdLsM/uLuzgQoA5Q==',key_name='tempest-keypair-test-460735074',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:07:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71bd615ba6694cba8794c8eb5dadbe81',ramdisk_id='',reservation_id='r-wspb073o',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-OvnDvrAdvancedTest-1107478320',owner_user_name='tempest-OvnDvrAdvancedTest-1107478320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:07:36Z,user_data=None,user_id='000a8d1cd17e4a4c8398ef814dd4db2d',uuid=7187c34b-929f-4f25-a15b-f6294b5087bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": 
"192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.394 2 DEBUG nova.network.os_vif_util [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converting VIF {"id": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "address": "fa:16:3e:43:cd:2e", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.250", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43b9af24-0a", "ovs_interfaceid": "43b9af24-0a3b-4a87-8883-35ff0783ea2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.395 2 DEBUG nova.network.os_vif_util [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.395 2 DEBUG os_vif [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.398 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43b9af24-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.404 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ed68c67d-adaa-4d3d-9572-daf8c3477d0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.403 2 INFO os_vif [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:cd:2e,bridge_name='br-int',has_traffic_filtering=True,id=43b9af24-0a3b-4a87-8883-35ff0783ea2c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43b9af24-0a')#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.405 2 INFO nova.virt.libvirt.driver [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Deleting instance files /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc_del#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.406 2 INFO nova.virt.libvirt.driver [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Deletion of /var/lib/nova/instances/7187c34b-929f-4f25-a15b-f6294b5087bc_del complete#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.440 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e41623b6-f1fd-45e5-973d-8ac9afaab82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.442 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[617d7c83-103c-4830-b23a-1e63c5aaaef4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.457 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[44765782-bf3f-496e-863c-13522891c3a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655921, 'reachable_time': 23732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252444, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.460 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:10:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:10:41.460 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[79d1524e-8391-4ebc-9b94-f6baddf043db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:10:41 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.473 2 INFO nova.compute.manager [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.474 2 DEBUG oslo.service.loopingcall [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.474 2 DEBUG nova.compute.manager [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:10:41 np0005476733 nova_compute[192580]: 2025-10-08 16:10:41.474 2 DEBUG nova.network.neutron [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.106 2 DEBUG nova.network.neutron [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.130 2 INFO nova.compute.manager [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Took 0.66 seconds to deallocate network for instance.#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.181 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.182 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.246 2 DEBUG nova.compute.provider_tree [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.266 2 DEBUG nova.scheduler.client.report [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.300 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.331 2 INFO nova.scheduler.client.report [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Deleted allocations for instance 7187c34b-929f-4f25-a15b-f6294b5087bc#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.409 2 DEBUG oslo_concurrency.lockutils [None req-1cb7101d-915c-48ce-b157-06d04d6a69cc 000a8d1cd17e4a4c8398ef814dd4db2d 71bd615ba6694cba8794c8eb5dadbe81 - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.579 2 DEBUG nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-vif-unplugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.580 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.580 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.581 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.581 2 DEBUG nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] No waiting events found dispatching network-vif-unplugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.582 2 WARNING nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received unexpected event network-vif-unplugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.582 2 DEBUG nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.583 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.583 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.584 2 DEBUG oslo_concurrency.lockutils [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "7187c34b-929f-4f25-a15b-f6294b5087bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.584 2 DEBUG nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] No waiting events found dispatching network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.585 2 WARNING nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received unexpected event network-vif-plugged-43b9af24-0a3b-4a87-8883-35ff0783ea2c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.585 2 DEBUG nova.compute.manager [req-03066384-4c18-499f-8c5a-db330ab42fe9 req-61ec4d72-342b-415a-bc7d-96ec9bab8441 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Received event network-vif-deleted-43b9af24-0a3b-4a87-8883-35ff0783ea2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:42 np0005476733 nova_compute[192580]: 2025-10-08 16:10:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:10:46 np0005476733 podman[252448]: 2025-10-08 16:10:46.286112471 +0000 UTC m=+0.109135840 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Oct  8 12:10:46 np0005476733 nova_compute[192580]: 2025-10-08 16:10:46.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:47 np0005476733 nova_compute[192580]: 2025-10-08 16:10:47.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:51 np0005476733 nova_compute[192580]: 2025-10-08 16:10:51.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:52 np0005476733 nova_compute[192580]: 2025-10-08 16:10:52.090 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759939837.0882742, 46a486b9-8873-425b-913e-b2931570477e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:10:52 np0005476733 nova_compute[192580]: 2025-10-08 16:10:52.091 2 INFO nova.compute.manager [-] [instance: 46a486b9-8873-425b-913e-b2931570477e] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:10:52 np0005476733 nova_compute[192580]: 2025-10-08 16:10:52.323 2 DEBUG nova.compute.manager [None req-bebfaeda-368f-48c1-96f1-85dc44baf18c - - - - - -] [instance: 46a486b9-8873-425b-913e-b2931570477e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:10:52 np0005476733 nova_compute[192580]: 2025-10-08 16:10:52.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:53 np0005476733 podman[252471]: 2025-10-08 16:10:53.249494694 +0000 UTC m=+0.069547425 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 12:10:53 np0005476733 podman[252470]: 2025-10-08 16:10:53.292253771 +0000 UTC m=+0.112359053 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:10:55 np0005476733 nova_compute[192580]: 2025-10-08 16:10:55.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:10:56 np0005476733 nova_compute[192580]: 2025-10-08 16:10:56.289 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759939841.2875574, 7187c34b-929f-4f25-a15b-f6294b5087bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:10:56 np0005476733 nova_compute[192580]: 2025-10-08 16:10:56.290 2 INFO nova.compute.manager [-] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:10:56 np0005476733 nova_compute[192580]: 2025-10-08 16:10:56.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:56 np0005476733 nova_compute[192580]: 2025-10-08 16:10:56.578 2 DEBUG nova.compute.manager [None req-a128a452-f000-4c95-9677-bc2bd8d96d0c - - - - - -] [instance: 7187c34b-929f-4f25-a15b-f6294b5087bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:10:56 np0005476733 podman[206798]: time="2025-10-08T16:10:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 12:10:56 np0005476733 podman[206798]: @ - - [08/Oct/2025:16:10:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25709 "" "Go-http-client/1.1"
Oct  8 12:10:57 np0005476733 nova_compute[192580]: 2025-10-08 16:10:57.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:57 np0005476733 nova_compute[192580]: 2025-10-08 16:10:57.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:10:57 np0005476733 nova_compute[192580]: 2025-10-08 16:10:57.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:01 np0005476733 podman[252518]: 2025-10-08 16:11:01.252134342 +0000 UTC m=+0.058768270 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:11:01 np0005476733 podman[252519]: 2025-10-08 16:11:01.257802073 +0000 UTC m=+0.062007454 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:11:01 np0005476733 podman[252517]: 2025-10-08 16:11:01.262242514 +0000 UTC m=+0.076574558 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:11:01 np0005476733 nova_compute[192580]: 2025-10-08 16:11:01.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:02 np0005476733 nova_compute[192580]: 2025-10-08 16:11:02.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:06 np0005476733 nova_compute[192580]: 2025-10-08 16:11:06.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:07 np0005476733 podman[252581]: 2025-10-08 16:11:07.250123633 +0000 UTC m=+0.070226606 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:11:07 np0005476733 podman[252580]: 2025-10-08 16:11:07.260645899 +0000 UTC m=+0.074642387 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 12:11:07 np0005476733 nova_compute[192580]: 2025-10-08 16:11:07.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:11 np0005476733 nova_compute[192580]: 2025-10-08 16:11:11.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:12 np0005476733 nova_compute[192580]: 2025-10-08 16:11:12.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:16 np0005476733 nova_compute[192580]: 2025-10-08 16:11:16.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:17 np0005476733 podman[252629]: 2025-10-08 16:11:17.228955907 +0000 UTC m=+0.059726670 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:11:17 np0005476733 nova_compute[192580]: 2025-10-08 16:11:17.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:17 np0005476733 nova_compute[192580]: 2025-10-08 16:11:17.718 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:18.404 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:11:18 np0005476733 nova_compute[192580]: 2025-10-08 16:11:18.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:18.405 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:11:19 np0005476733 nova_compute[192580]: 2025-10-08 16:11:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:19 np0005476733 nova_compute[192580]: 2025-10-08 16:11:19.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:11:19 np0005476733 nova_compute[192580]: 2025-10-08 16:11:19.658 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:11:20 np0005476733 nova_compute[192580]: 2025-10-08 16:11:20.658 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:21 np0005476733 nova_compute[192580]: 2025-10-08 16:11:21.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:22 np0005476733 nova_compute[192580]: 2025-10-08 16:11:22.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:23 np0005476733 systemd-logind[827]: New session 60 of user zuul.
Oct  8 12:11:23 np0005476733 systemd[1]: Started Session 60 of User zuul.
Oct  8 12:11:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:11:23Z|00785|pinctrl|WARN|Dropped 441 log messages in last 62 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:11:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:11:23Z|00786|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:11:23 np0005476733 podman[252656]: 2025-10-08 16:11:23.48821993 +0000 UTC m=+0.060280438 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 12:11:23 np0005476733 podman[252654]: 2025-10-08 16:11:23.499837801 +0000 UTC m=+0.075875757 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:11:23 np0005476733 systemd[1]: session-60.scope: Deactivated successfully.
Oct  8 12:11:23 np0005476733 systemd-logind[827]: Session 60 logged out. Waiting for processes to exit.
Oct  8 12:11:23 np0005476733 systemd-logind[827]: Removed session 60.
Oct  8 12:11:25 np0005476733 nova_compute[192580]: 2025-10-08 16:11:25.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:25 np0005476733 nova_compute[192580]: 2025-10-08 16:11:25.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:26.365 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:26.366 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:26.366 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:11:26.408 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:11:26 np0005476733 nova_compute[192580]: 2025-10-08 16:11:26.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.678 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.678 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:27 np0005476733 nova_compute[192580]: 2025-10-08 16:11:27.679 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:11:30 np0005476733 ovn_controller[94857]: 2025-10-08T16:11:30Z|00787|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 12:11:30 np0005476733 nova_compute[192580]: 2025-10-08 16:11:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:31 np0005476733 nova_compute[192580]: 2025-10-08 16:11:31.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:32 np0005476733 podman[252732]: 2025-10-08 16:11:32.228945133 +0000 UTC m=+0.057387435 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:11:32 np0005476733 podman[252733]: 2025-10-08 16:11:32.233617734 +0000 UTC m=+0.058605025 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct  8 12:11:32 np0005476733 podman[252731]: 2025-10-08 16:11:32.253583861 +0000 UTC m=+0.083828661 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:11:32 np0005476733 nova_compute[192580]: 2025-10-08 16:11:32.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.662 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.662 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.662 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.663 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.816 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.817 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13769MB free_disk=111.3318977355957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.818 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.818 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.920 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.939 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.940 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.966 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:11:36 np0005476733 nova_compute[192580]: 2025-10-08 16:11:36.995 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:11:37 np0005476733 nova_compute[192580]: 2025-10-08 16:11:37.027 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:11:37 np0005476733 nova_compute[192580]: 2025-10-08 16:11:37.043 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:11:37 np0005476733 nova_compute[192580]: 2025-10-08 16:11:37.071 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:11:37 np0005476733 nova_compute[192580]: 2025-10-08 16:11:37.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:11:37 np0005476733 nova_compute[192580]: 2025-10-08 16:11:37.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:38 np0005476733 podman[252795]: 2025-10-08 16:11:38.221877503 +0000 UTC m=+0.050197346 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:11:38 np0005476733 podman[252794]: 2025-10-08 16:11:38.226290874 +0000 UTC m=+0.056394574 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:11:41 np0005476733 nova_compute[192580]: 2025-10-08 16:11:41.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:42 np0005476733 nova_compute[192580]: 2025-10-08 16:11:42.074 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:42 np0005476733 nova_compute[192580]: 2025-10-08 16:11:42.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:43 np0005476733 nova_compute[192580]: 2025-10-08 16:11:43.023 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:11:46 np0005476733 nova_compute[192580]: 2025-10-08 16:11:46.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:47 np0005476733 nova_compute[192580]: 2025-10-08 16:11:47.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:48 np0005476733 podman[252837]: 2025-10-08 16:11:48.223569958 +0000 UTC m=+0.055970200 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:11:51 np0005476733 nova_compute[192580]: 2025-10-08 16:11:51.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:52 np0005476733 nova_compute[192580]: 2025-10-08 16:11:52.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:54 np0005476733 podman[252859]: 2025-10-08 16:11:54.231553889 +0000 UTC m=+0.056404505 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:11:54 np0005476733 podman[252858]: 2025-10-08 16:11:54.251886799 +0000 UTC m=+0.080965029 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:11:56 np0005476733 nova_compute[192580]: 2025-10-08 16:11:56.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:11:57 np0005476733 nova_compute[192580]: 2025-10-08 16:11:57.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:01 np0005476733 nova_compute[192580]: 2025-10-08 16:12:01.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:02 np0005476733 nova_compute[192580]: 2025-10-08 16:12:02.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:03 np0005476733 podman[252906]: 2025-10-08 16:12:03.233148622 +0000 UTC m=+0.058171842 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public)
Oct  8 12:12:03 np0005476733 podman[252904]: 2025-10-08 16:12:03.233490532 +0000 UTC m=+0.062763537 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:12:03 np0005476733 podman[252905]: 2025-10-08 16:12:03.233886624 +0000 UTC m=+0.060552807 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:12:06 np0005476733 nova_compute[192580]: 2025-10-08 16:12:06.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:07 np0005476733 nova_compute[192580]: 2025-10-08 16:12:07.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:09 np0005476733 podman[252965]: 2025-10-08 16:12:09.216202434 +0000 UTC m=+0.050487403 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:12:09 np0005476733 podman[252966]: 2025-10-08 16:12:09.247210005 +0000 UTC m=+0.065904676 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:12:11 np0005476733 nova_compute[192580]: 2025-10-08 16:12:11.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:12 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:12Z|00788|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  8 12:12:12 np0005476733 nova_compute[192580]: 2025-10-08 16:12:12.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:15 np0005476733 nova_compute[192580]: 2025-10-08 16:12:15.597 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:15 np0005476733 nova_compute[192580]: 2025-10-08 16:12:15.597 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:15 np0005476733 nova_compute[192580]: 2025-10-08 16:12:15.977 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:12:16 np0005476733 nova_compute[192580]: 2025-10-08 16:12:16.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.053 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.054 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.066 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.066 2 INFO nova.compute.claims [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.736 2 DEBUG nova.compute.provider_tree [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:17 np0005476733 nova_compute[192580]: 2025-10-08 16:12:17.848 2 DEBUG nova.scheduler.client.report [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.340 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.340 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.604 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.604 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.614 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.756 2 INFO nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:12:18 np0005476733 nova_compute[192580]: 2025-10-08 16:12:18.892 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:12:19 np0005476733 podman[253008]: 2025-10-08 16:12:19.251201043 +0000 UTC m=+0.071981992 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.416 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.418 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.419 2 INFO nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Creating image(s)#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.420 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.420 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.421 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.449 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.533 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.534 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.535 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.546 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.603 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.604 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.643 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.645 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.645 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.706 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.707 2 DEBUG nova.virt.disk.api [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Checking if we can resize image /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.707 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.761 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.762 2 DEBUG nova.virt.disk.api [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Cannot resize image /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.763 2 DEBUG nova.objects.instance [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.816 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.817 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Ensure instance console log exists: /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.818 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.818 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:19 np0005476733 nova_compute[192580]: 2025-10-08 16:12:19.819 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:20 np0005476733 nova_compute[192580]: 2025-10-08 16:12:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:20Z|00789|pinctrl|WARN|Dropped 241 log messages in last 57 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:12:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:20Z|00790|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:12:21 np0005476733 nova_compute[192580]: 2025-10-08 16:12:21.269 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Successfully created port: 72980dd9-6898-4995-b757-fdaf79579051 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:12:21 np0005476733 nova_compute[192580]: 2025-10-08 16:12:21.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:22 np0005476733 nova_compute[192580]: 2025-10-08 16:12:22.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.405 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Successfully updated port: 72980dd9-6898-4995-b757-fdaf79579051 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.668 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.669 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.669 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.676 2 DEBUG nova.compute.manager [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-changed-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.677 2 DEBUG nova.compute.manager [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing instance network info cache due to event network-changed-72980dd9-6898-4995-b757-fdaf79579051. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:12:23 np0005476733 nova_compute[192580]: 2025-10-08 16:12:23.677 2 DEBUG oslo_concurrency.lockutils [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:12:25 np0005476733 podman[253045]: 2025-10-08 16:12:25.259149882 +0000 UTC m=+0.074450021 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 12:12:25 np0005476733 podman[253044]: 2025-10-08 16:12:25.273676926 +0000 UTC m=+0.099071618 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:12:25 np0005476733 nova_compute[192580]: 2025-10-08 16:12:25.584 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:26.368 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:26.368 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:26.368 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:26 np0005476733 nova_compute[192580]: 2025-10-08 16:12:26.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.500 2 DEBUG nova.network.neutron [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.524 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.524 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance network_info: |[{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.524 2 DEBUG oslo_concurrency.lockutils [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.525 2 DEBUG nova.network.neutron [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing network info cache for port 72980dd9-6898-4995-b757-fdaf79579051 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.527 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Start _get_guest_xml network_info=[{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.532 2 WARNING nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.543 2 DEBUG nova.virt.libvirt.host [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.543 2 DEBUG nova.virt.libvirt.host [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.548 2 DEBUG nova.virt.libvirt.host [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.549 2 DEBUG nova.virt.libvirt.host [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.549 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.549 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.550 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.550 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.550 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.550 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.550 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.551 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.551 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.551 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.551 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.552 2 DEBUG nova.virt.hardware [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.555 2 DEBUG nova.virt.libvirt.vif [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-724242032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-724242032',id=84,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgMtZOVnVeH3bWhrZPBKfXd+ywrgZUihuI2z5HN91rm6b66qXVN5XsBNFSC/a3XnUnD3sHUA86mE5v09Xc1EUgkfz3mw8V02tl2sDq2tzT1z7aRUqvhGDG3xh8qSR2ByQ==',key_name='tempest-keypair-862595394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-ly64ptws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:12:19Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.555 2 DEBUG nova.network.os_vif_util [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.556 2 DEBUG nova.network.os_vif_util [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.557 2 DEBUG nova.objects.instance [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'pci_devices' on Instance uuid ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.581 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <uuid>ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579</uuid>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <name>instance-00000054</name>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-724242032</nova:name>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:12:27</nova:creationTime>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:user uuid="81b62a8f3edf4f78aeb0b087fd79ebb7">tempest-OvnDvrTest-313060968-project-admin</nova:user>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:project uuid="9d7b1c6f132443b0abac8495ed44621d">tempest-OvnDvrTest-313060968</nova:project>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        <nova:port uuid="72980dd9-6898-4995-b757-fdaf79579051">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="serial">ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="uuid">ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.config"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:1f:3c:a8"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <target dev="tap72980dd9-68"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/console.log" append="off"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:12:27 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:12:27 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:12:27 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:12:27 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.583 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Preparing to wait for external event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.583 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.583 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.583 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.584 2 DEBUG nova.virt.libvirt.vif [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-724242032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-724242032',id=84,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgMtZOVnVeH3bWhrZPBKfXd+ywrgZUihuI2z5HN91rm6b66qXVN5XsBNFSC/a3XnUnD3sHUA86mE5v09Xc1EUgkfz3mw8V02tl2sDq2tzT1z7aRUqvhGDG3xh8qSR2ByQ==',key_name='tempest-keypair-862595394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-ly64ptws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:12:19Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.584 2 DEBUG nova.network.os_vif_util [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.585 2 DEBUG nova.network.os_vif_util [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.585 2 DEBUG os_vif [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.586 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.587 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.587 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72980dd9-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72980dd9-68, col_values=(('external_ids', {'iface-id': '72980dd9-6898-4995-b757-fdaf79579051', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:3c:a8', 'vm-uuid': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:27 np0005476733 NetworkManager[51699]: <info>  [1759939947.5935] manager: (tap72980dd9-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.600 2 INFO os_vif [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68')#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.605 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.605 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.606 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.657 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.658 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.658 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No VIF found with MAC fa:16:3e:1f:3c:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.658 2 INFO nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Using config drive#033[00m
Oct  8 12:12:27 np0005476733 nova_compute[192580]: 2025-10-08 16:12:27.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.486 2 INFO nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Creating config drive at /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.config#033[00m
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.490 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn7foq2fw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.614 2 DEBUG oslo_concurrency.processutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn7foq2fw" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:28 np0005476733 kernel: tap72980dd9-68: entered promiscuous mode
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.6810] manager: (tap72980dd9-68): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Oct  8 12:12:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:28Z|00791|binding|INFO|Claiming lport 72980dd9-6898-4995-b757-fdaf79579051 for this chassis.
Oct  8 12:12:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:28Z|00792|binding|INFO|72980dd9-6898-4995-b757-fdaf79579051: Claiming fa:16:3e:1f:3c:a8 10.100.0.10
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.7358] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.738 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:3c:a8 10.100.0.10'], port_security=['fa:16:3e:1f:3c:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c4c5a072-db01-4f8f-8c8a-8e47ffd1595c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7983178b-63e4-47de-abc8-63e5de6fbbea, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=72980dd9-6898-4995-b757-fdaf79579051) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.7386] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.739 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 72980dd9-6898-4995-b757-fdaf79579051 in datapath 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a bound to our chassis#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.741 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.754 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc64ff-d451-4052-a966-6f209b6df1e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.755 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap39a1ef3d-31 in ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.757 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap39a1ef3d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.757 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b7eb0955-1e8f-4661-98a7-a6223e019acc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.758 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[17e92714-02c1-4719-8a2c-128f6e16f520]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 systemd-machined[152624]: New machine qemu-50-instance-00000054.
Oct  8 12:12:28 np0005476733 systemd-udevd[253111]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.769 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[dab0c903-2874-4c5a-a5b1-cf9d7c97f578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.7772] device (tap72980dd9-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.7780] device (tap72980dd9-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 systemd[1]: Started Virtual Machine qemu-50-instance-00000054.
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.801 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7ec564-92a7-4907-9d76-bceee914a5dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:28Z|00793|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 ovn-installed in OVS
Oct  8 12:12:28 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:28Z|00794|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 up in Southbound
Oct  8 12:12:28 np0005476733 nova_compute[192580]: 2025-10-08 16:12:28.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.840 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1d45578f-136a-4d7e-b3c7-661145f3b6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.846 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[39c3fa2c-4c5b-4d48-b506-1c23d7ad6b79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.8485] manager: (tap39a1ef3d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.886 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[da3e9567-f77b-4f62-8a7c-124cde70b860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.889 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdf665d-30da-4e49-94ba-9233dc44da80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 NetworkManager[51699]: <info>  [1759939948.9084] device (tap39a1ef3d-30): carrier: link connected
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.912 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e2372c-f8a4-4838-8839-880143f01624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.929 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[abeba152-2304-43aa-ae35-7f40c5a96c85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39a1ef3d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:86:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685257, 'reachable_time': 43461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253146, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.941 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e87deb-35fd-4535-80a7-64e666d62b9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:8622'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 685257, 'tstamp': 685257}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253147, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.955 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9508e111-ba11-4859-adc6-9472fc3408f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39a1ef3d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:86:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685257, 'reachable_time': 43461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253148, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:28.978 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[97cc412b-bcea-402f-9b54-32cbb8bd0e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.019 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad4afe7-a7ee-43ae-b958-5de7f54e4fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.020 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39a1ef3d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.020 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.021 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39a1ef3d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:29 np0005476733 kernel: tap39a1ef3d-30: entered promiscuous mode
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.025 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap39a1ef3d-30, col_values=(('external_ids', {'iface-id': 'cdece465-f1f7-48d8-b327-8467e836ca94'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:12:29 np0005476733 NetworkManager[51699]: <info>  [1759939949.0261] manager: (tap39a1ef3d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:29 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:29Z|00795|binding|INFO|Releasing lport cdece465-f1f7-48d8-b327-8467e836ca94 from this chassis (sb_readonly=0)
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.039 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.040 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcec19a-7927-49da-8e02-9b835918f36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.041 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:12:29 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:29.043 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'env', 'PROCESS_TAG=haproxy-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:12:29 np0005476733 podman[253178]: 2025-10-08 16:12:29.349824247 +0000 UTC m=+0.041093796 container create 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:12:29 np0005476733 systemd[1]: Started libpod-conmon-637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e.scope.
Oct  8 12:12:29 np0005476733 podman[253178]: 2025-10-08 16:12:29.327008737 +0000 UTC m=+0.018278306 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:12:29 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:12:29 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2e4aca4ee7c8ac4df012c30771002f5b7c7fcc83eebaf536957dc4350e2119e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:12:29 np0005476733 podman[253178]: 2025-10-08 16:12:29.44845194 +0000 UTC m=+0.139721509 container init 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:12:29 np0005476733 podman[253178]: 2025-10-08 16:12:29.454653097 +0000 UTC m=+0.145922646 container start 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:12:29 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [NOTICE]   (253198) : New worker (253201) forked
Oct  8 12:12:29 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [NOTICE]   (253198) : Loading success.
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.542 2 DEBUG nova.compute.manager [req-b0b36a69-2077-4852-a4e8-b2653cd76f98 req-1d7bc166-3d57-4410-a955-23f466a3bdc2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.543 2 DEBUG oslo_concurrency.lockutils [req-b0b36a69-2077-4852-a4e8-b2653cd76f98 req-1d7bc166-3d57-4410-a955-23f466a3bdc2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.543 2 DEBUG oslo_concurrency.lockutils [req-b0b36a69-2077-4852-a4e8-b2653cd76f98 req-1d7bc166-3d57-4410-a955-23f466a3bdc2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.543 2 DEBUG oslo_concurrency.lockutils [req-b0b36a69-2077-4852-a4e8-b2653cd76f98 req-1d7bc166-3d57-4410-a955-23f466a3bdc2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:29 np0005476733 nova_compute[192580]: 2025-10-08 16:12:29.543 2 DEBUG nova.compute.manager [req-b0b36a69-2077-4852-a4e8-b2653cd76f98 req-1d7bc166-3d57-4410-a955-23f466a3bdc2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Processing event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.026 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.028 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939950.0275404, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.028 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Started (Lifecycle Event)#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.034 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.039 2 INFO nova.virt.libvirt.driver [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance spawned successfully.#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.040 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.053 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.059 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.074 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.074 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.075 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.076 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.077 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.077 2 DEBUG nova.virt.libvirt.driver [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.086 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.087 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939950.028629, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.087 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.136 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.140 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759939950.0319283, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.141 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.167 2 INFO nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Took 10.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.168 2 DEBUG nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.169 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.177 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.215 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.242 2 INFO nova.compute.manager [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Took 13.22 seconds to build instance.#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.259 2 DEBUG oslo_concurrency.lockutils [None req-2b1648e4-2e20-4142-a0b8-1cd33ffb9122 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.324 2 DEBUG nova.network.neutron [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated VIF entry in instance network info cache for port 72980dd9-6898-4995-b757-fdaf79579051. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.326 2 DEBUG nova.network.neutron [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.344 2 DEBUG oslo_concurrency.lockutils [req-bc14d1f0-df80-4eaa-8d8d-1ca40efdc07c req-9aba0381-c5c2-4ce1-9211-12c8cda646e6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:12:30 np0005476733 nova_compute[192580]: 2025-10-08 16:12:30.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.656 2 DEBUG nova.compute.manager [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.657 2 DEBUG oslo_concurrency.lockutils [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.657 2 DEBUG oslo_concurrency.lockutils [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.658 2 DEBUG oslo_concurrency.lockutils [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.658 2 DEBUG nova.compute.manager [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:12:31 np0005476733 nova_compute[192580]: 2025-10-08 16:12:31.658 2 WARNING nova.compute.manager [req-9bdd5922-51ff-4bea-8c92-fb5126655fd2 req-6d40bdee-1a07-42e2-a0d7-55f6b424444a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:12:32 np0005476733 nova_compute[192580]: 2025-10-08 16:12:32.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:32 np0005476733 nova_compute[192580]: 2025-10-08 16:12:32.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:34 np0005476733 systemd-logind[827]: New session 61 of user zuul.
Oct  8 12:12:34 np0005476733 systemd[1]: Started Session 61 of User zuul.
Oct  8 12:12:34 np0005476733 podman[253219]: 2025-10-08 16:12:34.105540622 +0000 UTC m=+0.098412007 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:12:34 np0005476733 podman[253221]: 2025-10-08 16:12:34.112359121 +0000 UTC m=+0.106755614 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:12:34 np0005476733 podman[253218]: 2025-10-08 16:12:34.121926346 +0000 UTC m=+0.118320664 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  8 12:12:34 np0005476733 systemd-logind[827]: New session 62 of user zuul.
Oct  8 12:12:34 np0005476733 systemd[1]: Started Session 62 of User zuul.
Oct  8 12:12:34 np0005476733 systemd[1]: session-62.scope: Deactivated successfully.
Oct  8 12:12:34 np0005476733 systemd-logind[827]: Session 62 logged out. Waiting for processes to exit.
Oct  8 12:12:34 np0005476733 systemd-logind[827]: Removed session 62.
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.060 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'name': 'tempest-server-test-724242032', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000054', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d7b1c6f132443b0abac8495ed44621d', 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'hostId': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.080 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.081 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cfa11b2-6cc2-421d-88fb-8e4bef29c13f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.061741', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a965086-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '70b2dcdc1648114809fc0785cca83e879fc9bc2725a8eae34d64c9cc1193e55d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.061741', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a965cca-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': 'f61cab0b033380f1eec92d11a0fc18dfad84dd45d57676c45e073d8fb4dc5eba'}]}, 'timestamp': '2025-10-08 16:12:36.081486', '_unique_id': '1d6152945dc74f4ead5737802f36caad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.085 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 / tap72980dd9-68 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4875505c-b82a-4528-84c2-03241fb9bd1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.083237', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a971b92-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': 'd89efc4e41b8ffd26388f554d829094d44f300f38ea7cf2d50e202282b9779c0'}]}, 'timestamp': '2025-10-08 16:12:36.086377', '_unique_id': 'd5760fc27dbb44cb8606bffcbb374400'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.086 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.087 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.087 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.087 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50aa6178-7a32-422e-871f-47df54e47cae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.087564', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a975328-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '01c882edfe134eca6e16ce7316dce98c6516f6646804f1a668fd9a2e41d7190e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.087564', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a975b20-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '531f70d7af8d4b4101898dbf7a2833f8ca52f9795abbfa047bd9e0d6d511a014'}]}, 'timestamp': '2025-10-08 16:12:36.087981', '_unique_id': '80ce008f4c3f4b509cce7bbc3761e738'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d3cae4a-2c91-47ae-b5fb-1f7b77d164aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.089116', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a978ff0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '9c293e7aece6f93ddb62d7bbe0278f93fe285b8fd89124daca95cd94d312adc2'}]}, 'timestamp': '2025-10-08 16:12:36.089347', '_unique_id': '1eedf667712247ddbbd4185958cf1f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.090 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '267fd2d8-8ce0-4d6a-a076-8e219cc1cbcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.090393', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a97c1aa-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '9bbdd31a2be38dafd58a922bf6eb94e19a5486fb12a0ca81ba705c803266363e'}]}, 'timestamp': '2025-10-08 16:12:36.090621', '_unique_id': '11612c9f14fb4cabba3abd027403ae4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.091 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca140d09-c636-4fdd-a7b0-a3fe86a8a6bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.091699', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a97f51c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '5f1ba033abad0a4203e2c2228cc4683937d73ba4d6cac53641054ddc0366d90c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.091699', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a97fe90-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '75826333705d398bddba4af927e422abe462502cc757d9b724f7af9b4fce7df3'}]}, 'timestamp': '2025-10-08 16:12:36.092202', '_unique_id': 'b717cf35e2434c20b86f04e00d90248c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.103 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.103 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20ecaf68-72b8-42d2-a344-83da34e6b646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.093333', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a99c50e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': 'd73d770702c6db7130727c43f8ac5c8c829a0ce9affac702b35ec451965906c6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.093333', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a99cf5e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': '7e44f14417eedad107cab2bd1c6523d7c51edc22023d884035c6579f01a9a7f5'}]}, 'timestamp': '2025-10-08 16:12:36.104122', '_unique_id': '156ff5a3d1d7401da25d3ee0ebef7b80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.107 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.107 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb37af3-490d-4296-9636-e9b30ecf6474', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.107332', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a9a592e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': '3ba6e572ef17adfe70a46b6e1db3198695bb5f5b794523b6c7bb02f9a5c134ba'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.107332', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a9a641e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': 'f0b894a0b6c597a7cd9f106b45a05b5c6833774e07d5cd451cbe586f05d1687f'}]}, 'timestamp': '2025-10-08 16:12:36.107907', '_unique_id': 'a79212bc37da44c3a6c3607d44d624e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.109 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27230aea-63b4-4ab5-8791-39c3014bd261', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.109662', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9ab374-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '0e102e830e7a8b3533d59b9dfa8bda611a12cc9c2c01dbe96e00479cf69a89dc'}]}, 'timestamp': '2025-10-08 16:12:36.109953', '_unique_id': '3513398f4f6c4f18881fb324ee3e8463'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.110 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.111 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.latency volume: 487043848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.111 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.latency volume: 4180703 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1a44dcf-61b6-4c40-a9fd-1fe69c549773', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 487043848, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.111239', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a9af0d2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '98370c9bf60ab6ea5924eae99e6c9e54806a78a77558d4de26f2809b9f61523c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4180703, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.111239', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a9af9f6-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': 'a910dffc799fdcafadeda47c031c9993cbf1642ecda0029293f109fe37df9abb'}]}, 'timestamp': '2025-10-08 16:12:36.111710', '_unique_id': '44d53b98fe174841bb898a44457bff92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.113 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-724242032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-724242032>]
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.113 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.113 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '908cf3fb-b756-404e-a444-2054d75f5c02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.113448', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a9b4618-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': '9d22636c64e5d66967d8aa90ca1ac963c61dbeb0ea4fc8225a18a7f3e9814516'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.113448', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a9b4de8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.816304692, 'message_signature': 'c7a749a93a77ef0b439adea5cb8980350d1eb39ea8955d19fc11908be5b92d73'}]}, 'timestamp': '2025-10-08 16:12:36.113854', '_unique_id': '105f2fbafb064eed881965902dc168b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.114 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.115 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-724242032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-724242032>]
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.115 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deb9f53b-6da6-4641-bf93-6e1dfecc178b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.115379', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9b9370-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '0c296e67fc6389a7e5cbf7c272dd8604da2bae369c46143a1d3151fd71524fae'}]}, 'timestamp': '2025-10-08 16:12:36.115693', '_unique_id': '434218602c88475db2a3cf422fce7f89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.116 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.117 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-724242032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-724242032>]
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.117 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f3e5d3b-aa45-4790-9295-8c21f94a8756', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.117233', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9bd9fc-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': 'c2d2a99587cdb5a7046c5abb91dc74c818aaea0a8431440ba7f51781eee4bbed'}]}, 'timestamp': '2025-10-08 16:12:36.117482', '_unique_id': 'ffe3e9c59ab64230a0f58df139ba19b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.118 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a07eb89-9b19-4a0d-bd4b-96551f35ce62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.118755', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a9c1534-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '1480fb567596c0557ec371d5d08d7d4d5ef00e6fb91fb9657f73f172be0a5185'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.118755', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a9c1d22-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': 'fb70b83e7e8599983cce970c8f9ca52fecc348b85b52b0a9c85b1a3f0d261fd1'}]}, 'timestamp': '2025-10-08 16:12:36.119177', '_unique_id': 'b61e2f732b4a41d4abe8641e7f029045'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.120 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b39ee81-1d61-48f8-8677-adcf654aafcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.120284', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9c522e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '335752d7e443413859492a03d15f2ef24bde14a72e2419a71f51d035adbf1113'}]}, 'timestamp': '2025-10-08 16:12:36.120549', '_unique_id': '628660d1e3c04dca815518ecd94a3855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.121 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f15cc210-347c-462c-baff-026aace6b50d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.121680', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9c8816-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '784075ea81d4b612978a28814ab9e05fef5c5d27d926c19f7b82ee56237d5d3e'}]}, 'timestamp': '2025-10-08 16:12:36.121915', '_unique_id': '7ea5b9bcd007444a879f35f8259bd8f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.139 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.139 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579: ceilometer.compute.pollsters.NoVolumeException
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.140 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3f91a88-f4bc-483a-a0e2-6c1c22c75b3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.139999', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9f5762-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': 'b9170fb72cc888f3b23e2777484cc9cc03ca49a8047165e081f12a3c2b5821be'}]}, 'timestamp': '2025-10-08 16:12:36.140408', '_unique_id': '515f544c0f7440b9a032f9ca4662b6e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.142 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b963bbd6-9337-4d8b-80b9-9803f3a2590e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:12:36.142217', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': '9a9fab7c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.80622109, 'message_signature': '72595c239de2b3c333223abf112c5f4919961ad6352f3af74b137b51a08c2d63'}]}, 'timestamp': '2025-10-08 16:12:36.142524', '_unique_id': '0f7c96bcb2094a9691676f14c333fc14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.143 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f91df444-c4ac-4a9d-94af-8a4d2b6f6439', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:12:36.143731', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9a9fe4fc-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '6897608fe5a388e06009f07650c671f751086544fa8ecf8d4e9e05f75d9afe34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:12:36.143731', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9a9fece0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.784707502, 'message_signature': '266f6b2deb574fcc4f90d6ef8033397891e240166f37f3c30321a168439a35e8'}]}, 'timestamp': '2025-10-08 16:12:36.144213', '_unique_id': 'c7620245c8d14e0c94767ccbf3fd7270'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-724242032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-724242032>]
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.146 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/cpu volume: 5910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50294c8b-6f11-4886-8847-1e1e61c74fd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5910000000, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'timestamp': '2025-10-08T16:12:36.146207', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9aa0474e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6859.862427087, 'message_signature': '5360861c16a047a7cf84dc2665dd2632f12d951145edfd117e1be4c8c3a8176c'}]}, 'timestamp': '2025-10-08 16:12:36.146498', '_unique_id': 'fc76f9c350e44c0ea581e578c50768cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:12:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.702 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.775 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.777 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:37 np0005476733 nova_compute[192580]: 2025-10-08 16:12:37.853 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.020 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.022 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13629MB free_disk=111.31521987915039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.022 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.022 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.164 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.165 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.165 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.315 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.342 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.382 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:12:38 np0005476733 nova_compute[192580]: 2025-10-08 16:12:38.383 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:12:40 np0005476733 podman[253349]: 2025-10-08 16:12:40.236757962 +0000 UTC m=+0.053919745 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:12:40 np0005476733 podman[253348]: 2025-10-08 16:12:40.258504737 +0000 UTC m=+0.075515195 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:12:40 np0005476733 nova_compute[192580]: 2025-10-08 16:12:40.375 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:41Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:3c:a8 10.100.0.10
Oct  8 12:12:41 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:41Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:3c:a8 10.100.0.10
Oct  8 12:12:42 np0005476733 nova_compute[192580]: 2025-10-08 16:12:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:12:42 np0005476733 nova_compute[192580]: 2025-10-08 16:12:42.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:42 np0005476733 nova_compute[192580]: 2025-10-08 16:12:42.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:47 np0005476733 nova_compute[192580]: 2025-10-08 16:12:47.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:47 np0005476733 nova_compute[192580]: 2025-10-08 16:12:47.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:50 np0005476733 podman[253405]: 2025-10-08 16:12:50.232738143 +0000 UTC m=+0.063211482 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:12:52 np0005476733 nova_compute[192580]: 2025-10-08 16:12:52.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:52 np0005476733 nova_compute[192580]: 2025-10-08 16:12:52.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:56 np0005476733 podman[253426]: 2025-10-08 16:12:56.243917424 +0000 UTC m=+0.060540536 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:12:56 np0005476733 podman[253425]: 2025-10-08 16:12:56.270952909 +0000 UTC m=+0.098482489 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:12:56 np0005476733 systemd-logind[827]: New session 63 of user zuul.
Oct  8 12:12:56 np0005476733 systemd[1]: Started Session 63 of User zuul.
Oct  8 12:12:57 np0005476733 nova_compute[192580]: 2025-10-08 16:12:57.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:57 np0005476733 nova_compute[192580]: 2025-10-08 16:12:57.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:58.346 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:12:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:12:58.347 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:12:58 np0005476733 nova_compute[192580]: 2025-10-08 16:12:58.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:12:59 np0005476733 ovn_controller[94857]: 2025-10-08T16:12:59Z|00796|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 12:12:59 np0005476733 nova_compute[192580]: 2025-10-08 16:12:59.761 2 DEBUG nova.compute.manager [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-changed-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:12:59 np0005476733 nova_compute[192580]: 2025-10-08 16:12:59.762 2 DEBUG nova.compute.manager [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing instance network info cache due to event network-changed-72980dd9-6898-4995-b757-fdaf79579051. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:12:59 np0005476733 nova_compute[192580]: 2025-10-08 16:12:59.762 2 DEBUG oslo_concurrency.lockutils [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:12:59 np0005476733 nova_compute[192580]: 2025-10-08 16:12:59.763 2 DEBUG oslo_concurrency.lockutils [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:12:59 np0005476733 nova_compute[192580]: 2025-10-08 16:12:59.763 2 DEBUG nova.network.neutron [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing network info cache for port 72980dd9-6898-4995-b757-fdaf79579051 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:13:01 np0005476733 nova_compute[192580]: 2025-10-08 16:13:01.290 2 DEBUG nova.network.neutron [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated VIF entry in instance network info cache for port 72980dd9-6898-4995-b757-fdaf79579051. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:13:01 np0005476733 nova_compute[192580]: 2025-10-08 16:13:01.291 2 DEBUG nova.network.neutron [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:13:01 np0005476733 nova_compute[192580]: 2025-10-08 16:13:01.321 2 DEBUG oslo_concurrency.lockutils [req-eceba76c-3e63-483b-992d-0e7e852310a1 req-157d27cc-15ca-405b-8336-4ff41d9cf4f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:13:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:02.350 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:13:02 np0005476733 nova_compute[192580]: 2025-10-08 16:13:02.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:02 np0005476733 nova_compute[192580]: 2025-10-08 16:13:02.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:04 np0005476733 podman[253503]: 2025-10-08 16:13:04.25305315 +0000 UTC m=+0.067017413 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:13:04 np0005476733 podman[253504]: 2025-10-08 16:13:04.259133685 +0000 UTC m=+0.066634871 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, version=9.6, vendor=Red Hat, Inc.)
Oct  8 12:13:04 np0005476733 podman[253502]: 2025-10-08 16:13:04.276132808 +0000 UTC m=+0.101398503 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:13:06 np0005476733 systemd-logind[827]: New session 64 of user zuul.
Oct  8 12:13:06 np0005476733 systemd[1]: Started Session 64 of User zuul.
Oct  8 12:13:06 np0005476733 systemd-logind[827]: New session 65 of user zuul.
Oct  8 12:13:06 np0005476733 systemd[1]: Started Session 65 of User zuul.
Oct  8 12:13:06 np0005476733 systemd[1]: session-65.scope: Deactivated successfully.
Oct  8 12:13:06 np0005476733 systemd-logind[827]: Session 65 logged out. Waiting for processes to exit.
Oct  8 12:13:06 np0005476733 systemd-logind[827]: Removed session 65.
Oct  8 12:13:07 np0005476733 nova_compute[192580]: 2025-10-08 16:13:07.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:07 np0005476733 nova_compute[192580]: 2025-10-08 16:13:07.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:11 np0005476733 podman[253627]: 2025-10-08 16:13:11.22659993 +0000 UTC m=+0.055910949 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:13:11 np0005476733 podman[253626]: 2025-10-08 16:13:11.22726093 +0000 UTC m=+0.056948121 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:13:12 np0005476733 nova_compute[192580]: 2025-10-08 16:13:12.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:12 np0005476733 nova_compute[192580]: 2025-10-08 16:13:12.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:15 np0005476733 systemd-logind[827]: New session 66 of user zuul.
Oct  8 12:13:15 np0005476733 systemd[1]: Started Session 66 of User zuul.
Oct  8 12:13:16 np0005476733 nova_compute[192580]: 2025-10-08 16:13:16.951 2 DEBUG nova.compute.manager [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-changed-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:16 np0005476733 nova_compute[192580]: 2025-10-08 16:13:16.952 2 DEBUG nova.compute.manager [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing instance network info cache due to event network-changed-72980dd9-6898-4995-b757-fdaf79579051. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:13:16 np0005476733 nova_compute[192580]: 2025-10-08 16:13:16.952 2 DEBUG oslo_concurrency.lockutils [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:13:16 np0005476733 nova_compute[192580]: 2025-10-08 16:13:16.952 2 DEBUG oslo_concurrency.lockutils [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:13:16 np0005476733 nova_compute[192580]: 2025-10-08 16:13:16.953 2 DEBUG nova.network.neutron [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing network info cache for port 72980dd9-6898-4995-b757-fdaf79579051 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:13:17 np0005476733 nova_compute[192580]: 2025-10-08 16:13:17.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:17 np0005476733 nova_compute[192580]: 2025-10-08 16:13:17.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:18 np0005476733 nova_compute[192580]: 2025-10-08 16:13:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:19 np0005476733 nova_compute[192580]: 2025-10-08 16:13:19.452 2 DEBUG nova.network.neutron [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated VIF entry in instance network info cache for port 72980dd9-6898-4995-b757-fdaf79579051. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:13:19 np0005476733 nova_compute[192580]: 2025-10-08 16:13:19.453 2 DEBUG nova.network.neutron [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:13:19 np0005476733 nova_compute[192580]: 2025-10-08 16:13:19.631 2 DEBUG oslo_concurrency.lockutils [req-d35e8ad4-e980-479b-bf0d-71eeed383bf9 req-a2c8d293-86f7-40a6-97c2-00a50243b6b1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:13:21 np0005476733 podman[253702]: 2025-10-08 16:13:21.212730629 +0000 UTC m=+0.046030883 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:13:21 np0005476733 nova_compute[192580]: 2025-10-08 16:13:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:22 np0005476733 systemd-logind[827]: New session 67 of user zuul.
Oct  8 12:13:22 np0005476733 systemd[1]: Started Session 67 of User zuul.
Oct  8 12:13:22 np0005476733 systemd-logind[827]: New session 68 of user zuul.
Oct  8 12:13:22 np0005476733 systemd[1]: Started Session 68 of User zuul.
Oct  8 12:13:22 np0005476733 nova_compute[192580]: 2025-10-08 16:13:22.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:22 np0005476733 systemd[1]: session-68.scope: Deactivated successfully.
Oct  8 12:13:22 np0005476733 systemd-logind[827]: Session 68 logged out. Waiting for processes to exit.
Oct  8 12:13:22 np0005476733 systemd-logind[827]: Removed session 68.
Oct  8 12:13:22 np0005476733 nova_compute[192580]: 2025-10-08 16:13:22.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:25Z|00797|pinctrl|WARN|Dropped 239 log messages in last 65 seconds (most recently, 9 seconds ago) due to excessive rate
Oct  8 12:13:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:25Z|00798|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:26.370 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:26.371 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:26.372 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:27 np0005476733 podman[253780]: 2025-10-08 16:13:27.226240917 +0000 UTC m=+0.051581941 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:13:27 np0005476733 podman[253779]: 2025-10-08 16:13:27.26169134 +0000 UTC m=+0.089126091 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:13:27 np0005476733 nova_compute[192580]: 2025-10-08 16:13:27.579 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:27 np0005476733 nova_compute[192580]: 2025-10-08 16:13:27.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:27 np0005476733 nova_compute[192580]: 2025-10-08 16:13:27.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:13:27 np0005476733 nova_compute[192580]: 2025-10-08 16:13:27.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:27 np0005476733 nova_compute[192580]: 2025-10-08 16:13:27.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:28 np0005476733 nova_compute[192580]: 2025-10-08 16:13:28.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:29 np0005476733 nova_compute[192580]: 2025-10-08 16:13:29.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:29 np0005476733 nova_compute[192580]: 2025-10-08 16:13:29.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:13:29 np0005476733 nova_compute[192580]: 2025-10-08 16:13:29.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:13:30 np0005476733 nova_compute[192580]: 2025-10-08 16:13:30.315 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:13:30 np0005476733 nova_compute[192580]: 2025-10-08 16:13:30.316 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:13:30 np0005476733 nova_compute[192580]: 2025-10-08 16:13:30.316 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:13:30 np0005476733 nova_compute[192580]: 2025-10-08 16:13:30.316 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:13:31 np0005476733 systemd-logind[827]: New session 69 of user zuul.
Oct  8 12:13:31 np0005476733 systemd[1]: Started Session 69 of User zuul.
Oct  8 12:13:32 np0005476733 nova_compute[192580]: 2025-10-08 16:13:32.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:32 np0005476733 nova_compute[192580]: 2025-10-08 16:13:32.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:33 np0005476733 nova_compute[192580]: 2025-10-08 16:13:33.604 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:13:33 np0005476733 nova_compute[192580]: 2025-10-08 16:13:33.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:13:33 np0005476733 nova_compute[192580]: 2025-10-08 16:13:33.627 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:13:33 np0005476733 nova_compute[192580]: 2025-10-08 16:13:33.627 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:34 np0005476733 nova_compute[192580]: 2025-10-08 16:13:34.709 2 DEBUG nova.compute.manager [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-changed-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:34 np0005476733 nova_compute[192580]: 2025-10-08 16:13:34.709 2 DEBUG nova.compute.manager [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing instance network info cache due to event network-changed-72980dd9-6898-4995-b757-fdaf79579051. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:13:34 np0005476733 nova_compute[192580]: 2025-10-08 16:13:34.709 2 DEBUG oslo_concurrency.lockutils [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:13:34 np0005476733 nova_compute[192580]: 2025-10-08 16:13:34.709 2 DEBUG oslo_concurrency.lockutils [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:13:34 np0005476733 nova_compute[192580]: 2025-10-08 16:13:34.710 2 DEBUG nova.network.neutron [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Refreshing network info cache for port 72980dd9-6898-4995-b757-fdaf79579051 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:13:35 np0005476733 podman[253865]: 2025-10-08 16:13:35.231922061 +0000 UTC m=+0.060752743 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:13:35 np0005476733 podman[253866]: 2025-10-08 16:13:35.235679391 +0000 UTC m=+0.060225577 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:13:35 np0005476733 podman[253864]: 2025-10-08 16:13:35.241706563 +0000 UTC m=+0.071528567 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:13:36 np0005476733 nova_compute[192580]: 2025-10-08 16:13:36.651 2 DEBUG nova.network.neutron [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated VIF entry in instance network info cache for port 72980dd9-6898-4995-b757-fdaf79579051. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:13:36 np0005476733 nova_compute[192580]: 2025-10-08 16:13:36.652 2 DEBUG nova.network.neutron [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:13:36 np0005476733 nova_compute[192580]: 2025-10-08 16:13:36.770 2 DEBUG oslo_concurrency.lockutils [req-2258026d-2d88-4ae4-80a0-6a5a24a788b1 req-bcb78cd1-1f65-4a55-bf1c-6c6ae4207b60 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:13:37 np0005476733 nova_compute[192580]: 2025-10-08 16:13:37.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:37 np0005476733 nova_compute[192580]: 2025-10-08 16:13:37.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.657 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.657 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.658 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.658 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.913 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.981 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:13:39 np0005476733 nova_compute[192580]: 2025-10-08 16:13:39.982 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.038 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.182 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.183 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13585MB free_disk=111.28749084472656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.183 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.184 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:40 np0005476733 systemd-logind[827]: New session 70 of user zuul.
Oct  8 12:13:40 np0005476733 systemd[1]: Started Session 70 of User zuul.
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.600 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.600 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.601 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.647 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:13:40 np0005476733 systemd-logind[827]: New session 71 of user zuul.
Oct  8 12:13:40 np0005476733 systemd[1]: Started Session 71 of User zuul.
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.773 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.775 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:13:40 np0005476733 nova_compute[192580]: 2025-10-08 16:13:40.775 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:40 np0005476733 systemd[1]: session-71.scope: Deactivated successfully.
Oct  8 12:13:40 np0005476733 systemd-logind[827]: Session 71 logged out. Waiting for processes to exit.
Oct  8 12:13:40 np0005476733 systemd-logind[827]: Removed session 71.
Oct  8 12:13:42 np0005476733 podman[253994]: 2025-10-08 16:13:42.228164964 +0000 UTC m=+0.055023370 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 12:13:42 np0005476733 podman[253995]: 2025-10-08 16:13:42.255986104 +0000 UTC m=+0.082907472 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:13:42 np0005476733 nova_compute[192580]: 2025-10-08 16:13:42.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:42 np0005476733 nova_compute[192580]: 2025-10-08 16:13:42.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:43 np0005476733 nova_compute[192580]: 2025-10-08 16:13:43.774 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:13:47 np0005476733 nova_compute[192580]: 2025-10-08 16:13:47.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:47 np0005476733 nova_compute[192580]: 2025-10-08 16:13:47.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:49 np0005476733 systemd-logind[827]: New session 72 of user zuul.
Oct  8 12:13:49 np0005476733 systemd[1]: Started Session 72 of User zuul.
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.005 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.007 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.007 2 INFO nova.compute.manager [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Rebooting instance#033[00m
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.041 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.042 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:13:51 np0005476733 nova_compute[192580]: 2025-10-08 16:13:51.042 2 DEBUG nova.network.neutron [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:13:52 np0005476733 podman[254067]: 2025-10-08 16:13:52.256472881 +0000 UTC m=+0.085042220 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:13:52 np0005476733 nova_compute[192580]: 2025-10-08 16:13:52.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:52 np0005476733 nova_compute[192580]: 2025-10-08 16:13:52.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:53 np0005476733 nova_compute[192580]: 2025-10-08 16:13:53.338 2 DEBUG nova.network.neutron [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:13:53 np0005476733 nova_compute[192580]: 2025-10-08 16:13:53.362 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:13:53 np0005476733 nova_compute[192580]: 2025-10-08 16:13:53.364 2 DEBUG nova.compute.manager [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:13:55 np0005476733 kernel: tap72980dd9-68 (unregistering): left promiscuous mode
Oct  8 12:13:55 np0005476733 NetworkManager[51699]: <info>  [1759940035.7194] device (tap72980dd9-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:13:55 np0005476733 nova_compute[192580]: 2025-10-08 16:13:55.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:55 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:55Z|00799|binding|INFO|Releasing lport 72980dd9-6898-4995-b757-fdaf79579051 from this chassis (sb_readonly=0)
Oct  8 12:13:55 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:55Z|00800|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 down in Southbound
Oct  8 12:13:55 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:55Z|00801|binding|INFO|Removing iface tap72980dd9-68 ovn-installed in OVS
Oct  8 12:13:55 np0005476733 nova_compute[192580]: 2025-10-08 16:13:55.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:55.741 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:3c:a8 10.100.0.10'], port_security=['fa:16:3e:1f:3c:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4c5a072-db01-4f8f-8c8a-8e47ffd1595c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7983178b-63e4-47de-abc8-63e5de6fbbea, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=72980dd9-6898-4995-b757-fdaf79579051) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:13:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:55.742 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 72980dd9-6898-4995-b757-fdaf79579051 in datapath 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a unbound from our chassis#033[00m
Oct  8 12:13:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:55.743 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:13:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:55.744 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[42032460-7d6a-4d00-813e-92239d95505f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:55.745 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a namespace which is not needed anymore#033[00m
Oct  8 12:13:55 np0005476733 nova_compute[192580]: 2025-10-08 16:13:55.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:55 np0005476733 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000054.scope: Deactivated successfully.
Oct  8 12:13:55 np0005476733 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000054.scope: Consumed 15.683s CPU time.
Oct  8 12:13:55 np0005476733 systemd-machined[152624]: Machine qemu-50-instance-00000054 terminated.
Oct  8 12:13:55 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [NOTICE]   (253198) : haproxy version is 2.8.14-c23fe91
Oct  8 12:13:55 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [NOTICE]   (253198) : path to executable is /usr/sbin/haproxy
Oct  8 12:13:55 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [WARNING]  (253198) : Exiting Master process...
Oct  8 12:13:55 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [ALERT]    (253198) : Current worker (253201) exited with code 143 (Terminated)
Oct  8 12:13:55 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[253194]: [WARNING]  (253198) : All workers exited. Exiting... (0)
Oct  8 12:13:55 np0005476733 systemd[1]: libpod-637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e.scope: Deactivated successfully.
Oct  8 12:13:55 np0005476733 podman[254112]: 2025-10-08 16:13:55.897897244 +0000 UTC m=+0.045684022 container died 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 12:13:55 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e-userdata-shm.mount: Deactivated successfully.
Oct  8 12:13:55 np0005476733 systemd[1]: var-lib-containers-storage-overlay-e2e4aca4ee7c8ac4df012c30771002f5b7c7fcc83eebaf536957dc4350e2119e-merged.mount: Deactivated successfully.
Oct  8 12:13:55 np0005476733 podman[254112]: 2025-10-08 16:13:55.938242314 +0000 UTC m=+0.086029092 container cleanup 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 12:13:55 np0005476733 systemd[1]: libpod-conmon-637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e.scope: Deactivated successfully.
Oct  8 12:13:56 np0005476733 podman[254144]: 2025-10-08 16:13:56.005732282 +0000 UTC m=+0.047299704 container remove 637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.015 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[632471cc-7fa9-4e94-a41c-5e846e6b0078]: (4, ('Wed Oct  8 04:13:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a (637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e)\n637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e\nWed Oct  8 04:13:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a (637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e)\n637fe88f0c98960fa9bfae710714b6d9b659dd89fe3a6ac90b2d791ec918e22e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.016 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8903ac8c-2c75-4741-824c-d072db64f590]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.017 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39a1ef3d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:56 np0005476733 kernel: tap39a1ef3d-30: left promiscuous mode
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.042 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85b8cfaf-bb1b-4fd8-ab82-b5a352ac760f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.063 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9a6970-aef9-49f1-a5bf-6ecfe6a031a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.064 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d734f383-1f17-4557-9d90-cb2067d94e04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.078 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[12c119a5-fdd7-48ef-a3ac-9fa6391f15fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 685249, 'reachable_time': 37311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254178, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 systemd[1]: run-netns-ovnmeta\x2d39a1ef3d\x2d39fb\x2d4d17\x2d8b3f\x2dd6bff6ae557a.mount: Deactivated successfully.
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.083 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.083 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[909e8e01-3019-4191-8ba4-f1cccfe8d1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.540 2 INFO nova.virt.libvirt.driver [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance shutdown successfully.#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.588 2 DEBUG nova.compute.manager [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.588 2 DEBUG oslo_concurrency.lockutils [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.589 2 DEBUG oslo_concurrency.lockutils [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.589 2 DEBUG oslo_concurrency.lockutils [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.590 2 DEBUG nova.compute.manager [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.590 2 WARNING nova.compute.manager [req-50fd4261-bfcb-42f8-8d7c-3f95a17a3e0f req-5abece71-526d-4eee-9621-3d18df689394 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state active and task_state reboot_started.#033[00m
Oct  8 12:13:56 np0005476733 kernel: tap72980dd9-68: entered promiscuous mode
Oct  8 12:13:56 np0005476733 NetworkManager[51699]: <info>  [1759940036.6315] manager: (tap72980dd9-68): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Oct  8 12:13:56 np0005476733 systemd-udevd[254093]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:13:56 np0005476733 NetworkManager[51699]: <info>  [1759940036.6446] device (tap72980dd9-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:13:56 np0005476733 NetworkManager[51699]: <info>  [1759940036.6454] device (tap72980dd9-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:13:56 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:56Z|00802|binding|INFO|Claiming lport 72980dd9-6898-4995-b757-fdaf79579051 for this chassis.
Oct  8 12:13:56 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:56Z|00803|binding|INFO|72980dd9-6898-4995-b757-fdaf79579051: Claiming fa:16:3e:1f:3c:a8 10.100.0.10
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.678 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:3c:a8 10.100.0.10'], port_security=['fa:16:3e:1f:3c:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4c5a072-db01-4f8f-8c8a-8e47ffd1595c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7983178b-63e4-47de-abc8-63e5de6fbbea, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=72980dd9-6898-4995-b757-fdaf79579051) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.679 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 72980dd9-6898-4995-b757-fdaf79579051 in datapath 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a bound to our chassis#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.681 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a#033[00m
Oct  8 12:13:56 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:56Z|00804|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 ovn-installed in OVS
Oct  8 12:13:56 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:56Z|00805|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 up in Southbound
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:56 np0005476733 nova_compute[192580]: 2025-10-08 16:13:56.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.697 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5c9929-57ed-433f-be11-52a6979aac95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.698 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap39a1ef3d-31 in ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:13:56 np0005476733 systemd-machined[152624]: New machine qemu-51-instance-00000054.
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.701 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap39a1ef3d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.701 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5d13068b-bd96-4009-bd0f-20e14c57bb01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.706 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7b79e83f-865e-45af-b6e3-e52c4cd723ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 systemd[1]: Started Virtual Machine qemu-51-instance-00000054.
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.725 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[12c7b9ea-9cb7-44a1-bd4e-93bd4b49c3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.745 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb354ad-acf0-4633-ac53-bc17871a490f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.789 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6c40a6-a2ab-46ba-9fd1-22068d006234]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.798 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[adab18c5-25ad-452e-8f8e-79321382d328]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 NetworkManager[51699]: <info>  [1759940036.8002] manager: (tap39a1ef3d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.841 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[674bb298-4100-4d81-a0d4-033316405c27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.845 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4ed580-e764-484e-b4e3-d584698cc8dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 NetworkManager[51699]: <info>  [1759940036.8683] device (tap39a1ef3d-30): carrier: link connected
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.875 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c133f92e-ff4e-4062-97d5-b3efbe25993a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.893 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ba504e6e-c44a-4a22-a630-1061c8f7ad09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39a1ef3d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:86:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694053, 'reachable_time': 43442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254225, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.913 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[de70ae76-22e6-4774-b8e2-8ef7f5a93306]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:8622'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 694053, 'tstamp': 694053}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254226, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.934 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2211cf15-e7e6-4b24-a54d-3aa0070058a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39a1ef3d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:86:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694053, 'reachable_time': 43442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254227, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:56.970 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf723b5-34de-4801-8701-cb3a1f8b3af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.046 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0ce3d0-3d0d-498a-ab71-bdc7bf9b3adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.047 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39a1ef3d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.047 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.047 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39a1ef3d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:13:57 np0005476733 kernel: tap39a1ef3d-30: entered promiscuous mode
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:57 np0005476733 NetworkManager[51699]: <info>  [1759940037.0498] manager: (tap39a1ef3d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.055 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap39a1ef3d-30, col_values=(('external_ids', {'iface-id': 'cdece465-f1f7-48d8-b327-8467e836ca94'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:13:57 np0005476733 ovn_controller[94857]: 2025-10-08T16:13:57Z|00806|binding|INFO|Releasing lport cdece465-f1f7-48d8-b327-8467e836ca94 from this chassis (sb_readonly=0)
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.057 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.058 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[509d0761-0341-481a-81bc-842574ccdd6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.059 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.pid.haproxy
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:13:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:13:57.059 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'env', 'PROCESS_TAG=haproxy-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:57 np0005476733 podman[254259]: 2025-10-08 16:13:57.458240987 +0000 UTC m=+0.059789083 container create b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:13:57 np0005476733 systemd[1]: Started libpod-conmon-b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13.scope.
Oct  8 12:13:57 np0005476733 podman[254259]: 2025-10-08 16:13:57.42738181 +0000 UTC m=+0.028929766 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:13:57 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:13:57 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a114570f402d8889604f47ad017c557d7f5f537188d3f8d73d66e587fe8ea9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:13:57 np0005476733 podman[254280]: 2025-10-08 16:13:57.61038505 +0000 UTC m=+0.100697870 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 12:13:57 np0005476733 podman[254259]: 2025-10-08 16:13:57.610924528 +0000 UTC m=+0.212472494 container init b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:13:57 np0005476733 podman[254259]: 2025-10-08 16:13:57.617408005 +0000 UTC m=+0.218955941 container start b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 12:13:57 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [NOTICE]   (254328) : New worker (254330) forked
Oct  8 12:13:57 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [NOTICE]   (254328) : Loading success.
Oct  8 12:13:57 np0005476733 podman[254282]: 2025-10-08 16:13:57.668024033 +0000 UTC m=+0.147840697 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.934 2 DEBUG nova.virt.libvirt.host [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Removed pending event for ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.935 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940037.9341793, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.935 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.940 2 INFO nova.virt.libvirt.driver [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance running successfully.#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.940 2 INFO nova.virt.libvirt.driver [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance soft rebooted successfully.#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.941 2 DEBUG nova.compute.manager [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.992 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:13:57 np0005476733 nova_compute[192580]: 2025-10-08 16:13:57.996 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.070 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.071 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940037.934287, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.072 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Started (Lifecycle Event)#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.140 2 DEBUG oslo_concurrency.lockutils [None req-cfdfba35-690e-4278-8699-30e11bf022e4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 7.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.142 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.145 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.727 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.728 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.728 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.729 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.730 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.730 2 WARNING nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.731 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.731 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.732 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.732 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.733 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.734 2 WARNING nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.734 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.735 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.735 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.736 2 DEBUG oslo_concurrency.lockutils [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.736 2 DEBUG nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:13:58 np0005476733 nova_compute[192580]: 2025-10-08 16:13:58.737 2 WARNING nova.compute.manager [req-bdb6e643-d851-4b31-be86-bc7499aa03be req-3f6c9a37-021b-4f63-b98b-b27fb67d0707 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:13:59 np0005476733 systemd-logind[827]: New session 73 of user zuul.
Oct  8 12:13:59 np0005476733 systemd[1]: Started Session 73 of User zuul.
Oct  8 12:13:59 np0005476733 systemd-logind[827]: New session 74 of user zuul.
Oct  8 12:13:59 np0005476733 systemd[1]: Started Session 74 of User zuul.
Oct  8 12:13:59 np0005476733 systemd[1]: session-74.scope: Deactivated successfully.
Oct  8 12:13:59 np0005476733 systemd-logind[827]: Session 74 logged out. Waiting for processes to exit.
Oct  8 12:13:59 np0005476733 systemd-logind[827]: Removed session 74.
Oct  8 12:14:02 np0005476733 nova_compute[192580]: 2025-10-08 16:14:02.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:02 np0005476733 nova_compute[192580]: 2025-10-08 16:14:02.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:06 np0005476733 podman[254402]: 2025-10-08 16:14:06.255852239 +0000 UTC m=+0.071874469 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:14:06 np0005476733 podman[254401]: 2025-10-08 16:14:06.261544991 +0000 UTC m=+0.078888454 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:14:06 np0005476733 podman[254403]: 2025-10-08 16:14:06.279898118 +0000 UTC m=+0.086413294 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, version=9.6, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9)
Oct  8 12:14:07 np0005476733 nova_compute[192580]: 2025-10-08 16:14:07.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:07 np0005476733 nova_compute[192580]: 2025-10-08 16:14:07.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:08 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:08Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:3c:a8 10.100.0.10
Oct  8 12:14:12 np0005476733 nova_compute[192580]: 2025-10-08 16:14:12.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:12 np0005476733 nova_compute[192580]: 2025-10-08 16:14:12.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:13 np0005476733 podman[254473]: 2025-10-08 16:14:13.244914612 +0000 UTC m=+0.067020923 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 12:14:13 np0005476733 podman[254474]: 2025-10-08 16:14:13.251776551 +0000 UTC m=+0.068654445 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:14:17 np0005476733 nova_compute[192580]: 2025-10-08 16:14:17.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:18 np0005476733 nova_compute[192580]: 2025-10-08 16:14:18.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:18 np0005476733 nova_compute[192580]: 2025-10-08 16:14:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:21 np0005476733 systemd-logind[827]: New session 75 of user zuul.
Oct  8 12:14:21 np0005476733 systemd[1]: Started Session 75 of User zuul.
Oct  8 12:14:22 np0005476733 nova_compute[192580]: 2025-10-08 16:14:22.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:23 np0005476733 nova_compute[192580]: 2025-10-08 16:14:23.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:23 np0005476733 podman[254547]: 2025-10-08 16:14:23.219001266 +0000 UTC m=+0.052351795 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent)
Oct  8 12:14:23 np0005476733 nova_compute[192580]: 2025-10-08 16:14:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:25Z|00807|pinctrl|WARN|Dropped 209 log messages in last 60 seconds (most recently, 28 seconds ago) due to excessive rate
Oct  8 12:14:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:25Z|00808|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:14:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:25Z|00809|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 12:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:26.371 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:26.372 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:26.373 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:14:26 np0005476733 systemd-logind[827]: New session 76 of user zuul.
Oct  8 12:14:27 np0005476733 systemd[1]: Started Session 76 of User zuul.
Oct  8 12:14:27 np0005476733 systemd[1]: session-76.scope: Deactivated successfully.
Oct  8 12:14:27 np0005476733 systemd-logind[827]: Session 76 logged out. Waiting for processes to exit.
Oct  8 12:14:27 np0005476733 systemd-logind[827]: Removed session 76.
Oct  8 12:14:27 np0005476733 nova_compute[192580]: 2025-10-08 16:14:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:27 np0005476733 nova_compute[192580]: 2025-10-08 16:14:27.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:14:27 np0005476733 nova_compute[192580]: 2025-10-08 16:14:27.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:28 np0005476733 nova_compute[192580]: 2025-10-08 16:14:28.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:28 np0005476733 podman[254598]: 2025-10-08 16:14:28.230864841 +0000 UTC m=+0.060948309 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 12:14:28 np0005476733 podman[254597]: 2025-10-08 16:14:28.255921592 +0000 UTC m=+0.089448641 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  8 12:14:28 np0005476733 nova_compute[192580]: 2025-10-08 16:14:28.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:29 np0005476733 nova_compute[192580]: 2025-10-08 16:14:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.780 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.780 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.780 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:14:31 np0005476733 nova_compute[192580]: 2025-10-08 16:14:31.780 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:14:32 np0005476733 nova_compute[192580]: 2025-10-08 16:14:32.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:33 np0005476733 nova_compute[192580]: 2025-10-08 16:14:33.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:33 np0005476733 nova_compute[192580]: 2025-10-08 16:14:33.567 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [{"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:14:33 np0005476733 nova_compute[192580]: 2025-10-08 16:14:33.607 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:14:33 np0005476733 nova_compute[192580]: 2025-10-08 16:14:33.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:14:33 np0005476733 nova_compute[192580]: 2025-10-08 16:14:33.608 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:34 np0005476733 systemd-logind[827]: New session 77 of user zuul.
Oct  8 12:14:34 np0005476733 systemd[1]: Started Session 77 of User zuul.
Oct  8 12:14:34 np0005476733 systemd[1]: session-77.scope: Deactivated successfully.
Oct  8 12:14:34 np0005476733 systemd-logind[827]: Session 77 logged out. Waiting for processes to exit.
Oct  8 12:14:34 np0005476733 systemd-logind[827]: Removed session 77.
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.061 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'name': 'tempest-server-test-724242032', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000054', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d7b1c6f132443b0abac8495ed44621d', 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'hostId': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.074 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.075 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aff9fe2-dc38-49f8-93d5-ec4887dc229b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.061922', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21bf776-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': '6b69c9ba63cd6b3d6ea156372a18d2fd3db11f52559015df96bb55c73f70e30b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.061922', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21c03f6-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': '7aa4cbd65f7564640126162a3e3c1be6b0eea4bddec13bca46316c3f09a52355'}]}, 'timestamp': '2025-10-08 16:14:36.075628', '_unique_id': '2ee36644802a43238be8c8c1621d65ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.076 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.080 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.bytes volume: 5432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91f25069-2853-4110-8447-eebd02352971', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5432, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.077917', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e21ce15e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '125bbb58aa6fd987850d0686e263b294a71c05e461d2445a117aaf156ac0544d'}]}, 'timestamp': '2025-10-08 16:14:36.081408', '_unique_id': '05cc6b899d96452a8835da4181b7290b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.082 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.083 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.bytes.delta volume: 5432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62adb8f7-9920-48f5-9270-f1cb616e19fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5432, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.083852', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e21d538c-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': 'ed6fcba59c3c7e73d649733ab344487999e7f781052f0e8a302c47c5520a2d9a'}]}, 'timestamp': '2025-10-08 16:14:36.084275', '_unique_id': '923362ab43c44c89ac337b89c5cc66af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.086 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea25f3c7-a6d4-463d-b7d7-ffbab5a905ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.086299', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e21db16a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': 'fc79bd1730c5abf8e5b1453739e1efb282fd8b051ca187868626fb5338f60b7c'}]}, 'timestamp': '2025-10-08 16:14:36.086666', '_unique_id': 'fdc8fe0db81c4acf92273477d655fdc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.088 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab2f928-c266-412b-b67f-5fbf738c276c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.088833', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e21e1402-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '35c899b67c70be56f67026d6416fb4cfe2dce6ca610ab5b658e9b46c59c5db3b'}]}, 'timestamp': '2025-10-08 16:14:36.089231', '_unique_id': 'beb073c11764423899e65cdc670b1fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.089 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.091 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.usage volume: 30212096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.091 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf17fd39-4a2f-42b5-80c0-7c5bc7ba96b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30212096, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.091186', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21e7050-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': '343abc4cb69e8a9e30ff086d220dfd6bacb0ec5e523c1353d9076fd6ee1c9dca'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.091186', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21e7ca8-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': '6a0eb532d18c59f441bec3a2eebbbd55ca28143dd685694f7f066670b57d7623'}]}, 'timestamp': '2025-10-08 16:14:36.091849', '_unique_id': 'b612dd135e1d42358fbe5efc78313538'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.092 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.114 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.requests volume: 1221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.114 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7028eba-c81b-48fd-a24f-0f9eddf50064', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1221, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.093835', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e221fcf2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': 'e220ad6c442e792e62c16d0b48dd2c8d65ff7f50416a85292c5ed007461c11a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.093835', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2220c10-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '8f901b8adfba14d73e12da1d3473ec6e5f51bd9d3db5d7934c9320e98e4f60e9'}]}, 'timestamp': '2025-10-08 16:14:36.115174', '_unique_id': '5b8c631230ef4237b1a10e1715414c02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.116 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.117 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cefde63-48e6-4d89-b12a-6ff1021ef5c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.117139', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e2226624-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': 'f7c6d6a6994821dedf8461baab0034f1c34ffe05e55416edf310fa1f79a3e115'}]}, 'timestamp': '2025-10-08 16:14:36.117463', '_unique_id': '2bde5cc702a34b4298397a47ceade988'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.132 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/memory.usage volume: 42.27734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe0bedbd-4760-4349-ab39-6527f9565dd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.27734375, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'timestamp': '2025-10-08T16:14:36.119393', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e224c2f2-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.855374468, 'message_signature': '3d796090eb791b58450389e3e0af859865ae5f64d53cd7e5b878f990f071d5f7'}]}, 'timestamp': '2025-10-08 16:14:36.132986', '_unique_id': 'a74ab5dfd5d04364b39505d27083eb0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.134 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.135 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8101cd2-7dbd-4dd9-811a-969175059066', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.134884', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2251978-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': 'b74534b2af1477943ee6480913a8ae8d07dafd089fa70aa234aade62027189b4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.134884', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22523aa-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.784882786, 'message_signature': 'b9f768bbe9665733340e493f685c4d199bbcd2c030f8181664e12c790160190e'}]}, 'timestamp': '2025-10-08 16:14:36.135408', '_unique_id': '05a1ee3686de42da872b51d07bf9ad92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.137 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.requests volume: 49 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.137 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9231603d-1328-4ec8-bac3-11c7ef5e5a0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 49, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.137095', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2257120-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '075cef577a6ba850c8abd337e0d95545c8839035bb433aabede88c25cea0155d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.137095', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2257b2a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': 'c4463d6f8e7617ba43e0ec0ed788e0f2efa07f86920c8eaa356acc5ece471506'}]}, 'timestamp': '2025-10-08 16:14:36.137648', '_unique_id': '05f4d5177d44480fa29605794f470474'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1441ee10-7c03-45db-9d07-1bf6309e8fa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.139129', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e225bf36-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '374400b971823a3a983cab9c853d7210984a61fce130bf4f45d6a6f7a78f2007'}]}, 'timestamp': '2025-10-08 16:14:36.139403', '_unique_id': 'fb523523e9124c63a35488d653aac819'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.140 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.bytes.delta volume: 5397 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40a7036c-d73d-4851-9e5e-b002d436fd34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5397, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.140807', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e2260090-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '5522ccae88b403f791218a3752213ee28f05246034884ca219d62398f51e9a40'}]}, 'timestamp': '2025-10-08 16:14:36.141074', '_unique_id': '8d1d2bb5e6664a81b0def870d7693285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.141 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.142 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.latency volume: 54326056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.142 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fbe920a-627a-4ad2-b379-60783f9e04a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54326056, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.142219', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2263600-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '707c6cc2e4dca5a72b1473c25c514b842cbc02685fed06b6158f3ccd905b7a98'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.142219', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2263d8a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '3258f063657edd8d2c143a8c4777f0ce4c19519ec81e51c578ca7ea4500e4734'}]}, 'timestamp': '2025-10-08 16:14:36.142621', '_unique_id': '8d726d2f5bb2425e8757d2b34822bb31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.143 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56b934dd-d2f3-4f27-a2f1-de3be04c7f11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.144181', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e22682f4-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '4fa8ca394a38e3d0715ab46c82f4d873a0db8bb39c76364e8712f1abc1091a37'}]}, 'timestamp': '2025-10-08 16:14:36.144414', '_unique_id': '7442fc9e2ff14079acebcddb994260a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.145 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eeebc57-8e89-4b7e-8128-ed63f0a1254b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.145502', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e226b63e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '4b1ca96b0f495f028204fe18d92948e6b3203c88b74bf374429bcc4d70b95226'}]}, 'timestamp': '2025-10-08 16:14:36.145724', '_unique_id': '182721be05134e3c8c9c38351efb460a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.146 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.latency volume: 707325499 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.latency volume: 40389422 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8474873-65b0-430d-abcd-f7d4679d9e9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 707325499, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.146787', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e226e852-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '7cec68b4d0384892439a4dbd5574e26b97ebd0800bac8f89c09ec9fce113ec94'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40389422, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.146787', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e226f18a-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '4700ad742003beca77acf59f3b7699124dbf1864a57da250773abdf76ecee35c'}]}, 'timestamp': '2025-10-08 16:14:36.147231', '_unique_id': '2a63c438fc234cf0b4950845f5911112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.148 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/cpu volume: 9780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48a3315f-8037-4e9b-895e-0e27689872cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9780000000, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'timestamp': '2025-10-08T16:14:36.148377', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e2272678-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.855374468, 'message_signature': '388147db8940365ef9d73c636d6d6d5cb5436d082c272bf21d4f78c6e0971ee2'}]}, 'timestamp': '2025-10-08 16:14:36.148591', '_unique_id': '864cf7b47dda4f59b9e6bfa1057b3a85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.149 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a752317e-bf20-4a23-bcea-c6d5628ce6bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.149892', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22761b0-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '155e446ade060c7925dbbd0565d6c26758fd6d5e91f05bd97b4052cd9e50293f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.149892', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2276a3e-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '3518b52b6a4f376f4f777e60a39c161e6f2a229e8c7fa65c5dbaaf02d82d6d25'}]}, 'timestamp': '2025-10-08 16:14:36.150317', '_unique_id': '567e2b603dcf49e7a9ca5efe08afe6c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.151 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.bytes volume: 32131072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.151 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16ec6716-4d46-403c-bf01-34a89b7549b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32131072, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-vda', 'timestamp': '2025-10-08T16:14:36.151402', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2279cac-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': '002231cd2c62976e449418dad05c85ad6b858cb4a93d2686c76bc8aa47efd5e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-sda', 'timestamp': '2025-10-08T16:14:36.151402', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'instance-00000054', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e227a486-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.816827136, 'message_signature': 'fd24d2a5b81d9872859054dd0ec49ae724b269569a7ef7a95af1a34c7b7c3e75'}]}, 'timestamp': '2025-10-08 16:14:36.151808', '_unique_id': '0bc0af0b57514c4599ba5a9216be58f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.152 12 DEBUG ceilometer.compute.pollsters [-] ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/network.incoming.bytes volume: 5507 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5fb59bf-ad60-44c1-9545-35beba6d15ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5507, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000054-ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-tap72980dd9-68', 'timestamp': '2025-10-08T16:14:36.152852', 'resource_metadata': {'display_name': 'tempest-server-test-724242032', 'name': 'tap72980dd9-68', 'instance_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:3c:a8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72980dd9-68'}, 'message_id': 'e227d550-a461-11f0-9274-fa163ef67048', 'monotonic_time': 6979.800898287, 'message_signature': '6e45120e5893136957e6f2097399b7f3dcc01972302ab809ce4cbca38126ae87'}]}, 'timestamp': '2025-10-08 16:14:36.153076', '_unique_id': 'dc60a1c710434904ac7ae2c6cbe703d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:14:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:14:37 np0005476733 podman[254679]: 2025-10-08 16:14:37.259928744 +0000 UTC m=+0.085219876 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Oct  8 12:14:37 np0005476733 podman[254678]: 2025-10-08 16:14:37.273917571 +0000 UTC m=+0.099268195 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:14:37 np0005476733 podman[254677]: 2025-10-08 16:14:37.284769558 +0000 UTC m=+0.110924228 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct  8 12:14:37 np0005476733 nova_compute[192580]: 2025-10-08 16:14:37.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:38 np0005476733 nova_compute[192580]: 2025-10-08 16:14:38.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:41 np0005476733 systemd-logind[827]: New session 78 of user zuul.
Oct  8 12:14:41 np0005476733 systemd[1]: Started Session 78 of User zuul.
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:41 np0005476733 systemd[1]: session-78.scope: Deactivated successfully.
Oct  8 12:14:41 np0005476733 systemd-logind[827]: Session 78 logged out. Waiting for processes to exit.
Oct  8 12:14:41 np0005476733 systemd-logind[827]: Removed session 78.
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.718 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.797 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.798 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.798 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:14:41 np0005476733 nova_compute[192580]: 2025-10-08 16:14:41.799 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.042 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.127 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.128 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.208 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.355 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.356 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13563MB free_disk=111.28550720214844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.357 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.357 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.446 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.447 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.447 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.492 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.510 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.511 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.512 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:14:42 np0005476733 nova_compute[192580]: 2025-10-08 16:14:42.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:43 np0005476733 nova_compute[192580]: 2025-10-08 16:14:43.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:43 np0005476733 nova_compute[192580]: 2025-10-08 16:14:43.381 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:14:44 np0005476733 podman[254782]: 2025-10-08 16:14:44.259197155 +0000 UTC m=+0.073036636 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:14:44 np0005476733 podman[254781]: 2025-10-08 16:14:44.264838615 +0000 UTC m=+0.078819821 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 12:14:47 np0005476733 nova_compute[192580]: 2025-10-08 16:14:47.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:48 np0005476733 nova_compute[192580]: 2025-10-08 16:14:48.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:48 np0005476733 systemd-logind[827]: New session 79 of user zuul.
Oct  8 12:14:48 np0005476733 systemd[1]: Started Session 79 of User zuul.
Oct  8 12:14:49 np0005476733 systemd[1]: session-79.scope: Deactivated successfully.
Oct  8 12:14:49 np0005476733 systemd-logind[827]: Session 79 logged out. Waiting for processes to exit.
Oct  8 12:14:49 np0005476733 systemd-logind[827]: Removed session 79.
Oct  8 12:14:52 np0005476733 nova_compute[192580]: 2025-10-08 16:14:52.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:53 np0005476733 nova_compute[192580]: 2025-10-08 16:14:53.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:54 np0005476733 podman[254855]: 2025-10-08 16:14:54.260835549 +0000 UTC m=+0.075906738 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:14:56 np0005476733 systemd-logind[827]: New session 80 of user zuul.
Oct  8 12:14:56 np0005476733 systemd[1]: Started Session 80 of User zuul.
Oct  8 12:14:56 np0005476733 systemd-logind[827]: Session 80 logged out. Waiting for processes to exit.
Oct  8 12:14:56 np0005476733 systemd[1]: session-80.scope: Deactivated successfully.
Oct  8 12:14:56 np0005476733 systemd-logind[827]: Removed session 80.
Oct  8 12:14:57 np0005476733 nova_compute[192580]: 2025-10-08 16:14:57.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:58 np0005476733 nova_compute[192580]: 2025-10-08 16:14:58.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:59 np0005476733 podman[254908]: 2025-10-08 16:14:59.276408233 +0000 UTC m=+0.090392491 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:14:59 np0005476733 podman[254907]: 2025-10-08 16:14:59.282796607 +0000 UTC m=+0.113179069 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.683 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.684 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.684 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.684 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.684 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.685 2 INFO nova.compute.manager [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Terminating instance#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.686 2 DEBUG nova.compute.manager [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:14:59 np0005476733 kernel: tap72980dd9-68 (unregistering): left promiscuous mode
Oct  8 12:14:59 np0005476733 NetworkManager[51699]: <info>  [1759940099.7161] device (tap72980dd9-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:59 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:59Z|00810|binding|INFO|Releasing lport 72980dd9-6898-4995-b757-fdaf79579051 from this chassis (sb_readonly=0)
Oct  8 12:14:59 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:59Z|00811|binding|INFO|Setting lport 72980dd9-6898-4995-b757-fdaf79579051 down in Southbound
Oct  8 12:14:59 np0005476733 ovn_controller[94857]: 2025-10-08T16:14:59Z|00812|binding|INFO|Removing iface tap72980dd9-68 ovn-installed in OVS
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:14:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:59.744 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:3c:a8 10.100.0.10'], port_security=['fa:16:3e:1f:3c:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c4c5a072-db01-4f8f-8c8a-8e47ffd1595c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7983178b-63e4-47de-abc8-63e5de6fbbea, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=72980dd9-6898-4995-b757-fdaf79579051) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:14:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:59.746 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 72980dd9-6898-4995-b757-fdaf79579051 in datapath 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a unbound from our chassis#033[00m
Oct  8 12:14:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:59.747 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:14:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:59.749 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6d91e90b-8c0f-4582-a44c-6f944c08b622]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:14:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:14:59.749 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a namespace which is not needed anymore#033[00m
Oct  8 12:14:59 np0005476733 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000054.scope: Deactivated successfully.
Oct  8 12:14:59 np0005476733 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000054.scope: Consumed 13.490s CPU time.
Oct  8 12:14:59 np0005476733 systemd-machined[152624]: Machine qemu-51-instance-00000054 terminated.
Oct  8 12:14:59 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [NOTICE]   (254328) : haproxy version is 2.8.14-c23fe91
Oct  8 12:14:59 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [NOTICE]   (254328) : path to executable is /usr/sbin/haproxy
Oct  8 12:14:59 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [WARNING]  (254328) : Exiting Master process...
Oct  8 12:14:59 np0005476733 systemd[1]: libpod-b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13.scope: Deactivated successfully.
Oct  8 12:14:59 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [ALERT]    (254328) : Current worker (254330) exited with code 143 (Terminated)
Oct  8 12:14:59 np0005476733 neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a[254288]: [WARNING]  (254328) : All workers exited. Exiting... (0)
Oct  8 12:14:59 np0005476733 conmon[254288]: conmon b8306c09716a86c3e589 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13.scope/container/memory.events
Oct  8 12:14:59 np0005476733 podman[254979]: 2025-10-08 16:14:59.962371312 +0000 UTC m=+0.082063754 container died b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.974 2 INFO nova.virt.libvirt.driver [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Instance destroyed successfully.#033[00m
Oct  8 12:14:59 np0005476733 nova_compute[192580]: 2025-10-08 16:14:59.974 2 DEBUG nova.objects.instance [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'resources' on Instance uuid ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:14:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13-userdata-shm.mount: Deactivated successfully.
Oct  8 12:14:59 np0005476733 systemd[1]: var-lib-containers-storage-overlay-02a114570f402d8889604f47ad017c557d7f5f537188d3f8d73d66e587fe8ea9-merged.mount: Deactivated successfully.
Oct  8 12:15:00 np0005476733 podman[254979]: 2025-10-08 16:15:00.010817011 +0000 UTC m=+0.130509473 container cleanup b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.020 2 DEBUG nova.virt.libvirt.vif [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:12:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-724242032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-724242032',id=84,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgMtZOVnVeH3bWhrZPBKfXd+ywrgZUihuI2z5HN91rm6b66qXVN5XsBNFSC/a3XnUnD3sHUA86mE5v09Xc1EUgkfz3mw8V02tl2sDq2tzT1z7aRUqvhGDG3xh8qSR2ByQ==',key_name='tempest-keypair-862595394',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:12:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-ly64ptws',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_m
achine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:13:58Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.020 2 DEBUG nova.network.os_vif_util [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "72980dd9-6898-4995-b757-fdaf79579051", "address": "fa:16:3e:1f:3c:a8", "network": {"id": "39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a", "bridge": "br-int", "label": "tempest-test-network--268443214", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72980dd9-68", "ovs_interfaceid": "72980dd9-6898-4995-b757-fdaf79579051", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.021 2 DEBUG nova.network.os_vif_util [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.021 2 DEBUG os_vif [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:15:00 np0005476733 systemd[1]: libpod-conmon-b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13.scope: Deactivated successfully.
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72980dd9-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.031 2 INFO os_vif [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:3c:a8,bridge_name='br-int',has_traffic_filtering=True,id=72980dd9-6898-4995-b757-fdaf79579051,network=Network(39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72980dd9-68')#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.031 2 INFO nova.virt.libvirt.driver [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Deleting instance files /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579_del#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.032 2 INFO nova.virt.libvirt.driver [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Deletion of /var/lib/nova/instances/ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579_del complete#033[00m
Oct  8 12:15:00 np0005476733 podman[255025]: 2025-10-08 16:15:00.083200675 +0000 UTC m=+0.046012431 container remove b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.089 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[54ccfab4-f811-46bc-946b-f102b11d2315]: (4, ('Wed Oct  8 04:14:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a (b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13)\nb8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13\nWed Oct  8 04:15:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a (b8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13)\nb8306c09716a86c3e589bc8bd4969e27f1d013afa8b20683b16704d44a6b3f13\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.091 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b74c47-f6a4-4ea7-947a-681076a75851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.093 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39a1ef3d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:00 np0005476733 kernel: tap39a1ef3d-30: left promiscuous mode
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.111 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b56472-0a49-4a19-8386-1c3659e1720e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.141 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[60ea3abb-ec03-4e7c-9996-8d5d8860e968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.143 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad32278-6215-426f-85a4-5a47f698739f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.164 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[51e903db-7d99-4de8-ab69-bfeabad76e63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 694044, 'reachable_time': 44630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255040, 'error': None, 'target': 'ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.168 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-39a1ef3d-39fb-4d17-8b3f-d6bff6ae557a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:15:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:00.169 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5d28a68f-89bd-4b7a-9285-061a3285624c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:15:00 np0005476733 systemd[1]: run-netns-ovnmeta\x2d39a1ef3d\x2d39fb\x2d4d17\x2d8b3f\x2dd6bff6ae557a.mount: Deactivated successfully.
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.238 2 INFO nova.compute.manager [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.239 2 DEBUG oslo.service.loopingcall [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.239 2 DEBUG nova.compute.manager [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.240 2 DEBUG nova.network.neutron [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.849 2 DEBUG nova.compute.manager [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.850 2 DEBUG oslo_concurrency.lockutils [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.850 2 DEBUG oslo_concurrency.lockutils [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.850 2 DEBUG oslo_concurrency.lockutils [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.850 2 DEBUG nova.compute.manager [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:15:00 np0005476733 nova_compute[192580]: 2025-10-08 16:15:00.850 2 DEBUG nova.compute.manager [req-ae93f7c4-b3fe-423a-9adc-1eff4011a919 req-dae64e99-2468-4144-bc0c-bb21ae22ff46 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-unplugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:15:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:02.178 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:15:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:02.179 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.292 2 DEBUG nova.network.neutron [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.333 2 INFO nova.compute.manager [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Took 2.09 seconds to deallocate network for instance.#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.389 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.389 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.491 2 DEBUG nova.compute.provider_tree [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.518 2 DEBUG nova.scheduler.client.report [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.557 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.603 2 INFO nova.scheduler.client.report [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Deleted allocations for instance ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.759 2 DEBUG oslo_concurrency.lockutils [None req-7fd85ad2-8325-46de-9c2c-bccd9a0c81c4 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.997 2 DEBUG nova.compute.manager [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.997 2 DEBUG oslo_concurrency.lockutils [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.998 2 DEBUG oslo_concurrency.lockutils [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.998 2 DEBUG oslo_concurrency.lockutils [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.998 2 DEBUG nova.compute.manager [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] No waiting events found dispatching network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.998 2 WARNING nova.compute.manager [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received unexpected event network-vif-plugged-72980dd9-6898-4995-b757-fdaf79579051 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:15:02 np0005476733 nova_compute[192580]: 2025-10-08 16:15:02.999 2 DEBUG nova.compute.manager [req-fd759443-ea5f-4c8a-a9ab-59a36cbacb85 req-d05b25e8-6b1f-47c6-b15d-da0f31da427c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Received event network-vif-deleted-72980dd9-6898-4995-b757-fdaf79579051 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:15:03 np0005476733 nova_compute[192580]: 2025-10-08 16:15:03.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:05 np0005476733 nova_compute[192580]: 2025-10-08 16:15:05.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:05 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:05.181 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:08 np0005476733 nova_compute[192580]: 2025-10-08 16:15:08.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:08 np0005476733 podman[255043]: 2025-10-08 16:15:08.239550808 +0000 UTC m=+0.060163845 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm)
Oct  8 12:15:08 np0005476733 podman[255041]: 2025-10-08 16:15:08.265985802 +0000 UTC m=+0.080547496 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:15:08 np0005476733 podman[255042]: 2025-10-08 16:15:08.266750347 +0000 UTC m=+0.080004519 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:15:10 np0005476733 nova_compute[192580]: 2025-10-08 16:15:10.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:13 np0005476733 nova_compute[192580]: 2025-10-08 16:15:13.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:14 np0005476733 nova_compute[192580]: 2025-10-08 16:15:14.970 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940099.9686522, ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:15:14 np0005476733 nova_compute[192580]: 2025-10-08 16:15:14.970 2 INFO nova.compute.manager [-] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:15:15 np0005476733 nova_compute[192580]: 2025-10-08 16:15:15.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:15 np0005476733 nova_compute[192580]: 2025-10-08 16:15:15.106 2 DEBUG nova.compute.manager [None req-769ae139-3b83-44a9-b0b6-bcee1529c24c - - - - - -] [instance: ce1d2a1a-8ddf-4ef1-a09e-3c0190e7a579] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:15:15 np0005476733 podman[255103]: 2025-10-08 16:15:15.22169555 +0000 UTC m=+0.046841978 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:15:15 np0005476733 podman[255102]: 2025-10-08 16:15:15.23294065 +0000 UTC m=+0.058833652 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:15:15 np0005476733 nova_compute[192580]: 2025-10-08 16:15:15.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:18 np0005476733 nova_compute[192580]: 2025-10-08 16:15:18.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:20 np0005476733 nova_compute[192580]: 2025-10-08 16:15:20.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:20 np0005476733 nova_compute[192580]: 2025-10-08 16:15:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:15:21Z|00813|pinctrl|WARN|Dropped 369 log messages in last 56 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:15:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:15:21Z|00814|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:15:23 np0005476733 nova_compute[192580]: 2025-10-08 16:15:23.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:25 np0005476733 nova_compute[192580]: 2025-10-08 16:15:25.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:25 np0005476733 podman[255147]: 2025-10-08 16:15:25.217323311 +0000 UTC m=+0.051594300 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:15:25 np0005476733 nova_compute[192580]: 2025-10-08 16:15:25.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:25 np0005476733 nova_compute[192580]: 2025-10-08 16:15:25.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:26.372 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:26.372 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:15:26.372 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:28 np0005476733 nova_compute[192580]: 2025-10-08 16:15:28.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:29 np0005476733 nova_compute[192580]: 2025-10-08 16:15:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:29 np0005476733 nova_compute[192580]: 2025-10-08 16:15:29.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:15:30 np0005476733 nova_compute[192580]: 2025-10-08 16:15:30.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:30 np0005476733 podman[255166]: 2025-10-08 16:15:30.252636715 +0000 UTC m=+0.076878989 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:15:30 np0005476733 podman[255167]: 2025-10-08 16:15:30.253562074 +0000 UTC m=+0.070768702 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  8 12:15:30 np0005476733 nova_compute[192580]: 2025-10-08 16:15:30.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:30 np0005476733 nova_compute[192580]: 2025-10-08 16:15:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:33 np0005476733 nova_compute[192580]: 2025-10-08 16:15:33.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:33 np0005476733 nova_compute[192580]: 2025-10-08 16:15:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:33 np0005476733 nova_compute[192580]: 2025-10-08 16:15:33.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:15:33 np0005476733 nova_compute[192580]: 2025-10-08 16:15:33.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:15:33 np0005476733 nova_compute[192580]: 2025-10-08 16:15:33.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:15:34 np0005476733 nova_compute[192580]: 2025-10-08 16:15:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:35 np0005476733 nova_compute[192580]: 2025-10-08 16:15:35.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:38 np0005476733 nova_compute[192580]: 2025-10-08 16:15:38.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:39 np0005476733 podman[255215]: 2025-10-08 16:15:39.228465495 +0000 UTC m=+0.056429345 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:15:39 np0005476733 podman[255214]: 2025-10-08 16:15:39.232859125 +0000 UTC m=+0.066079013 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:15:39 np0005476733 podman[255216]: 2025-10-08 16:15:39.242770562 +0000 UTC m=+0.070166614 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Oct  8 12:15:40 np0005476733 nova_compute[192580]: 2025-10-08 16:15:40.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.774 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.775 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13747MB free_disk=111.31517028808594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.775 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.775 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.868 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.869 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.889 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.909 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.934 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:15:41 np0005476733 nova_compute[192580]: 2025-10-08 16:15:41.935 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:43 np0005476733 nova_compute[192580]: 2025-10-08 16:15:43.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:43 np0005476733 nova_compute[192580]: 2025-10-08 16:15:43.936 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:45 np0005476733 nova_compute[192580]: 2025-10-08 16:15:45.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:46 np0005476733 podman[255278]: 2025-10-08 16:15:46.231292099 +0000 UTC m=+0.058171940 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:15:46 np0005476733 podman[255277]: 2025-10-08 16:15:46.248990984 +0000 UTC m=+0.072523227 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:15:48 np0005476733 nova_compute[192580]: 2025-10-08 16:15:48.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:50 np0005476733 nova_compute[192580]: 2025-10-08 16:15:50.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:51 np0005476733 nova_compute[192580]: 2025-10-08 16:15:51.856 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:51 np0005476733 nova_compute[192580]: 2025-10-08 16:15:51.856 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:51 np0005476733 nova_compute[192580]: 2025-10-08 16:15:51.901 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:15:51 np0005476733 nova_compute[192580]: 2025-10-08 16:15:51.996 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:51 np0005476733 nova_compute[192580]: 2025-10-08 16:15:51.997 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.004 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.005 2 INFO nova.compute.claims [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.161 2 DEBUG nova.compute.provider_tree [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.184 2 DEBUG nova.scheduler.client.report [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.218 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.219 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.288 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.288 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.313 2 INFO nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.336 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.444 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.446 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.446 2 INFO nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Creating image(s)#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.447 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.447 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.448 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.461 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.518 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
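Each `qemu-img info` call above is wrapped in `oslo_concurrency.prlimit --as=1073741824 --cpu=30`, which caps the child's address space and CPU time so a malformed image cannot make qemu-img exhaust the host. A rough stdlib-only sketch of the same idea (not oslo's actual implementation), using `resource` limits applied in the child before exec:

```python
import resource
import subprocess
import sys

def run_with_limits(cmd, as_bytes, cpu_seconds):
    """Roughly what oslo_concurrency.prlimit does: cap the child's
    address space (RLIMIT_AS) and CPU time (RLIMIT_CPU) between
    fork and exec, so a runaway subprocess is killed by the kernel
    instead of starving the compute host."""
    def set_limits():
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, text=True)

# The child reports its own RLIMIT_CPU soft limit to show the cap
# took effect; 1 << 30 and 30 match the values in the log.
result = run_with_limits(
    [sys.executable, "-c",
     "import resource; print(resource.getrlimit(resource.RLIMIT_CPU)[0])"],
    as_bytes=1 << 30, cpu_seconds=30)
print(result.stdout.strip())
```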
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.519 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.520 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.536 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.590 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.592 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.645 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
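The `qemu-img create` above builds the instance disk as a copy-on-write qcow2 overlay on the shared `_base` image, so every instance booted from the same Glance image reuses one read-only base file. A sketch that assembles the same argv the log shows (helper name is illustrative):

```python
def qcow2_overlay_cmd(base, overlay, size_bytes):
    """Assemble the qemu-img invocation from the log: a qcow2
    overlay whose backing file is the cached raw base image,
    created copy-on-write at the flavor's root disk size."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        overlay, str(size_bytes),
    ]

cmd = qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493",
    "/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk",
    1073741824)
print(" ".join(cmd))
```

Note the 1073741824 (1 GiB) size comes from the flavor's `root_gb=1`, visible in the flavor dump later in this log.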
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.646 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.647 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.704 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.705 2 DEBUG nova.virt.disk.api [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Checking if we can resize image /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.706 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.768 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.769 2 DEBUG nova.virt.disk.api [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Cannot resize image /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
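The "Cannot resize image ... to a smaller size" line is `can_resize_image` declining the resize, not an error: a qcow2 disk can be grown in place but never shrunk, because data past the new end would be lost. A hedged sketch of that guard (not nova's exact code):

```python
def can_resize_image(virtual_size, new_size):
    """Sketch of the check behind the log message: growing an image
    in place is allowed; shrinking below the current virtual size
    is refused and the disk is left as-is."""
    if new_size < virtual_size:
        print("Cannot resize image to a smaller size.")
        return False
    return True

GiB = 1 << 30
print(can_resize_image(2 * GiB, GiB))      # shrink: refused
print(can_resize_image(GiB, 2 * GiB))      # grow: allowed
```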
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.769 2 DEBUG nova.objects.instance [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid 8c079d33-9c46-438f-944a-8132a7cfcfb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.787 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.788 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Ensure instance console log exists: /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.788 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.788 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:52 np0005476733 nova_compute[192580]: 2025-10-08 16:15:52.788 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
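The acquire/held/released triples throughout this log come from `oslo_concurrency.lockutils`, which serializes access to named resources ("compute_resources", "vgpu_resources", per-image hashes) and reports how long each caller waited and held the lock. A loose stdlib-only sketch of that pattern (the decorator is illustrative, not oslo's implementation):

```python
import threading
import time
from functools import wraps

_locks = {}

def synchronized(name):
    """Loose sketch of lockutils.synchronized: one named lock per
    resource, with the waited/held timings that appear in the log."""
    lock = _locks.setdefault(name, threading.Lock())
    def decorator(fn):
        @wraps(fn)
        def inner(*args, **kwargs):
            t0 = time.monotonic()
            with lock:
                waited = time.monotonic() - t0
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f'Lock "{name}" "released" :: '
                          f'waited {waited:.3f}s held {held:.3f}s')
        return inner
    return decorator

@synchronized("vgpu_resources")
def allocate_mdevs():
    # No vGPUs requested by this flavor, so nothing to allocate --
    # which is why the log shows the lock held for 0.000s.
    return []

allocate_mdevs()
```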
Oct  8 12:15:53 np0005476733 nova_compute[192580]: 2025-10-08 16:15:53.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:53 np0005476733 nova_compute[192580]: 2025-10-08 16:15:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:15:53 np0005476733 nova_compute[192580]: 2025-10-08 16:15:53.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:15:55 np0005476733 nova_compute[192580]: 2025-10-08 16:15:55.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:55 np0005476733 nova_compute[192580]: 2025-10-08 16:15:55.138 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Successfully created port: fa1d0632-ef78-4a84-a89e-2efa740540a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:15:56 np0005476733 podman[255337]: 2025-10-08 16:15:56.217476217 +0000 UTC m=+0.047252151 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.439 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Successfully updated port: fa1d0632-ef78-4a84-a89e-2efa740540a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.463 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.463 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.463 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.578 2 DEBUG nova.compute.manager [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.578 2 DEBUG nova.compute.manager [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing instance network info cache due to event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:15:57 np0005476733 nova_compute[192580]: 2025-10-08 16:15:57.579 2 DEBUG oslo_concurrency.lockutils [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:15:58 np0005476733 nova_compute[192580]: 2025-10-08 16:15:58.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:58 np0005476733 nova_compute[192580]: 2025-10-08 16:15:58.361 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.599 2 DEBUG nova.network.neutron [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating instance_info_cache with network_info: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.653 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.653 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Instance network_info: |[{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
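The `network_info` blob above is nova's VIF model for port fa1d0632: one OVN-bound OVS port with a fixed IPv4 address and a tunnel-reduced MTU. A small sketch that pulls the commonly needed fields out of a trimmed copy of that structure (fields not used here are omitted):

```python
import json

# Trimmed copy of the network_info entry from the log; the nesting
# matches nova's VIF model (list of VIFs, each with network/subnets/ips).
network_info = json.loads("""
[{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4",
  "address": "fa:16:3e:42:82:a7",
  "network": {"subnets": [{"cidr": "10.100.0.16/28",
      "ips": [{"address": "10.100.0.27", "type": "fixed"}]}],
    "meta": {"mtu": 1342}},
  "devname": "tapfa1d0632-ef",
  "vnic_type": "normal"}]
""")

vif = network_info[0]
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
print(vif["address"], fixed_ips, vif["network"]["meta"]["mtu"])
```

The MTU of 1342 reflects the Geneve tunnel overhead on this deployment ("tunneled": true in the log).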
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.654 2 DEBUG oslo_concurrency.lockutils [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.654 2 DEBUG nova.network.neutron [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.656 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Start _get_guest_xml network_info=[{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.660 2 WARNING nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.665 2 DEBUG nova.virt.libvirt.host [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.665 2 DEBUG nova.virt.libvirt.host [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.668 2 DEBUG nova.virt.libvirt.host [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.668 2 DEBUG nova.virt.libvirt.host [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.669 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.669 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.669 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.669 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.670 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.670 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.670 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.670 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.670 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.671 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.671 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.671 2 DEBUG nova.virt.hardware [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.675 2 DEBUG nova.virt.libvirt.vif [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1599350823',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1599350823',id=86,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-tpuvmdiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:15:52Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=8c079d33-9c46-438f-944a-8132a7cfcfb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.676 2 DEBUG nova.network.os_vif_util [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.676 2 DEBUG nova.network.os_vif_util [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.677 2 DEBUG nova.objects.instance [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c079d33-9c46-438f-944a-8132a7cfcfb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.712 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <uuid>8c079d33-9c46-438f-944a-8132a7cfcfb8</uuid>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <name>instance-00000056</name>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1599350823</nova:name>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:15:59</nova:creationTime>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:user uuid="81b62a8f3edf4f78aeb0b087fd79ebb7">tempest-OvnDvrTest-313060968-project-admin</nova:user>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:project uuid="9d7b1c6f132443b0abac8495ed44621d">tempest-OvnDvrTest-313060968</nova:project>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        <nova:port uuid="fa1d0632-ef78-4a84-a89e-2efa740540a4">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="serial">8c079d33-9c46-438f-944a-8132a7cfcfb8</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="uuid">8c079d33-9c46-438f-944a-8132a7cfcfb8</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.config"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:42:82:a7"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <target dev="tapfa1d0632-ef"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/console.log" append="off"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:15:59 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:15:59 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:15:59 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:15:59 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.714 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Preparing to wait for external event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.714 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.714 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.714 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.715 2 DEBUG nova.virt.libvirt.vif [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-1599350823',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1599350823',id=86,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-tpuvmdiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:15:52Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=8c079d33-9c46-438f-944a-8132a7cfcfb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.715 2 DEBUG nova.network.os_vif_util [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.716 2 DEBUG nova.network.os_vif_util [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.716 2 DEBUG os_vif [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.720 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa1d0632-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa1d0632-ef, col_values=(('external_ids', {'iface-id': 'fa1d0632-ef78-4a84-a89e-2efa740540a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:82:a7', 'vm-uuid': '8c079d33-9c46-438f-944a-8132a7cfcfb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:59 np0005476733 NetworkManager[51699]: <info>  [1759940159.7228] manager: (tapfa1d0632-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.728 2 INFO os_vif [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef')#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.792 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.792 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.792 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No VIF found with MAC fa:16:3e:42:82:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:15:59 np0005476733 nova_compute[192580]: 2025-10-08 16:15:59.793 2 INFO nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Using config drive#033[00m
Oct  8 12:16:01 np0005476733 podman[255363]: 2025-10-08 16:16:01.231698897 +0000 UTC m=+0.058883233 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:16:01 np0005476733 podman[255362]: 2025-10-08 16:16:01.285659283 +0000 UTC m=+0.116673961 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.410 2 INFO nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Creating config drive at /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.config#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.414 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3p8pb2w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.541 2 DEBUG oslo_concurrency.processutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz3p8pb2w" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:01 np0005476733 kernel: tapfa1d0632-ef: entered promiscuous mode
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.5910] manager: (tapfa1d0632-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/266)
Oct  8 12:16:01 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:01Z|00815|binding|INFO|Claiming lport fa1d0632-ef78-4a84-a89e-2efa740540a4 for this chassis.
Oct  8 12:16:01 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:01Z|00816|binding|INFO|fa1d0632-ef78-4a84-a89e-2efa740540a4: Claiming fa:16:3e:42:82:a7 10.100.0.27
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.597 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:82:a7 10.100.0.27'], port_security=['fa:16:3e:42:82:a7 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6563b284-b73c-434a-b0ec-7119c8f265ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67e260af-4f33-4a56-b3c9-23ccc3032d2a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fa1d0632-ef78-4a84-a89e-2efa740540a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.598 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fa1d0632-ef78-4a84-a89e-2efa740540a4 in datapath a60d5119-22ed-4506-b21f-c7850a67e1ca bound to our chassis#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.599 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a60d5119-22ed-4506-b21f-c7850a67e1ca#033[00m
Oct  8 12:16:01 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:01Z|00817|binding|INFO|Setting lport fa1d0632-ef78-4a84-a89e-2efa740540a4 ovn-installed in OVS
Oct  8 12:16:01 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:01Z|00818|binding|INFO|Setting lport fa1d0632-ef78-4a84-a89e-2efa740540a4 up in Southbound
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.612 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4fe1e1-9210-4f76-a184-9f1a8e840a14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.612 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa60d5119-21 in ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.614 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa60d5119-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.614 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85d7ed95-7403-4b97-92f8-73ee89a23e48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.615 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a23d27db-afec-4442-bb10-36382c743629]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 systemd-udevd[255424]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.626 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf20a66-8e0d-4adf-b592-43c227f99487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 systemd-machined[152624]: New machine qemu-52-instance-00000056.
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.6405] device (tapfa1d0632-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.6418] device (tapfa1d0632-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:16:01 np0005476733 systemd[1]: Started Virtual Machine qemu-52-instance-00000056.
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.653 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a36e625-1a1d-4d03-ab39-f60bd28dbf2c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.679 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[06ea4089-888a-421b-9f69-0ff968558216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.683 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e518f02d-1d73-48de-b55e-2a93dd2d6d3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 systemd-udevd[255427]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.6863] manager: (tapa60d5119-20): new Veth device (/org/freedesktop/NetworkManager/Devices/267)
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.716 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[82311f99-d61d-4c2f-83ae-f2e6695d151e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.719 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[703a2a96-09fc-42ac-822b-f90db6d94a08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.7411] device (tapa60d5119-20): carrier: link connected
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.748 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7bed9a-fad4-423e-b404-9b1eb82cfd7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.764 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[905335e5-fd70-4077-b4fb-69ebc192f4f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa60d5119-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:5a:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706540, 'reachable_time': 44716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255455, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.778 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4e3918-3a9d-4297-914d-f9194fc5acda]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:5af4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706540, 'tstamp': 706540}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255456, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.794 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b0a694-7853-4971-ad34-508017222476]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa60d5119-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:5a:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706540, 'reachable_time': 44716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255457, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.823 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d45ea005-70f4-467a-82eb-b444650262a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.889 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7348f2-0aa4-4d9e-876b-059e132a08a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.891 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa60d5119-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.891 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.891 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa60d5119-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:01 np0005476733 NetworkManager[51699]: <info>  [1759940161.8941] manager: (tapa60d5119-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.894 2 DEBUG nova.compute.manager [req-54b0b30a-a6e5-424c-a401-1e090bda04b5 req-04518a56-23b9-4647-bfc5-c5d5b2b9db5a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.894 2 DEBUG oslo_concurrency.lockutils [req-54b0b30a-a6e5-424c-a401-1e090bda04b5 req-04518a56-23b9-4647-bfc5-c5d5b2b9db5a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:01 np0005476733 kernel: tapa60d5119-20: entered promiscuous mode
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.895 2 DEBUG oslo_concurrency.lockutils [req-54b0b30a-a6e5-424c-a401-1e090bda04b5 req-04518a56-23b9-4647-bfc5-c5d5b2b9db5a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.895 2 DEBUG oslo_concurrency.lockutils [req-54b0b30a-a6e5-424c-a401-1e090bda04b5 req-04518a56-23b9-4647-bfc5-c5d5b2b9db5a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.895 2 DEBUG nova.compute.manager [req-54b0b30a-a6e5-424c-a401-1e090bda04b5 req-04518a56-23b9-4647-bfc5-c5d5b2b9db5a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Processing event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.898 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa60d5119-20, col_values=(('external_ids', {'iface-id': '6d5cee0f-29d4-4b28-ba94-d661be87caac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:01Z|00819|binding|INFO|Releasing lport 6d5cee0f-29d4-4b28-ba94-d661be87caac from this chassis (sb_readonly=0)
Oct  8 12:16:01 np0005476733 nova_compute[192580]: 2025-10-08 16:16:01.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.923 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a60d5119-22ed-4506-b21f-c7850a67e1ca.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a60d5119-22ed-4506-b21f-c7850a67e1ca.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.925 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0f71d239-a881-4fd9-8518-9541bae46374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.926 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-a60d5119-22ed-4506-b21f-c7850a67e1ca
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/a60d5119-22ed-4506-b21f-c7850a67e1ca.pid.haproxy
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID a60d5119-22ed-4506-b21f-c7850a67e1ca
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:16:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:01.930 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'env', 'PROCESS_TAG=haproxy-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a60d5119-22ed-4506-b21f-c7850a67e1ca.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:16:02 np0005476733 podman[255496]: 2025-10-08 16:16:02.294125272 +0000 UTC m=+0.049501404 container create d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:16:02 np0005476733 systemd[1]: Started libpod-conmon-d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879.scope.
Oct  8 12:16:02 np0005476733 podman[255496]: 2025-10-08 16:16:02.269496704 +0000 UTC m=+0.024872626 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:16:02 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:16:02 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/139b1260890aa9c749e4be18ca87e2503f21de7326d2843f59d1a51473eccd5b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:16:02 np0005476733 podman[255496]: 2025-10-08 16:16:02.383742947 +0000 UTC m=+0.139118889 container init d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 12:16:02 np0005476733 podman[255496]: 2025-10-08 16:16:02.392057733 +0000 UTC m=+0.147433645 container start d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.398 2 DEBUG nova.network.neutron [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updated VIF entry in instance network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.398 2 DEBUG nova.network.neutron [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating instance_info_cache with network_info: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:16:02 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [NOTICE]   (255515) : New worker (255517) forked
Oct  8 12:16:02 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [NOTICE]   (255515) : Loading success.
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.427 2 DEBUG oslo_concurrency.lockutils [req-136358c6-2560-4822-9160-18eee32cbc75 req-6809dd5f-515f-437b-ab8b-9be481048a89 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.429 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940162.4285753, 8c079d33-9c46-438f-944a-8132a7cfcfb8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.429 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] VM Started (Lifecycle Event)#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.431 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.433 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.443 2 INFO nova.virt.libvirt.driver [-] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Instance spawned successfully.#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.443 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.452 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:02.454 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:02.455 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.457 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.469 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.469 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.470 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.470 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.470 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.471 2 DEBUG nova.virt.libvirt.driver [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.476 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.476 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940162.4292696, 8c079d33-9c46-438f-944a-8132a7cfcfb8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.476 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.504 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.507 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940162.4329395, 8c079d33-9c46-438f-944a-8132a7cfcfb8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.508 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.539 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.543 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.546 2 INFO nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Took 10.10 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.547 2 DEBUG nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.578 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.623 2 INFO nova.compute.manager [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Took 10.66 seconds to build instance.#033[00m
Oct  8 12:16:02 np0005476733 nova_compute[192580]: 2025-10-08 16:16:02.648 2 DEBUG oslo_concurrency.lockutils [None req-6f74f2db-c4e8-400c-a2b7-c43b2eec6ade 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:03 np0005476733 nova_compute[192580]: 2025-10-08 16:16:03.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:03 np0005476733 nova_compute[192580]: 2025-10-08 16:16:03.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.089 2 DEBUG nova.compute.manager [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.089 2 DEBUG oslo_concurrency.lockutils [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.089 2 DEBUG oslo_concurrency.lockutils [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.090 2 DEBUG oslo_concurrency.lockutils [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.090 2 DEBUG nova.compute.manager [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.090 2 WARNING nova.compute.manager [req-59a5ce16-6222-41be-9ab0-9ea4d202e149 req-62374659-76f0-4eb6-8409-6999e6dcaf04 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:16:04 np0005476733 nova_compute[192580]: 2025-10-08 16:16:04.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:06.458 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:07 np0005476733 nova_compute[192580]: 2025-10-08 16:16:07.209 2 DEBUG nova.compute.manager [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:16:07 np0005476733 nova_compute[192580]: 2025-10-08 16:16:07.211 2 DEBUG nova.compute.manager [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing instance network info cache due to event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:16:07 np0005476733 nova_compute[192580]: 2025-10-08 16:16:07.211 2 DEBUG oslo_concurrency.lockutils [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:16:07 np0005476733 nova_compute[192580]: 2025-10-08 16:16:07.212 2 DEBUG oslo_concurrency.lockutils [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:16:07 np0005476733 nova_compute[192580]: 2025-10-08 16:16:07.212 2 DEBUG nova.network.neutron [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:16:08 np0005476733 nova_compute[192580]: 2025-10-08 16:16:08.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:09 np0005476733 systemd-logind[827]: New session 81 of user zuul.
Oct  8 12:16:09 np0005476733 systemd[1]: Started Session 81 of User zuul.
Oct  8 12:16:09 np0005476733 podman[255530]: 2025-10-08 16:16:09.488993916 +0000 UTC m=+0.064575725 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:16:09 np0005476733 podman[255528]: 2025-10-08 16:16:09.510193494 +0000 UTC m=+0.085313658 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd)
Oct  8 12:16:09 np0005476733 podman[255531]: 2025-10-08 16:16:09.527414904 +0000 UTC m=+0.094589104 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:16:09 np0005476733 systemd-logind[827]: New session 82 of user zuul.
Oct  8 12:16:09 np0005476733 systemd[1]: Started Session 82 of User zuul.
Oct  8 12:16:09 np0005476733 nova_compute[192580]: 2025-10-08 16:16:09.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:09 np0005476733 systemd[1]: session-82.scope: Deactivated successfully.
Oct  8 12:16:09 np0005476733 systemd-logind[827]: Session 82 logged out. Waiting for processes to exit.
Oct  8 12:16:09 np0005476733 systemd-logind[827]: Removed session 82.
Oct  8 12:16:10 np0005476733 nova_compute[192580]: 2025-10-08 16:16:10.387 2 DEBUG nova.network.neutron [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updated VIF entry in instance network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:16:10 np0005476733 nova_compute[192580]: 2025-10-08 16:16:10.388 2 DEBUG nova.network.neutron [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating instance_info_cache with network_info: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:16:10 np0005476733 nova_compute[192580]: 2025-10-08 16:16:10.423 2 DEBUG oslo_concurrency.lockutils [req-b6ac9892-de17-4878-aed2-781c55a6a904 req-ba7a5912-d286-4b8c-81c3-1a560a9f80fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:16:13 np0005476733 nova_compute[192580]: 2025-10-08 16:16:13.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:14 np0005476733 nova_compute[192580]: 2025-10-08 16:16:14.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:14 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:14Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:82:a7 10.100.0.27
Oct  8 12:16:14 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:14Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:82:a7 10.100.0.27
Oct  8 12:16:17 np0005476733 podman[255663]: 2025-10-08 16:16:17.227123478 +0000 UTC m=+0.050611039 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:16:17 np0005476733 podman[255664]: 2025-10-08 16:16:17.237393636 +0000 UTC m=+0.058464260 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:16:18 np0005476733 nova_compute[192580]: 2025-10-08 16:16:18.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:18 np0005476733 systemd-logind[827]: New session 83 of user zuul.
Oct  8 12:16:18 np0005476733 systemd[1]: Started Session 83 of User zuul.
Oct  8 12:16:19 np0005476733 nova_compute[192580]: 2025-10-08 16:16:19.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:19 np0005476733 systemd-logind[827]: New session 84 of user zuul.
Oct  8 12:16:19 np0005476733 systemd[1]: Started Session 84 of User zuul.
Oct  8 12:16:19 np0005476733 systemd-logind[827]: New session 85 of user zuul.
Oct  8 12:16:19 np0005476733 systemd[1]: Started Session 85 of User zuul.
Oct  8 12:16:20 np0005476733 systemd[1]: session-85.scope: Deactivated successfully.
Oct  8 12:16:20 np0005476733 systemd-logind[827]: Session 85 logged out. Waiting for processes to exit.
Oct  8 12:16:20 np0005476733 systemd-logind[827]: Removed session 85.
Oct  8 12:16:21 np0005476733 nova_compute[192580]: 2025-10-08 16:16:21.839 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:23 np0005476733 nova_compute[192580]: 2025-10-08 16:16:23.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:24 np0005476733 nova_compute[192580]: 2025-10-08 16:16:24.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:25 np0005476733 nova_compute[192580]: 2025-10-08 16:16:25.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:25 np0005476733 nova_compute[192580]: 2025-10-08 16:16:25.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:16:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:25Z|00820|pinctrl|WARN|Dropped 495 log messages in last 64 seconds (most recently, 13 seconds ago) due to excessive rate
Oct  8 12:16:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:25Z|00821|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:16:25 np0005476733 nova_compute[192580]: 2025-10-08 16:16:25.615 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:26.373 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:26.374 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:26.375 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:26 np0005476733 nova_compute[192580]: 2025-10-08 16:16:26.615 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:27 np0005476733 podman[255796]: 2025-10-08 16:16:27.265381942 +0000 UTC m=+0.083456988 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:16:28 np0005476733 nova_compute[192580]: 2025-10-08 16:16:28.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:29 np0005476733 systemd-logind[827]: New session 86 of user zuul.
Oct  8 12:16:29 np0005476733 systemd[1]: Started Session 86 of User zuul.
Oct  8 12:16:29 np0005476733 nova_compute[192580]: 2025-10-08 16:16:29.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:30 np0005476733 systemd-logind[827]: New session 87 of user zuul.
Oct  8 12:16:30 np0005476733 systemd[1]: Started Session 87 of User zuul.
Oct  8 12:16:30 np0005476733 systemd-logind[827]: New session 88 of user zuul.
Oct  8 12:16:30 np0005476733 systemd[1]: Started Session 88 of User zuul.
Oct  8 12:16:30 np0005476733 nova_compute[192580]: 2025-10-08 16:16:30.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:30 np0005476733 systemd[1]: session-88.scope: Deactivated successfully.
Oct  8 12:16:30 np0005476733 systemd-logind[827]: Session 88 logged out. Waiting for processes to exit.
Oct  8 12:16:30 np0005476733 systemd-logind[827]: Removed session 88.
Oct  8 12:16:31 np0005476733 nova_compute[192580]: 2025-10-08 16:16:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:31 np0005476733 nova_compute[192580]: 2025-10-08 16:16:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:31 np0005476733 nova_compute[192580]: 2025-10-08 16:16:31.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:16:32 np0005476733 podman[255906]: 2025-10-08 16:16:32.229212333 +0000 UTC m=+0.060223717 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 12:16:32 np0005476733 podman[255905]: 2025-10-08 16:16:32.256006319 +0000 UTC m=+0.088745628 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:16:33 np0005476733 nova_compute[192580]: 2025-10-08 16:16:33.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:33 np0005476733 nova_compute[192580]: 2025-10-08 16:16:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:33 np0005476733 nova_compute[192580]: 2025-10-08 16:16:33.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:16:33 np0005476733 nova_compute[192580]: 2025-10-08 16:16:33.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:16:34 np0005476733 nova_compute[192580]: 2025-10-08 16:16:34.745 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:16:34 np0005476733 nova_compute[192580]: 2025-10-08 16:16:34.746 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:16:34 np0005476733 nova_compute[192580]: 2025-10-08 16:16:34.746 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:16:34 np0005476733 nova_compute[192580]: 2025-10-08 16:16:34.746 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8c079d33-9c46-438f-944a-8132a7cfcfb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:16:34 np0005476733 nova_compute[192580]: 2025-10-08 16:16:34.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:35 np0005476733 nova_compute[192580]: 2025-10-08 16:16:35.947 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating instance_info_cache with network_info: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:16:35 np0005476733 nova_compute[192580]: 2025-10-08 16:16:35.965 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:16:35 np0005476733 nova_compute[192580]: 2025-10-08 16:16:35.965 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.062 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'name': 'tempest-server-test-1599350823', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000056', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d7b1c6f132443b0abac8495ed44621d', 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'hostId': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.068 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8c079d33-9c46-438f-944a-8132a7cfcfb8 / tapfa1d0632-ef inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.068 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.incoming.packets volume: 36 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b58e1a4-6a8b-45d0-8dfb-266cf557114b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 36, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.063863', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29a19aa6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '8eeaed11ee716bea85729774920a7d38f363a6ba5bc94d734a02da31d4ba17cf'}]}, 'timestamp': '2025-10-08 16:16:36.069512', '_unique_id': '43b70100cbd344e6843e823a76a67b06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.071 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.099 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.100 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de17665f-e58b-43ba-82e2-3a29c5f968c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.072558', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29a65a8c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '4808cec7657cb372989229825c4fb3a828ff874079b7011af563f0a8e92c3481'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.072558', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29a671fc-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'fc9282d9ad3bad0b53d884f89afc40359ef87ea2c4cb623096a9a448372ebeda'}]}, 'timestamp': '2025-10-08 16:16:36.101226', '_unique_id': 'ae793db6b016484b825e601c67cbcc59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.104 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2de81abf-9d6d-469b-bf9f-1d4214906fac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.104232', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29a6fcee-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': 'c198d05a6ff90f4cca2e60802f9af97aa69ec3be220886127123f9f5404a10de'}]}, 'timestamp': '2025-10-08 16:16:36.104759', '_unique_id': '6098b879487d45109cd43e375e50929d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.105 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.107 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.bytes volume: 72982528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.107 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0061d88-7cb2-419b-a4be-5251034d4512', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72982528, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.107254', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29a7716a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'f3724775762ee57b615aed38eeae3c184bc82ff3ee11c4fcd31ea08e45d7d7c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.107254', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29a782e0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'b129bb5ed4e81faf7c818863596896779e7acef58a603d34332f1d31327bc31f'}]}, 'timestamp': '2025-10-08 16:16:36.108194', '_unique_id': '1f3dc6cdeb96451aa76738131c486bc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.126 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.127 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ef92109-37fa-4955-accd-e91cade4ae59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.110584', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29aa66ea-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': '8f124825f20b76e3e4268b515948490123fae73df8031fa5722772d94ba752a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.110584', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29aa7b62-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': 'b16d8d360d1be200fcfe074f3ba4b1e2db4ee4c054547cdcf73f262026672011'}]}, 'timestamp': '2025-10-08 16:16:36.127633', '_unique_id': 'b6cda48b1e1040898042a91e0a8583e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.130 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.130 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae5b39df-78d3-4fd8-a5d6-09109476f88c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.130412', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29aafa10-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': 'bed493be89e71451bb4b71488a1f127d3fea8b3cd87ec345198f55d1b4fabf3d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.130412', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29ab0c94-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': '9b3299f96218ee5870713770e1e72284b5c5853648773255065eb51dfc1606c0'}]}, 'timestamp': '2025-10-08 16:16:36.131344', '_unique_id': '37ecbdaec395483c9d30637d88bff909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.133 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.161 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/cpu volume: 10600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '746144ca-e2d9-4395-842e-6de791e38efb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10600000000, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'timestamp': '2025-10-08T16:16:36.133758', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '29afc7d4-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.884439916, 'message_signature': 'c4ed68a6058d5878d1f0c07eb94792f1c222b0e161d435f985ae1351d98dc73a'}]}, 'timestamp': '2025-10-08 16:16:36.162458', '_unique_id': 'e57af91df9134789b9ecf97c41206ed0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1599350823>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1599350823>]
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.166 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.latency volume: 535416166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.166 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.latency volume: 44897409 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f597fd13-931d-43f8-b41e-e69af5966787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 535416166, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.166401', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29b0786e-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '7ce254c142054c12a6d2a89d1594755a087898984cfc6a4009236c30eb1b1fa2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44897409, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.166401', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29b08d9a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'f565ce3f28de3f2233e7fabdc894c474e3be06eabd6774b375f5131f45a62cd8'}]}, 'timestamp': '2025-10-08 16:16:36.167427', '_unique_id': 'c147fbe5e68c428f92b497f70fb5609e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.170 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '264bb424-559b-4170-82de-cb5921d92749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.170269', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b10fa4-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': 'bb4635e6ec43c1e803b2031e7b06818d7ce20237582f3814ee4ffb7538d383b6'}]}, 'timestamp': '2025-10-08 16:16:36.170944', '_unique_id': '3230b0259ab648b89c46f33bfb935064'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.173 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1599350823>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1599350823>]
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.174 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.latency volume: 1960704875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.175 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35263bf6-5f37-43c6-9d26-aad421448f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1960704875, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.174633', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29b1ba58-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'bb1f37abf0bb0fbd45792fcf669610bbd3f9f7c744565fcfee61bc05bd7ff576'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.174633', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29b1cd7c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': 'ee122688a69f7a1cc275cbcbee05f8a52a57234832c60b0b2072c771d86bd37e'}]}, 'timestamp': '2025-10-08 16:16:36.175603', '_unique_id': '359ade8e48884fab93ec757a638d3213'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.178 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.178 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1599350823>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1599350823>]
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.179 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.incoming.bytes volume: 5343 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c114b59-9ecc-4ce8-8da4-bab72f052367', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5343, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.179077', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b26930-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '14cee25447582003b613d932a90780d7779f48bcf65517122d7039090d29cd63'}]}, 'timestamp': '2025-10-08 16:16:36.179617', '_unique_id': 'cecb8e311f9948d38ffee3f8d590fd9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.182 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8a759a-a051-44d2-9395-416f04085436', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.182206', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b2e432-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '55f25b90aa8451055ed3330b28b3121534ef4e4c51af8a27703c8c0b4843581f'}]}, 'timestamp': '2025-10-08 16:16:36.182765', '_unique_id': '1ede838f80404ecb9d09f45ca19cd4f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.185 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.185 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '743c75a0-3479-4e31-8bc4-db76443a954f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.185327', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29b35ba6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '16189ef929bcf59a8916bd6b520de976662816c2537589c47aee49f7302f3579'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.185327', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29b376ae-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '3bdb33e7cb6c39c3cdd2f80798363d7023ccf93527e42bc18e12e195ebae5cdc'}]}, 'timestamp': '2025-10-08 16:16:36.186430', '_unique_id': '62fdb3efefd0462f94e34913f03a46b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.188 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.outgoing.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb8ddadf-d517-4799-a46b-3d174b3ba4c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.188070', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b3c474-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '2df250252cdcff6dac31ba77728fc8e54facc2986a0009798dee2621413df1b4'}]}, 'timestamp': '2025-10-08 16:16:36.188431', '_unique_id': '3558f61cd26d41e185dfc66c88ef5e77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.190 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a9f5a5e-7a84-494a-a5fa-ba3e9b9d0e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.190152', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b414a6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '28f802b3a9bc479b90573a2c67bc70e3c517f6533c3efaa9139dbb6e7222d609'}]}, 'timestamp': '2025-10-08 16:16:36.190482', '_unique_id': '9cb19a55c3404e1bbed471f264f5abb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.192 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23baee4b-5d22-4ad6-887b-40467e8b990b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.192268', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b4674e-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': 'c2757a4980c61392fb25f1530977fc645244e3e1958c1a8ce390988ebce69ce0'}]}, 'timestamp': '2025-10-08 16:16:36.192600', '_unique_id': '6cfea5d94ec448b2bf8587a05925d7a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.194 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.194 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e33a6d9d-bda5-4ebb-87c7-85961064d33c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.194263', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29b4b564-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': '0c3f3959894a2d3b7fb6a79187b9e347a3b7f05bd712012f70c7823959a5c502'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.194263', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29b4c310-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.83359581, 'message_signature': '59dc60dc79157d70508ab59ed9ffd51cdeaf74c6a9f0ae7cb6d19bee371f69b7'}]}, 'timestamp': '2025-10-08 16:16:36.194986', '_unique_id': '0d701188b6924870812b844b20c5b0d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.196 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.bytes volume: 30251520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.197 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '401ead4e-493f-4c24-b9d4-4615a5681569', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30251520, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8-vda', 'timestamp': '2025-10-08T16:16:36.196720', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '29b516da-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '892955a564b38e46d8f3d662d13f1079f098f1fddd8347448c0386cf99698940'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'8c079d33-9c46-438f-944a-8132a7cfcfb8-sda', 'timestamp': '2025-10-08T16:16:36.196720', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '29b523e6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.795551804, 'message_signature': '29b3e0559240e5a47d78eec507e3758eae794c4cda9143bc5721258901b5ed37'}]}, 'timestamp': '2025-10-08 16:16:36.197406', '_unique_id': '73b03bb0c9df4b98b3a8f12f4d3c1a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.198 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.199 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.outgoing.bytes volume: 5266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1aa8af2-eaf6-47b6-ad17-868e85d12693', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5266, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.199290', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b57a1c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': 'efd37a7a26782c89dd5cd4b2c272d25618f36ac4632569dac5c1b8574fcdd746'}]}, 'timestamp': '2025-10-08 16:16:36.199634', '_unique_id': 'dc47832ef4584f01b14f90f17afad273'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.200 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.201 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.201 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1599350823>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1599350823>]
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.201 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.201 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/memory.usage volume: 46.4921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b73cfef5-89c0-49f5-a5b2-89fb299be5fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.4921875, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'timestamp': '2025-10-08T16:16:36.201783', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'instance-00000056', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '29b5db1a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.884439916, 'message_signature': 'b5cc158b4c9dfb34d3cf6261611bedfa2d9cbe1bd25001e2c2f5b4678059fc5d'}]}, 'timestamp': '2025-10-08 16:16:36.202134', '_unique_id': 'b6c8020c1b1641d6b3db02421d0060b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.203 12 DEBUG ceilometer.compute.pollsters [-] 8c079d33-9c46-438f-944a-8132a7cfcfb8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2dd50a7-0554-41d7-ad2d-52393379613a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000056-8c079d33-9c46-438f-944a-8132a7cfcfb8-tapfa1d0632-ef', 'timestamp': '2025-10-08T16:16:36.203721', 'resource_metadata': {'display_name': 'tempest-server-test-1599350823', 'name': 'tapfa1d0632-ef', 'instance_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:82:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfa1d0632-ef'}, 'message_id': '29b626b0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7099.786852206, 'message_signature': '02575e466c6184b87d5b334be8fc059bcc23f03e50b29cc2a5cea4976ad2fe48'}]}, 'timestamp': '2025-10-08 16:16:36.204112', '_unique_id': 'fe463380be7743229821c7de1db61ecb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:16:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:16:36 np0005476733 nova_compute[192580]: 2025-10-08 16:16:36.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:36 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:36Z|00822|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  8 12:16:38 np0005476733 nova_compute[192580]: 2025-10-08 16:16:38.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:39 np0005476733 systemd-logind[827]: New session 89 of user zuul.
Oct  8 12:16:39 np0005476733 systemd[1]: Started Session 89 of User zuul.
Oct  8 12:16:39 np0005476733 podman[255953]: 2025-10-08 16:16:39.568812784 +0000 UTC m=+0.052941774 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:16:39 np0005476733 nova_compute[192580]: 2025-10-08 16:16:39.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:39 np0005476733 podman[255985]: 2025-10-08 16:16:39.651788777 +0000 UTC m=+0.056334702 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Oct  8 12:16:39 np0005476733 podman[256000]: 2025-10-08 16:16:39.666872218 +0000 UTC m=+0.066432424 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:16:39 np0005476733 nova_compute[192580]: 2025-10-08 16:16:39.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.706 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.765 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.766 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.826 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.957 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.958 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13557MB free_disk=111.28594589233398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:43 np0005476733 nova_compute[192580]: 2025-10-08 16:16:43.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.031 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8c079d33-9c46-438f-944a-8132a7cfcfb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.031 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.031 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.058 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.074 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.075 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.092 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.116 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.171 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.189 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.217 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.217 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.869 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Creating tmpfile /var/lib/nova/instances/tmpfze8n1k2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Oct  8 12:16:44 np0005476733 nova_compute[192580]: 2025-10-08 16:16:44.994 2 DEBUG nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=112640,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfze8n1k2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  8 12:16:45 np0005476733 nova_compute[192580]: 2025-10-08 16:16:45.218 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:16:46 np0005476733 nova_compute[192580]: 2025-10-08 16:16:46.000 2 DEBUG nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=112640,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfze8n1k2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Oct  8 12:16:46 np0005476733 nova_compute[192580]: 2025-10-08 16:16:46.032 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:16:46 np0005476733 nova_compute[192580]: 2025-10-08 16:16:46.032 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:16:46 np0005476733 nova_compute[192580]: 2025-10-08 16:16:46.033 2 DEBUG nova.network.neutron [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.190 2 DEBUG nova.network.neutron [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [{"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.214 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.215 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=112640,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfze8n1k2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.216 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Creating instance directory: /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.216 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Creating disk.info with the contents: {'/var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk': 'qcow2', '/var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.216 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.217 2 DEBUG nova.objects.instance [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.246 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.335 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.336 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.337 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.347 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.406 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.407 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.439 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.440 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.441 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.526 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.527 2 DEBUG nova.virt.disk.api [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Checking if we can resize image /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.528 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.587 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.588 2 DEBUG nova.virt.disk.api [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Cannot resize image /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.589 2 DEBUG nova.objects.instance [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.604 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.629 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.630 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config to /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 12:16:47 np0005476733 nova_compute[192580]: 2025-10-08 16:16:47.631 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.126 2 DEBUG oslo_concurrency.processutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.config /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.127 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.128 2 DEBUG nova.virt.libvirt.vif [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:15:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1716699684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-1716699684',id=85,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:15:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-b38at89j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:15:39Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.128 2 DEBUG nova.network.os_vif_util [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.129 2 DEBUG nova.network.os_vif_util [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.129 2 DEBUG os_vif [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.131 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.131 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.134 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeccc469f-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.134 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeccc469f-f0, col_values=(('external_ids', {'iface-id': 'eccc469f-f05b-41f3-ad17-900281358d00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:65:ad', 'vm-uuid': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:48 np0005476733 NetworkManager[51699]: <info>  [1759940208.1371] manager: (tapeccc469f-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.144 2 INFO os_vif [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0')#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.144 2 DEBUG nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.145 2 DEBUG nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=112640,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfze8n1k2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  8 12:16:48 np0005476733 podman[256077]: 2025-10-08 16:16:48.226041909 +0000 UTC m=+0.052881412 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:16:48 np0005476733 podman[256078]: 2025-10-08 16:16:48.233226868 +0000 UTC m=+0.057487748 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:16:48 np0005476733 nova_compute[192580]: 2025-10-08 16:16:48.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:49.115 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:16:49 np0005476733 nova_compute[192580]: 2025-10-08 16:16:49.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:49.117 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:16:50 np0005476733 nova_compute[192580]: 2025-10-08 16:16:50.367 2 DEBUG nova.network.neutron [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Port eccc469f-f05b-41f3-ad17-900281358d00 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  8 12:16:50 np0005476733 nova_compute[192580]: 2025-10-08 16:16:50.369 2 DEBUG nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=112640,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfze8n1k2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  8 12:16:50 np0005476733 systemd[1]: Starting libvirt proxy daemon...
Oct  8 12:16:50 np0005476733 systemd[1]: Started libvirt proxy daemon.
Oct  8 12:16:50 np0005476733 kernel: tapeccc469f-f0: entered promiscuous mode
Oct  8 12:16:50 np0005476733 NetworkManager[51699]: <info>  [1759940210.7364] manager: (tapeccc469f-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Oct  8 12:16:50 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:50Z|00823|binding|INFO|Claiming lport eccc469f-f05b-41f3-ad17-900281358d00 for this additional chassis.
Oct  8 12:16:50 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:50Z|00824|binding|INFO|eccc469f-f05b-41f3-ad17-900281358d00: Claiming fa:16:3e:e4:65:ad 10.100.0.19
Oct  8 12:16:50 np0005476733 nova_compute[192580]: 2025-10-08 16:16:50.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:50 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:50Z|00825|binding|INFO|Setting lport eccc469f-f05b-41f3-ad17-900281358d00 ovn-installed in OVS
Oct  8 12:16:50 np0005476733 nova_compute[192580]: 2025-10-08 16:16:50.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:50 np0005476733 systemd-machined[152624]: New machine qemu-53-instance-00000055.
Oct  8 12:16:50 np0005476733 systemd-udevd[256151]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:16:50 np0005476733 NetworkManager[51699]: <info>  [1759940210.8229] device (tapeccc469f-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:16:50 np0005476733 NetworkManager[51699]: <info>  [1759940210.8237] device (tapeccc469f-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:16:50 np0005476733 systemd[1]: Started Virtual Machine qemu-53-instance-00000055.
Oct  8 12:16:52 np0005476733 nova_compute[192580]: 2025-10-08 16:16:52.428 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940212.427679, 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:16:52 np0005476733 nova_compute[192580]: 2025-10-08 16:16:52.430 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] VM Started (Lifecycle Event)#033[00m
Oct  8 12:16:52 np0005476733 nova_compute[192580]: 2025-10-08 16:16:52.463 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:53.119 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.192 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940213.1918433, 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.192 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.215 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.219 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.242 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  8 12:16:53 np0005476733 nova_compute[192580]: 2025-10-08 16:16:53.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:54 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:54Z|00826|binding|INFO|Claiming lport eccc469f-f05b-41f3-ad17-900281358d00 for this chassis.
Oct  8 12:16:54 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:54Z|00827|binding|INFO|eccc469f-f05b-41f3-ad17-900281358d00: Claiming fa:16:3e:e4:65:ad 10.100.0.19
Oct  8 12:16:54 np0005476733 ovn_controller[94857]: 2025-10-08T16:16:54Z|00828|binding|INFO|Setting lport eccc469f-f05b-41f3-ad17-900281358d00 up in Southbound
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.310 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:65:ad 10.100.0.19'], port_security=['fa:16:3e:e4:65:ad 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6563b284-b73c-434a-b0ec-7119c8f265ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67e260af-4f33-4a56-b3c9-23ccc3032d2a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=eccc469f-f05b-41f3-ad17-900281358d00) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.313 103739 INFO neutron.agent.ovn.metadata.agent [-] Port eccc469f-f05b-41f3-ad17-900281358d00 in datapath a60d5119-22ed-4506-b21f-c7850a67e1ca bound to our chassis#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.316 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a60d5119-22ed-4506-b21f-c7850a67e1ca#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.346 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[14d66937-3bf4-4e12-9ee5-9ab863cfdb8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.402 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4abd7b21-ca01-4ba7-99e7-a5ba30f31862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.408 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0fc76c-06e7-4161-9fb3-1ebf4c287557]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.460 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[09d6416a-b828-4a34-818c-f16373d91490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.489 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[09404e57-56d2-491e-920e-414f685a0c9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa60d5119-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:5a:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706540, 'reachable_time': 44716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256186, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.513 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[06eb0861-5230-445d-8f0e-f6f40c84e907]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa60d5119-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706551, 'tstamp': 706551}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256187, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapa60d5119-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706554, 'tstamp': 706554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256187, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.516 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa60d5119-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.553 2 INFO nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Post operation of migration started#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.564 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa60d5119-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.565 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.566 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa60d5119-20, col_values=(('external_ids', {'iface-id': '6d5cee0f-29d4-4b28-ba94-d661be87caac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:16:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:16:54.566 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.968 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.969 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:16:54 np0005476733 nova_compute[192580]: 2025-10-08 16:16:54.970 2 DEBUG nova.network.neutron [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.382 2 DEBUG nova.network.neutron [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [{"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.412 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.430 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.431 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.431 2 DEBUG oslo_concurrency.lockutils [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:16:56 np0005476733 nova_compute[192580]: 2025-10-08 16:16:56.437 2 INFO nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  8 12:16:56 np0005476733 virtqemud[192152]: Domain id=53 name='instance-00000055' uuid=9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 is tainted: custom-monitor
Oct  8 12:16:57 np0005476733 nova_compute[192580]: 2025-10-08 16:16:57.447 2 INFO nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  8 12:16:58 np0005476733 nova_compute[192580]: 2025-10-08 16:16:58.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:58 np0005476733 nova_compute[192580]: 2025-10-08 16:16:58.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:16:58 np0005476733 podman[256188]: 2025-10-08 16:16:58.350385546 +0000 UTC m=+0.176284627 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:16:58 np0005476733 nova_compute[192580]: 2025-10-08 16:16:58.454 2 INFO nova.virt.libvirt.driver [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  8 12:16:58 np0005476733 nova_compute[192580]: 2025-10-08 16:16:58.459 2 DEBUG nova.compute.manager [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:16:58 np0005476733 nova_compute[192580]: 2025-10-08 16:16:58.481 2 DEBUG nova.objects.instance [None req-823724e6-ebed-4310-8bf4-080a052a24b8 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  8 12:17:03 np0005476733 podman[256208]: 2025-10-08 16:17:03.27216645 +0000 UTC m=+0.082828579 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:17:03 np0005476733 nova_compute[192580]: 2025-10-08 16:17:03.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:03 np0005476733 nova_compute[192580]: 2025-10-08 16:17:03.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:17:03 np0005476733 podman[256207]: 2025-10-08 16:17:03.330718102 +0000 UTC m=+0.148223969 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:17:04 np0005476733 nova_compute[192580]: 2025-10-08 16:17:04.925 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Check if temp file /var/lib/nova/instances/tmpyr8w_0_e exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Oct  8 12:17:04 np0005476733 nova_compute[192580]: 2025-10-08 16:17:04.925 2 DEBUG nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr8w_0_e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8c079d33-9c46-438f-944a-8132a7cfcfb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Oct  8 12:17:05 np0005476733 nova_compute[192580]: 2025-10-08 16:17:05.817 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:05 np0005476733 nova_compute[192580]: 2025-10-08 16:17:05.895 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:05 np0005476733 nova_compute[192580]: 2025-10-08 16:17:05.897 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:05 np0005476733 nova_compute[192580]: 2025-10-08 16:17:05.973 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:08 np0005476733 nova_compute[192580]: 2025-10-08 16:17:08.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:08 np0005476733 systemd-logind[827]: New session 90 of user nova.
Oct  8 12:17:08 np0005476733 systemd[1]: Created slice User Slice of UID 42436.
Oct  8 12:17:09 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  8 12:17:09 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  8 12:17:09 np0005476733 systemd[1]: Starting User Manager for UID 42436...
Oct  8 12:17:09 np0005476733 systemd[256262]: Queued start job for default target Main User Target.
Oct  8 12:17:09 np0005476733 systemd[256262]: Created slice User Application Slice.
Oct  8 12:17:09 np0005476733 systemd[256262]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 12:17:09 np0005476733 systemd[256262]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 12:17:09 np0005476733 systemd[256262]: Reached target Paths.
Oct  8 12:17:09 np0005476733 systemd[256262]: Reached target Timers.
Oct  8 12:17:09 np0005476733 systemd[256262]: Starting D-Bus User Message Bus Socket...
Oct  8 12:17:09 np0005476733 systemd[256262]: Starting Create User's Volatile Files and Directories...
Oct  8 12:17:09 np0005476733 systemd[256262]: Finished Create User's Volatile Files and Directories.
Oct  8 12:17:09 np0005476733 systemd[256262]: Listening on D-Bus User Message Bus Socket.
Oct  8 12:17:09 np0005476733 systemd[256262]: Reached target Sockets.
Oct  8 12:17:09 np0005476733 systemd[256262]: Reached target Basic System.
Oct  8 12:17:09 np0005476733 systemd[256262]: Reached target Main User Target.
Oct  8 12:17:09 np0005476733 systemd[256262]: Startup finished in 140ms.
Oct  8 12:17:09 np0005476733 systemd[1]: Started User Manager for UID 42436.
Oct  8 12:17:09 np0005476733 systemd[1]: Started Session 90 of User nova.
Oct  8 12:17:09 np0005476733 systemd[1]: session-90.scope: Deactivated successfully.
Oct  8 12:17:09 np0005476733 systemd-logind[827]: Session 90 logged out. Waiting for processes to exit.
Oct  8 12:17:09 np0005476733 systemd-logind[827]: Removed session 90.
Oct  8 12:17:10 np0005476733 podman[256279]: 2025-10-08 16:17:10.248872041 +0000 UTC m=+0.078882182 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 12:17:10 np0005476733 podman[256281]: 2025-10-08 16:17:10.261462624 +0000 UTC m=+0.078409317 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Oct  8 12:17:10 np0005476733 podman[256280]: 2025-10-08 16:17:10.269960326 +0000 UTC m=+0.082880301 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.732 2 DEBUG nova.compute.manager [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.733 2 DEBUG oslo_concurrency.lockutils [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.733 2 DEBUG oslo_concurrency.lockutils [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.734 2 DEBUG oslo_concurrency.lockutils [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.734 2 DEBUG nova.compute.manager [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:11 np0005476733 nova_compute[192580]: 2025-10-08 16:17:11.734 2 DEBUG nova.compute.manager [req-e0248be4-5c1b-464b-bbde-3572864653bb req-546ecce5-efd3-4e00-b558-4d862cc10794 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.686 2 INFO nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Took 7.71 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.686 2 DEBUG nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.709 2 DEBUG nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyr8w_0_e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8c079d33-9c46-438f-944a-8132a7cfcfb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(bc65eb8b-809e-498f-bef5-bac3a273c1c8),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.730 2 DEBUG nova.objects.instance [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid 8c079d33-9c46-438f-944a-8132a7cfcfb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.731 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.732 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.733 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.762 2 DEBUG nova.virt.libvirt.vif [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1599350823',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1599350823',id=86,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:16:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-tpuvmdiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:16:02Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=8c079d33-9c46-438f-944a-8132a7cfcfb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.762 2 DEBUG nova.network.os_vif_util [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.763 2 DEBUG nova.network.os_vif_util [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.763 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating guest XML with vif config: <interface type="ethernet">
Oct  8 12:17:13 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:42:82:a7"/>
Oct  8 12:17:13 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 12:17:13 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:17:13 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 12:17:13 np0005476733 nova_compute[192580]:  <target dev="tapfa1d0632-ef"/>
Oct  8 12:17:13 np0005476733 nova_compute[192580]: </interface>
Oct  8 12:17:13 np0005476733 nova_compute[192580]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.764 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.872 2 DEBUG nova.compute.manager [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.873 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.873 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.873 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.874 2 DEBUG nova.compute.manager [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.874 2 WARNING nova.compute.manager [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.874 2 DEBUG nova.compute.manager [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.875 2 DEBUG nova.compute.manager [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing instance network info cache due to event network-changed-fa1d0632-ef78-4a84-a89e-2efa740540a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.875 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.876 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:17:13 np0005476733 nova_compute[192580]: 2025-10-08 16:17:13.876 2 DEBUG nova.network.neutron [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Refreshing network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:17:14 np0005476733 nova_compute[192580]: 2025-10-08 16:17:14.234 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:14 np0005476733 nova_compute[192580]: 2025-10-08 16:17:14.235 2 INFO nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Oct  8 12:17:14 np0005476733 nova_compute[192580]: 2025-10-08 16:17:14.310 2 INFO nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Oct  8 12:17:14 np0005476733 nova_compute[192580]: 2025-10-08 16:17:14.813 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:14 np0005476733 nova_compute[192580]: 2025-10-08 16:17:14.813 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.316 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.317 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.636 2 DEBUG nova.network.neutron [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updated VIF entry in instance network info cache for port fa1d0632-ef78-4a84-a89e-2efa740540a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.637 2 DEBUG nova.network.neutron [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Updating instance_info_cache with network_info: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.661 2 DEBUG oslo_concurrency.lockutils [req-90cb7d43-0e57-40a7-bee5-b6546e908e8c req-d8a363a7-ce4b-4aa4-8dc6-059f98e0dcab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8c079d33-9c46-438f-944a-8132a7cfcfb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.821 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:15 np0005476733 nova_compute[192580]: 2025-10-08 16:17:15.821 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.324 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.324 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.829 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940236.8294926, 8c079d33-9c46-438f-944a-8132a7cfcfb8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.830 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.832 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.833 2 DEBUG nova.virt.libvirt.migration [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.851 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.855 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:17:16 np0005476733 nova_compute[192580]: 2025-10-08 16:17:16.880 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  8 12:17:17 np0005476733 kernel: tapfa1d0632-ef (unregistering): left promiscuous mode
Oct  8 12:17:17 np0005476733 NetworkManager[51699]: <info>  [1759940237.0138] device (tapfa1d0632-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:17 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:17Z|00829|binding|INFO|Releasing lport fa1d0632-ef78-4a84-a89e-2efa740540a4 from this chassis (sb_readonly=0)
Oct  8 12:17:17 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:17Z|00830|binding|INFO|Setting lport fa1d0632-ef78-4a84-a89e-2efa740540a4 down in Southbound
Oct  8 12:17:17 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:17Z|00831|binding|INFO|Removing iface tapfa1d0632-ef ovn-installed in OVS
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.034 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:82:a7 10.100.0.27'], port_security=['fa:16:3e:42:82:a7 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '329dd4f3-73f4-4bda-955c-e971074e916e'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '8c079d33-9c46-438f-944a-8132a7cfcfb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6563b284-b73c-434a-b0ec-7119c8f265ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67e260af-4f33-4a56-b3c9-23ccc3032d2a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fa1d0632-ef78-4a84-a89e-2efa740540a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.036 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fa1d0632-ef78-4a84-a89e-2efa740540a4 in datapath a60d5119-22ed-4506-b21f-c7850a67e1ca unbound from our chassis#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.038 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a60d5119-22ed-4506-b21f-c7850a67e1ca#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.058 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d79b41d1-dcd8-4fe6-a5e7-6360cb68d463]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.090 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7c3f0e-c6eb-4990-9760-ec21086bccb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000056.scope: Deactivated successfully.
Oct  8 12:17:17 np0005476733 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000056.scope: Consumed 15.056s CPU time.
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.093 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3b260a-eeff-4e85-956e-e629e2b8f97f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 systemd-machined[152624]: Machine qemu-52-instance-00000056 terminated.
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.127 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b30b99da-a145-457b-b5c3-c671ea5d6bc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.145 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33d3593e-29cb-4cd7-9eba-6ef82b49ac47]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa60d5119-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:5a:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706540, 'reachable_time': 44716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256370, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.162 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff884941-3172-45f5-92c9-4a2de91209de]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa60d5119-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706551, 'tstamp': 706551}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256371, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapa60d5119-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706554, 'tstamp': 706554}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256371, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.164 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa60d5119-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.171 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa60d5119-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.171 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.172 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa60d5119-20, col_values=(('external_ids', {'iface-id': '6d5cee0f-29d4-4b28-ba94-d661be87caac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:17:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:17.172 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.270 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.271 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.271 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.335 2 DEBUG nova.virt.libvirt.guest [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '8c079d33-9c46-438f-944a-8132a7cfcfb8' (instance-00000056) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.336 2 INFO nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migration operation has completed#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.336 2 INFO nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] _post_live_migration() is started..#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.610 2 DEBUG nova.compute.manager [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.610 2 DEBUG oslo_concurrency.lockutils [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.610 2 DEBUG oslo_concurrency.lockutils [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.610 2 DEBUG oslo_concurrency.lockutils [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.611 2 DEBUG nova.compute.manager [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:17 np0005476733 nova_compute[192580]: 2025-10-08 16:17:17.611 2 DEBUG nova.compute.manager [req-cbcc71df-258b-418a-ab2f-ce9855e3fad5 req-a409b848-db0c-4fb6-b753-836123635a7e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.584 2 DEBUG nova.compute.manager [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.585 2 DEBUG oslo_concurrency.lockutils [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.585 2 DEBUG oslo_concurrency.lockutils [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.585 2 DEBUG oslo_concurrency.lockutils [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.585 2 DEBUG nova.compute.manager [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.586 2 DEBUG nova.compute.manager [req-95a7c0e7-dafa-4981-8377-311e16416e70 req-d4ff5a5c-49e1-4e85-94cf-ed4c30da6fd8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-unplugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.706 2 DEBUG nova.network.neutron [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Activated binding for port fa1d0632-ef78-4a84-a89e-2efa740540a4 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.706 2 DEBUG nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.707 2 DEBUG nova.virt.libvirt.vif [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:15:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1599350823',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1599350823',id=86,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:16:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-tpuvmdiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_
bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:17:01Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=8c079d33-9c46-438f-944a-8132a7cfcfb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.707 2 DEBUG nova.network.os_vif_util [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "address": "fa:16:3e:42:82:a7", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa1d0632-ef", "ovs_interfaceid": "fa1d0632-ef78-4a84-a89e-2efa740540a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.708 2 DEBUG nova.network.os_vif_util [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.708 2 DEBUG os_vif [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa1d0632-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.715 2 INFO os_vif [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:82:a7,bridge_name='br-int',has_traffic_filtering=True,id=fa1d0632-ef78-4a84-a89e-2efa740540a4,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa1d0632-ef')#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.716 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.716 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.716 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.716 2 DEBUG nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.717 2 INFO nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Deleting instance files /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8_del#033[00m
Oct  8 12:17:18 np0005476733 nova_compute[192580]: 2025-10-08 16:17:18.717 2 INFO nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Deletion of /var/lib/nova/instances/8c079d33-9c46-438f-944a-8132a7cfcfb8_del complete#033[00m
Oct  8 12:17:19 np0005476733 podman[256391]: 2025-10-08 16:17:19.236217489 +0000 UTC m=+0.062915062 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:17:19 np0005476733 podman[256390]: 2025-10-08 16:17:19.236385014 +0000 UTC m=+0.066970042 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 12:17:19 np0005476733 systemd[1]: Stopping User Manager for UID 42436...
Oct  8 12:17:19 np0005476733 systemd[256262]: Activating special unit Exit the Session...
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped target Main User Target.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped target Basic System.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped target Paths.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped target Sockets.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped target Timers.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 12:17:19 np0005476733 systemd[256262]: Closed D-Bus User Message Bus Socket.
Oct  8 12:17:19 np0005476733 systemd[256262]: Stopped Create User's Volatile Files and Directories.
Oct  8 12:17:19 np0005476733 systemd[256262]: Removed slice User Application Slice.
Oct  8 12:17:19 np0005476733 systemd[256262]: Reached target Shutdown.
Oct  8 12:17:19 np0005476733 systemd[256262]: Finished Exit the Session.
Oct  8 12:17:19 np0005476733 systemd[256262]: Reached target Exit the Session.
Oct  8 12:17:19 np0005476733 systemd[1]: user@42436.service: Deactivated successfully.
Oct  8 12:17:19 np0005476733 systemd[1]: Stopped User Manager for UID 42436.
Oct  8 12:17:19 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  8 12:17:19 np0005476733 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  8 12:17:19 np0005476733 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  8 12:17:19 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  8 12:17:19 np0005476733 systemd[1]: Removed slice User Slice of UID 42436.
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.715 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.716 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.716 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.716 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.716 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.717 2 WARNING nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.717 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.717 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.717 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.717 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.718 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.718 2 WARNING nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.718 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.718 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.718 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 WARNING nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.719 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.720 2 DEBUG oslo_concurrency.lockutils [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.720 2 DEBUG nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] No waiting events found dispatching network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:17:19 np0005476733 nova_compute[192580]: 2025-10-08 16:17:19.720 2 WARNING nova.compute.manager [req-b8a0ffca-11ce-41a9-a80d-866309a643aa req-5b0e6c9e-e947-4357-a39c-3cb3d11835bb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Received unexpected event network-vif-plugged-fa1d0632-ef78-4a84-a89e-2efa740540a4 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:17:22 np0005476733 nova_compute[192580]: 2025-10-08 16:17:22.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:23 np0005476733 nova_compute[192580]: 2025-10-08 16:17:23.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:23 np0005476733 nova_compute[192580]: 2025-10-08 16:17:23.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:24Z|00832|pinctrl|WARN|Dropped 423 log messages in last 59 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 12:17:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:24Z|00833|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:17:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:17:24Z|00834|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.170 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.171 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.171 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8c079d33-9c46-438f-944a-8132a7cfcfb8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.211 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.213 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.213 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.213 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.336 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:26.374 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:26.375 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:17:26.375 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.400 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.401 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.455 2 DEBUG oslo_concurrency.processutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.592 2 WARNING nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.593 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13559MB free_disk=111.28630828857422GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.593 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.594 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.668 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Migration for instance 8c079d33-9c46-438f-944a-8132a7cfcfb8 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.687 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.726 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Instance 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.727 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Migration bc65eb8b-809e-498f-bef5-bac3a273c1c8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.727 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.727 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.812 2 DEBUG nova.compute.provider_tree [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.836 2 DEBUG nova.scheduler.client.report [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.837 2 DEBUG nova.compute.resource_tracker [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.837 2 DEBUG oslo_concurrency.lockutils [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.843 2 INFO nova.compute.manager [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.956 2 INFO nova.scheduler.client.report [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Deleted allocation for migration bc65eb8b-809e-498f-bef5-bac3a273c1c8#033[00m
Oct  8 12:17:26 np0005476733 nova_compute[192580]: 2025-10-08 16:17:26.957 2 DEBUG nova.virt.libvirt.driver [None req-325fd76e-f50f-464d-9fd6-6181c0b80247 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  8 12:17:27 np0005476733 nova_compute[192580]: 2025-10-08 16:17:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:27 np0005476733 systemd-logind[827]: New session 92 of user zuul.
Oct  8 12:17:27 np0005476733 systemd[1]: Started Session 92 of User zuul.
Oct  8 12:17:27 np0005476733 systemd-logind[827]: New session 93 of user zuul.
Oct  8 12:17:27 np0005476733 systemd[1]: Started Session 93 of User zuul.
Oct  8 12:17:28 np0005476733 systemd[1]: session-93.scope: Deactivated successfully.
Oct  8 12:17:28 np0005476733 systemd-logind[827]: Session 93 logged out. Waiting for processes to exit.
Oct  8 12:17:28 np0005476733 systemd-logind[827]: Removed session 93.
Oct  8 12:17:28 np0005476733 nova_compute[192580]: 2025-10-08 16:17:28.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:28 np0005476733 nova_compute[192580]: 2025-10-08 16:17:28.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:29 np0005476733 podman[256501]: 2025-10-08 16:17:29.226874073 +0000 UTC m=+0.055803136 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:17:30 np0005476733 nova_compute[192580]: 2025-10-08 16:17:30.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:32 np0005476733 nova_compute[192580]: 2025-10-08 16:17:32.269 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940237.2675698, 8c079d33-9c46-438f-944a-8132a7cfcfb8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:17:32 np0005476733 nova_compute[192580]: 2025-10-08 16:17:32.269 2 INFO nova.compute.manager [-] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:17:32 np0005476733 nova_compute[192580]: 2025-10-08 16:17:32.293 2 DEBUG nova.compute.manager [None req-51455443-8687-4752-9be8-126e201d4a13 - - - - - -] [instance: 8c079d33-9c46-438f-944a-8132a7cfcfb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:17:32 np0005476733 nova_compute[192580]: 2025-10-08 16:17:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:33 np0005476733 nova_compute[192580]: 2025-10-08 16:17:33.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:33 np0005476733 nova_compute[192580]: 2025-10-08 16:17:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:33 np0005476733 nova_compute[192580]: 2025-10-08 16:17:33.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:17:33 np0005476733 nova_compute[192580]: 2025-10-08 16:17:33.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:34 np0005476733 podman[256522]: 2025-10-08 16:17:34.247712024 +0000 UTC m=+0.066959201 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 12:17:34 np0005476733 podman[256521]: 2025-10-08 16:17:34.262827418 +0000 UTC m=+0.088991107 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:17:34 np0005476733 nova_compute[192580]: 2025-10-08 16:17:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:34 np0005476733 nova_compute[192580]: 2025-10-08 16:17:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:17:34 np0005476733 nova_compute[192580]: 2025-10-08 16:17:34.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:17:35 np0005476733 nova_compute[192580]: 2025-10-08 16:17:35.479 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:17:35 np0005476733 nova_compute[192580]: 2025-10-08 16:17:35.480 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:17:35 np0005476733 nova_compute[192580]: 2025-10-08 16:17:35.480 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:17:35 np0005476733 nova_compute[192580]: 2025-10-08 16:17:35.480 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:17:37 np0005476733 systemd-logind[827]: New session 94 of user zuul.
Oct  8 12:17:37 np0005476733 systemd[1]: Started Session 94 of User zuul.
Oct  8 12:17:37 np0005476733 nova_compute[192580]: 2025-10-08 16:17:37.510 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [{"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:17:37 np0005476733 nova_compute[192580]: 2025-10-08 16:17:37.538 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:17:37 np0005476733 nova_compute[192580]: 2025-10-08 16:17:37.539 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:17:38 np0005476733 systemd-logind[827]: New session 95 of user zuul.
Oct  8 12:17:38 np0005476733 systemd[1]: Started Session 95 of User zuul.
Oct  8 12:17:38 np0005476733 nova_compute[192580]: 2025-10-08 16:17:38.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:38 np0005476733 systemd-logind[827]: New session 96 of user zuul.
Oct  8 12:17:38 np0005476733 systemd[1]: Started Session 96 of User zuul.
Oct  8 12:17:38 np0005476733 nova_compute[192580]: 2025-10-08 16:17:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:38 np0005476733 systemd-logind[827]: Session 96 logged out. Waiting for processes to exit.
Oct  8 12:17:38 np0005476733 systemd[1]: session-96.scope: Deactivated successfully.
Oct  8 12:17:38 np0005476733 systemd-logind[827]: Removed session 96.
Oct  8 12:17:38 np0005476733 nova_compute[192580]: 2025-10-08 16:17:38.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:41 np0005476733 podman[256660]: 2025-10-08 16:17:41.245272551 +0000 UTC m=+0.075419802 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:17:41 np0005476733 podman[256659]: 2025-10-08 16:17:41.244971351 +0000 UTC m=+0.076213417 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  8 12:17:41 np0005476733 podman[256661]: 2025-10-08 16:17:41.267944785 +0000 UTC m=+0.096401672 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64)
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.631 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.697 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.764 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.764 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.821 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.948 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.949 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13541MB free_disk=111.28628158569336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.949 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:17:43 np0005476733 nova_compute[192580]: 2025-10-08 16:17:43.950 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.065 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.066 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.066 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.214 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.239 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.241 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:17:44 np0005476733 nova_compute[192580]: 2025-10-08 16:17:44.241 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:17:47 np0005476733 nova_compute[192580]: 2025-10-08 16:17:47.241 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:17:47 np0005476733 systemd-logind[827]: New session 97 of user zuul.
Oct  8 12:17:47 np0005476733 systemd[1]: Started Session 97 of User zuul.
Oct  8 12:17:48 np0005476733 nova_compute[192580]: 2025-10-08 16:17:48.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:48 np0005476733 systemd-logind[827]: New session 98 of user zuul.
Oct  8 12:17:48 np0005476733 nova_compute[192580]: 2025-10-08 16:17:48.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:48 np0005476733 systemd[1]: Started Session 98 of User zuul.
Oct  8 12:17:48 np0005476733 systemd-logind[827]: New session 99 of user zuul.
Oct  8 12:17:48 np0005476733 systemd[1]: Started Session 99 of User zuul.
Oct  8 12:17:49 np0005476733 systemd[1]: session-99.scope: Deactivated successfully.
Oct  8 12:17:49 np0005476733 systemd-logind[827]: Session 99 logged out. Waiting for processes to exit.
Oct  8 12:17:49 np0005476733 systemd-logind[827]: Removed session 99.
Oct  8 12:17:50 np0005476733 podman[256815]: 2025-10-08 16:17:50.249154536 +0000 UTC m=+0.070779963 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:17:50 np0005476733 podman[256816]: 2025-10-08 16:17:50.25207181 +0000 UTC m=+0.072318973 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:17:53 np0005476733 nova_compute[192580]: 2025-10-08 16:17:53.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:53 np0005476733 nova_compute[192580]: 2025-10-08 16:17:53.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:57 np0005476733 systemd-logind[827]: New session 100 of user zuul.
Oct  8 12:17:57 np0005476733 systemd[1]: Started Session 100 of User zuul.
Oct  8 12:17:58 np0005476733 nova_compute[192580]: 2025-10-08 16:17:58.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:17:58 np0005476733 nova_compute[192580]: 2025-10-08 16:17:58.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:00 np0005476733 podman[256891]: 2025-10-08 16:18:00.242838965 +0000 UTC m=+0.061226988 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:18:03 np0005476733 systemd-logind[827]: New session 101 of user zuul.
Oct  8 12:18:03 np0005476733 systemd[1]: Started Session 101 of User zuul.
Oct  8 12:18:03 np0005476733 nova_compute[192580]: 2025-10-08 16:18:03.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:03 np0005476733 systemd-logind[827]: Session 101 logged out. Waiting for processes to exit.
Oct  8 12:18:03 np0005476733 systemd[1]: session-101.scope: Deactivated successfully.
Oct  8 12:18:03 np0005476733 systemd-logind[827]: Removed session 101.
Oct  8 12:18:03 np0005476733 nova_compute[192580]: 2025-10-08 16:18:03.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:05 np0005476733 podman[256942]: 2025-10-08 16:18:05.261823687 +0000 UTC m=+0.081141205 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 12:18:05 np0005476733 podman[256941]: 2025-10-08 16:18:05.272952053 +0000 UTC m=+0.093429468 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:18:08 np0005476733 nova_compute[192580]: 2025-10-08 16:18:08.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:08 np0005476733 nova_compute[192580]: 2025-10-08 16:18:08.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:10 np0005476733 systemd-logind[827]: New session 102 of user zuul.
Oct  8 12:18:10 np0005476733 systemd[1]: Started Session 102 of User zuul.
Oct  8 12:18:10 np0005476733 systemd[1]: session-102.scope: Deactivated successfully.
Oct  8 12:18:10 np0005476733 systemd-logind[827]: Session 102 logged out. Waiting for processes to exit.
Oct  8 12:18:10 np0005476733 systemd-logind[827]: Removed session 102.
Oct  8 12:18:12 np0005476733 podman[257016]: 2025-10-08 16:18:12.215432209 +0000 UTC m=+0.046763016 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:18:12 np0005476733 podman[257017]: 2025-10-08 16:18:12.22891919 +0000 UTC m=+0.054970598 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Oct  8 12:18:12 np0005476733 podman[257015]: 2025-10-08 16:18:12.231789381 +0000 UTC m=+0.061907490 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:18:13 np0005476733 nova_compute[192580]: 2025-10-08 16:18:13.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:13 np0005476733 nova_compute[192580]: 2025-10-08 16:18:13.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:17 np0005476733 systemd-logind[827]: New session 103 of user zuul.
Oct  8 12:18:17 np0005476733 systemd[1]: Started Session 103 of User zuul.
Oct  8 12:18:17 np0005476733 systemd[1]: session-103.scope: Deactivated successfully.
Oct  8 12:18:17 np0005476733 systemd-logind[827]: Session 103 logged out. Waiting for processes to exit.
Oct  8 12:18:17 np0005476733 systemd-logind[827]: Removed session 103.
Oct  8 12:18:18 np0005476733 nova_compute[192580]: 2025-10-08 16:18:18.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:18 np0005476733 nova_compute[192580]: 2025-10-08 16:18:18.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:21 np0005476733 podman[257114]: 2025-10-08 16:18:21.224002116 +0000 UTC m=+0.049559505 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:18:21 np0005476733 podman[257113]: 2025-10-08 16:18:21.224474611 +0000 UTC m=+0.053243423 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:18:22 np0005476733 nova_compute[192580]: 2025-10-08 16:18:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:23 np0005476733 nova_compute[192580]: 2025-10-08 16:18:23.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:23 np0005476733 nova_compute[192580]: 2025-10-08 16:18:23.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:24 np0005476733 systemd-logind[827]: New session 104 of user zuul.
Oct  8 12:18:24 np0005476733 systemd[1]: Started Session 104 of User zuul.
Oct  8 12:18:24 np0005476733 systemd[1]: session-104.scope: Deactivated successfully.
Oct  8 12:18:24 np0005476733 systemd-logind[827]: Session 104 logged out. Waiting for processes to exit.
Oct  8 12:18:24 np0005476733 systemd-logind[827]: Removed session 104.
Oct  8 12:18:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:18:25Z|00835|pinctrl|WARN|Dropped 87 log messages in last 61 seconds (most recently, 9 seconds ago) due to excessive rate
Oct  8 12:18:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:18:25Z|00836|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:26.376 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:26.376 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:26.377 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:28 np0005476733 nova_compute[192580]: 2025-10-08 16:18:28.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:28 np0005476733 nova_compute[192580]: 2025-10-08 16:18:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:28 np0005476733 nova_compute[192580]: 2025-10-08 16:18:28.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:31 np0005476733 podman[257186]: 2025-10-08 16:18:31.230435623 +0000 UTC m=+0.059405840 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent)
Oct  8 12:18:31 np0005476733 systemd-logind[827]: New session 105 of user zuul.
Oct  8 12:18:31 np0005476733 systemd[1]: Started Session 105 of User zuul.
Oct  8 12:18:32 np0005476733 systemd[1]: session-105.scope: Deactivated successfully.
Oct  8 12:18:32 np0005476733 systemd-logind[827]: Session 105 logged out. Waiting for processes to exit.
Oct  8 12:18:32 np0005476733 systemd-logind[827]: Removed session 105.
Oct  8 12:18:32 np0005476733 nova_compute[192580]: 2025-10-08 16:18:32.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:18:32 np0005476733 nova_compute[192580]: 2025-10-08 16:18:32.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:18:33 np0005476733 nova_compute[192580]: 2025-10-08 16:18:33.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:18:33 np0005476733 nova_compute[192580]: 2025-10-08 16:18:33.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:18:34 np0005476733 nova_compute[192580]: 2025-10-08 16:18:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:18:34 np0005476733 nova_compute[192580]: 2025-10-08 16:18:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.062 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'name': 'tempest-server-test-1716699684', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000055', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d7b1c6f132443b0abac8495ed44621d', 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'hostId': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.086 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.087 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0bc0515-3707-4165-8c34-33c6f4d35422', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.063355', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712ad70c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '93e1d1a92c26f6e226d2d85fccd0682d60620d5a4d4fc8dab8d9b8b0dd7e287e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.063355', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712ae576-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': 'a6a5a7edc79a68a2975c23a8581ad7a04a52e7380f2c29db307066100678b07a'}]}, 'timestamp': '2025-10-08 16:18:36.087371', '_unique_id': 'f9dbcd3bb1f24aeab5af4f1d7bd9bf2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.088 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.089 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.105 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/memory.usage volume: 43.0625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8cd673c-f8e0-41d1-a078-509602b15c5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.0625, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'timestamp': '2025-10-08T16:18:36.089721', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '712db864-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.828326021, 'message_signature': 'b580469469842292cddcacf635d15c1aaf526528060e2ffc20e68d80fbe37e4a'}]}, 'timestamp': '2025-10-08 16:18:36.105893', '_unique_id': '8ed5588f52be45559111cb1e5e87eaa0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.110 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 / tapeccc469f-f0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.110 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5f271b1-b3d4-4883-9964-7af5043d4621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.107576', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '712e945a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '0a65c828e965b51a1338193d8a497c6183dcb230f4b1d9b94f8fd8e1683b8ae8'}]}, 'timestamp': '2025-10-08 16:18:36.111894', '_unique_id': '45f2d65d6ac546db87f81db143b8e453'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.113 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.116 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20024fb0-4013-43bc-ad33-2ca3fd46fc21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.116041', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '712f5c0a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '12d5329e4c83fa067f1124d25a3cbaea5a3294b41a75f5f2ae8a3188ff36732b'}]}, 'timestamp': '2025-10-08 16:18:36.116835', '_unique_id': 'd96b7d1de8ff442fb67761c34b174196'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.131 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.132 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fc98713-0ed2-4d91-8634-3d018c2b7829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.119751', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7131c33c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': '97ebd33cffbf3afe60bcf6c56ec034fa382c4dd3a877ae45563b6011b1ac309c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.119751', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7131d3ea-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': '6452aabe413398091def755ff3746e47566e92c2c118aa285deaec6f8fce0877'}]}, 'timestamp': '2025-10-08 16:18:36.132814', '_unique_id': '96c3ba9d10134eecacf0043cd4fd501d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.135 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.incoming.packets volume: 125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93c97acb-64ad-46b2-a9f6-7d3dc8273503', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 125, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.135085', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '71323d26-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '5ef8f22286b318542466710f08edcfed118ee34e9428e9f140ed5460290e2d22'}]}, 'timestamp': '2025-10-08 16:18:36.135524', '_unique_id': '3dc3083e141a490f95876ce9c57050fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.137 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.outgoing.packets volume: 175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7332119-8d3d-4a13-9cf0-e26476f88e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 175, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.137245', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '71328f6a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': 'edef60a719dc99771ee3f76ddc65de9c8c34b6443be4e814f92d9df27bf24d59'}]}, 'timestamp': '2025-10-08 16:18:36.137616', '_unique_id': 'd7527def6997414b806dc1b7d3616197'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.139 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.139 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bba3da4b-61ee-4091-a5f6-afd53db6deab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.139260', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7132de52-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': 'd41ec590893320679b24923b58edb9bfa81864b148494cbf059c62deb895ac78'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.139260', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7132e9f6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': '7a2505e0ecc45e98764c1c9e9debb2f0eb8f203204b57b7298dd35f7855650f4'}]}, 'timestamp': '2025-10-08 16:18:36.139911', '_unique_id': '2a43e8e6c4474d2b8678534e4bb06e7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1716699684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1716699684>]
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.142 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2290803-a20a-4946-a166-e5b703c2501e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.142202', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '71334f90-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '052560e566285156a31c25de555dd3a2ad973b166216d6ff085ec62db0b019fe'}]}, 'timestamp': '2025-10-08 16:18:36.142533', '_unique_id': '530ba5bc264046a68148fa1af45fd1c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.144 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.145 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7695349-361b-4852-b4fd-eb8fb419b71a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.144140', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71339cc0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': 'b8dc33a58288b124a0884b927c590f521b0d2f6df06fb7fc3da971cb14165b8c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.144140', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7133d500-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.842763673, 'message_signature': 'a540343add84506ae42c40c7e8ca33fef4e1a36da90cee6ca8c13ad5c92f38b8'}]}, 'timestamp': '2025-10-08 16:18:36.145967', '_unique_id': '6b5043280f80434a8a142ca9769b2397'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.148 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad69518f-42e7-4c3a-9d0b-5bbbe516ebe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.148312', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '71343fd6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '308a2995e4bf1fb95f7a61e73853dd8eae09509439d95427bf8d53a01ad418dc'}]}, 'timestamp': '2025-10-08 16:18:36.148677', '_unique_id': 'dc0f9a085fc94eca8e90be385759a638'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.150 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.150 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91f7314-0d95-40b5-bb3f-2957ab103da3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.150114', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7134868a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': 'e652c641af0e9dcceb2cdbc1e552b9c19abc060c3f982ac835deccb0fa750cc3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.150114', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7134917a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '8933fe89abec19f1db4c369c70193cd29346b3b18d7fc90c5fce688fef796e19'}]}, 'timestamp': '2025-10-08 16:18:36.150740', '_unique_id': 'c42504103e88454384da089d2cfa6bcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.incoming.bytes volume: 18479 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '100ca793-24d5-4b6e-a437-f5055c3a6fed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18479, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.151982', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '7134cff0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': 'f7801c029e70037c813781708820725ba8d5da7fbd2776ed63eaecd6f4a98438'}]}, 'timestamp': '2025-10-08 16:18:36.152375', '_unique_id': '6131b3daaea64173a22ef73922dc8f6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.153 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.latency volume: 10893070 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.153 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f51fa05-3889-4f38-8172-a5797944fe63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10893070, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.153479', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713505ec-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': 'd4f92d4fb822cae76daea1de9b6468807ab8cd3f17b308478737411ac1944e60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.153479', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71350e16-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '2ba6f0c78137c539d7f755c8584b2643029dff544c161e61e3f3f6b4885fbc91'}]}, 'timestamp': '2025-10-08 16:18:36.153902', '_unique_id': '7998bfd287fe4b308e3549641a093e35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.155 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.155 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '048b4ce2-d788-472f-be24-748a27adb43a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.155139', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71354822-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '7ab01b47705a60960c59c91bd94fd1213d99be95d74832e95217e9479ff1ff48'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 
'9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.155139', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71355100-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': 'b2acc4ac7e49270c18bc20d28f0135229b8336b1b72e5e8d7f6ab8c02f317d14'}]}, 'timestamp': '2025-10-08 16:18:36.155610', '_unique_id': '846fe65b5c724880a4b560567d64ffb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.156 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1716699684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1716699684>]
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6fc855c-fd46-4b66-afcb-228bb37e6c53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.157155', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '713595de-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '92c64c357a36efb265e306ea0f14aec861568b08e1e604f62f800719d47a0319'}]}, 'timestamp': '2025-10-08 16:18:36.157393', '_unique_id': '366fa316ea9941b589dc0bd56e9fe2c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.158 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.requests volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.158 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '820e3f72-a074-46e8-aa1c-28c0af55232f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.158483', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7135c93c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '9894706641e03fa0b6ab4e3c832d648bbb7cb7127d2159b821b56cc3f34a3aa1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 
'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.158483', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7135d120-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '22e1afe097ee0157e44ff2ab4ea27c9353c811c1f17b3aa2bc16ff424022ea9c'}]}, 'timestamp': '2025-10-08 16:18:36.158892', '_unique_id': '4246e373fb9b4432a61db390db26817c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.159 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.160 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1716699684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1716699684>]
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.160 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.bytes volume: 65536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.160 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf3280f-d117-441a-aa2c-c6b0fda3ff36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 65536, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-vda', 'timestamp': '2025-10-08T16:18:36.160274', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71360fe6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '9bbf2937a16ec916caaedcbdaf468f70758ba8282aac466e2d5359d587cd21dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-sda', 'timestamp': '2025-10-08T16:18:36.160274', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71361914-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.786326749, 'message_signature': '75f4436ac46af74abc3bace60550b1f93d74d7d94da9fc921fa4851fb1652753'}]}, 'timestamp': '2025-10-08 16:18:36.160746', '_unique_id': '9b6350de2dcb4e80ac96aab3d6bf07ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.161 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/cpu volume: 580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78e78d93-6d57-41d5-acb1-b56fde49b43d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 580000000, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'timestamp': '2025-10-08T16:18:36.161978', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'instance-00000055', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '713652b2-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.828326021, 'message_signature': 'a03d41dcf669817d5ac3ddff145a3106ec7902ba9155e41a663116582b13b999'}]}, 'timestamp': '2025-10-08 16:18:36.162217', '_unique_id': '27cce846463d4e38b24d415dd3b2a682'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.163 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.163 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1716699684>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1716699684>]
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.163 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.outgoing.bytes volume: 21325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32651266-f491-4763-9d1c-283ded595d57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21325, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.163538', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '71368eee-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': 'b8ef60c671051fd21657c98e715ceff62cdda6727bc302b5958013e631db8ecf'}]}, 'timestamp': '2025-10-08 16:18:36.163763', '_unique_id': '328dc4d528e3483892267fabea47887a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.164 12 DEBUG ceilometer.compute.pollsters [-] 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feae86d1-e2ec-49bd-89aa-dd3122db0d15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '81b62a8f3edf4f78aeb0b087fd79ebb7', 'user_name': None, 'project_id': '9d7b1c6f132443b0abac8495ed44621d', 'project_name': None, 'resource_id': 'instance-00000055-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-tapeccc469f-f0', 'timestamp': '2025-10-08T16:18:36.164863', 'resource_metadata': {'display_name': 'tempest-server-test-1716699684', 'name': 'tapeccc469f-f0', 'instance_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'instance_type': 'm1.nano', 'host': 'a5b68f8b2d337f02599aa965216dd4370015a3ad00959b35d4e4059b', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:65:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeccc469f-f0'}, 'message_id': '7136c2a6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7219.830543822, 'message_signature': '2c8c20a3b5de0f7e94e146f065d1ac0d3da1191348fbbbde3ce8cce4aaa16240'}]}, 'timestamp': '2025-10-08 16:18:36.165104', '_unique_id': '1ca463dfb4494b7a992174ee4f3d42ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:18:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:18:36 np0005476733 podman[257236]: 2025-10-08 16:18:36.251932605 +0000 UTC m=+0.068704767 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001)
Oct  8 12:18:36 np0005476733 podman[257235]: 2025-10-08 16:18:36.277632777 +0000 UTC m=+0.098588273 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:18:36 np0005476733 nova_compute[192580]: 2025-10-08 16:18:36.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:36 np0005476733 nova_compute[192580]: 2025-10-08 16:18:36.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:18:36 np0005476733 nova_compute[192580]: 2025-10-08 16:18:36.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:18:37 np0005476733 nova_compute[192580]: 2025-10-08 16:18:37.488 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:18:37 np0005476733 nova_compute[192580]: 2025-10-08 16:18:37.489 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:18:37 np0005476733 nova_compute[192580]: 2025-10-08 16:18:37.489 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:18:37 np0005476733 nova_compute[192580]: 2025-10-08 16:18:37.490 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:18:38 np0005476733 nova_compute[192580]: 2025-10-08 16:18:38.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:38 np0005476733 nova_compute[192580]: 2025-10-08 16:18:38.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:39 np0005476733 systemd-logind[827]: New session 106 of user zuul.
Oct  8 12:18:39 np0005476733 systemd[1]: Started Session 106 of User zuul.
Oct  8 12:18:39 np0005476733 systemd[1]: session-106.scope: Deactivated successfully.
Oct  8 12:18:39 np0005476733 systemd-logind[827]: Session 106 logged out. Waiting for processes to exit.
Oct  8 12:18:39 np0005476733 systemd-logind[827]: Removed session 106.
Oct  8 12:18:40 np0005476733 nova_compute[192580]: 2025-10-08 16:18:40.517 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [{"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:18:40 np0005476733 nova_compute[192580]: 2025-10-08 16:18:40.535 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:18:40 np0005476733 nova_compute[192580]: 2025-10-08 16:18:40.535 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:18:40 np0005476733 nova_compute[192580]: 2025-10-08 16:18:40.536 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:43 np0005476733 podman[257313]: 2025-10-08 16:18:43.238778659 +0000 UTC m=+0.059388110 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:18:43 np0005476733 podman[257312]: 2025-10-08 16:18:43.241356561 +0000 UTC m=+0.061894480 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:18:43 np0005476733 podman[257314]: 2025-10-08 16:18:43.269904794 +0000 UTC m=+0.091163186 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm)
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:43.524 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:43.525 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.616 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.644 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.645 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.645 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.646 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.722 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.782 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.783 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.839 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:18:43 np0005476733 nova_compute[192580]: 2025-10-08 16:18:43.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.011 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.012 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13543MB free_disk=111.28640747070312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.013 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.013 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.437 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.438 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.438 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.495 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.511 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.513 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:18:44 np0005476733 nova_compute[192580]: 2025-10-08 16:18:44.513 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.485 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.828 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.829 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.830 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.830 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.830 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.832 2 INFO nova.compute.manager [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Terminating instance#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.832 2 DEBUG nova.compute.manager [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:18:47 np0005476733 kernel: tapeccc469f-f0 (unregistering): left promiscuous mode
Oct  8 12:18:47 np0005476733 NetworkManager[51699]: <info>  [1759940327.8586] device (tapeccc469f-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:18:47 np0005476733 ovn_controller[94857]: 2025-10-08T16:18:47Z|00837|binding|INFO|Releasing lport eccc469f-f05b-41f3-ad17-900281358d00 from this chassis (sb_readonly=0)
Oct  8 12:18:47 np0005476733 ovn_controller[94857]: 2025-10-08T16:18:47Z|00838|binding|INFO|Setting lport eccc469f-f05b-41f3-ad17-900281358d00 down in Southbound
Oct  8 12:18:47 np0005476733 ovn_controller[94857]: 2025-10-08T16:18:47Z|00839|binding|INFO|Removing iface tapeccc469f-f0 ovn-installed in OVS
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:47.877 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:65:ad 10.100.0.19'], port_security=['fa:16:3e:e4:65:ad 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6563b284-b73c-434a-b0ec-7119c8f265ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67e260af-4f33-4a56-b3c9-23ccc3032d2a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=eccc469f-f05b-41f3-ad17-900281358d00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:18:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:47.878 103739 INFO neutron.agent.ovn.metadata.agent [-] Port eccc469f-f05b-41f3-ad17-900281358d00 in datapath a60d5119-22ed-4506-b21f-c7850a67e1ca unbound from our chassis#033[00m
Oct  8 12:18:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:47.880 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a60d5119-22ed-4506-b21f-c7850a67e1ca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:18:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:47.881 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73ddb02b-bf64-42f5-bd29-33a43d4cb9a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:47.882 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca namespace which is not needed anymore#033[00m
Oct  8 12:18:47 np0005476733 nova_compute[192580]: 2025-10-08 16:18:47.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:47 np0005476733 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000055.scope: Deactivated successfully.
Oct  8 12:18:47 np0005476733 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000055.scope: Consumed 6.616s CPU time.
Oct  8 12:18:47 np0005476733 systemd-machined[152624]: Machine qemu-53-instance-00000055 terminated.
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [NOTICE]   (255515) : haproxy version is 2.8.14-c23fe91
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [NOTICE]   (255515) : path to executable is /usr/sbin/haproxy
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [WARNING]  (255515) : Exiting Master process...
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [WARNING]  (255515) : Exiting Master process...
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [ALERT]    (255515) : Current worker (255517) exited with code 143 (Terminated)
Oct  8 12:18:48 np0005476733 neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca[255511]: [WARNING]  (255515) : All workers exited. Exiting... (0)
Oct  8 12:18:48 np0005476733 systemd[1]: libpod-d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879.scope: Deactivated successfully.
Oct  8 12:18:48 np0005476733 conmon[255511]: conmon d1f2fb68e8f03199a411 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879.scope/container/memory.events
Oct  8 12:18:48 np0005476733 podman[257400]: 2025-10-08 16:18:48.026904789 +0000 UTC m=+0.056996712 container died d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:18:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879-userdata-shm.mount: Deactivated successfully.
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-139b1260890aa9c749e4be18ca87e2503f21de7326d2843f59d1a51473eccd5b-merged.mount: Deactivated successfully.
Oct  8 12:18:48 np0005476733 podman[257400]: 2025-10-08 16:18:48.080705729 +0000 UTC m=+0.110797662 container cleanup d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 systemd[1]: libpod-conmon-d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879.scope: Deactivated successfully.
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.132 2 INFO nova.virt.libvirt.driver [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Instance destroyed successfully.#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.133 2 DEBUG nova.objects.instance [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'resources' on Instance uuid 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:18:48 np0005476733 podman[257440]: 2025-10-08 16:18:48.159781298 +0000 UTC m=+0.052918693 container remove d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.165 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9f062246-884d-49a7-87ef-fc439dfd8cc2]: (4, ('Wed Oct  8 04:18:47 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca (d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879)\nd1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879\nWed Oct  8 04:18:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca (d1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879)\nd1f2fb68e8f03199a411e91b94351f24e0331217f388d280ae34d4a4bbfe4879\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.167 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86b5add0-39c9-48b2-90fe-e5e8a43cad74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.168 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa60d5119-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 kernel: tapa60d5119-20: left promiscuous mode
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.190 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[397466b4-e151-43a9-926a-de6a02133e10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.225 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3b93fe20-529e-4b01-beb2-6de6ab14e8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.227 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[09889196-fb01-4a1e-90b6-f1035b7a107b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.247 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4419fbee-49f1-4a2e-b1c0-91aee10b2a26]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706533, 'reachable_time': 20441, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257463, 'error': None, 'target': 'ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.250 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a60d5119-22ed-4506-b21f-c7850a67e1ca deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:18:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:48.250 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5e83ba5b-69b3-4309-be2b-a9d3939f33e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:18:48 np0005476733 systemd[1]: run-netns-ovnmeta\x2da60d5119\x2d22ed\x2d4506\x2db21f\x2dc7850a67e1ca.mount: Deactivated successfully.
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.291 2 DEBUG nova.virt.libvirt.vif [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-08T16:15:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-1716699684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1716699684',id=85,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN+zrgavmRKdnUQXrKVYbBp+WYlZmJhFQsa1kl3Dxiu3QX4zg/ahS8IRtri5xncBe6cTI7KlCkUrKeQ2+JI5DxlFw723YYmaC31z8s6e9ZieApUJckBa+6MT9ksK1drFCw==',key_name='tempest-keypair-1043886460',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:15:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-b38at89j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',clean_attempts='1',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input
_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:16:58Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.292 2 DEBUG nova.network.os_vif_util [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.292 2 DEBUG nova.network.os_vif_util [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.293 2 DEBUG os_vif [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.294 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeccc469f-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.300 2 INFO os_vif [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:65:ad,bridge_name='br-int',has_traffic_filtering=True,id=eccc469f-f05b-41f3-ad17-900281358d00,network=Network(a60d5119-22ed-4506-b21f-c7850a67e1ca),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeccc469f-f0')#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.300 2 INFO nova.virt.libvirt.driver [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Deleting instance files /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3_del#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.301 2 INFO nova.virt.libvirt.driver [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Deletion of /var/lib/nova/instances/9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3_del complete#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.430 2 INFO nova.compute.manager [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.430 2 DEBUG oslo.service.loopingcall [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.431 2 DEBUG nova.compute.manager [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.431 2 DEBUG nova.network.neutron [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.448 2 DEBUG nova.compute.manager [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received event network-changed-eccc469f-f05b-41f3-ad17-900281358d00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.448 2 DEBUG nova.compute.manager [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Refreshing instance network info cache due to event network-changed-eccc469f-f05b-41f3-ad17-900281358d00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.449 2 DEBUG oslo_concurrency.lockutils [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.449 2 DEBUG oslo_concurrency.lockutils [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:18:48 np0005476733 nova_compute[192580]: 2025-10-08 16:18:48.449 2 DEBUG nova.network.neutron [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Refreshing network info cache for port eccc469f-f05b-41f3-ad17-900281358d00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.492 2 DEBUG nova.network.neutron [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.520 2 INFO nova.compute.manager [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Took 1.09 seconds to deallocate network for instance.#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.577 2 DEBUG nova.compute.manager [req-5d9337ca-b618-4568-b149-a38a470798ad req-619eef92-24d8-4540-be5f-b3fa3461663f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received event network-vif-deleted-eccc469f-f05b-41f3-ad17-900281358d00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.581 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.581 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.646 2 DEBUG nova.compute.provider_tree [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.663 2 DEBUG nova.scheduler.client.report [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.684 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.709 2 INFO nova.scheduler.client.report [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Deleted allocations for instance 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3#033[00m
Oct  8 12:18:49 np0005476733 nova_compute[192580]: 2025-10-08 16:18:49.806 2 DEBUG oslo_concurrency.lockutils [None req-e11088ca-eed9-41dd-a5d0-d1eb312162ca 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.181 2 DEBUG nova.network.neutron [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updated VIF entry in instance network info cache for port eccc469f-f05b-41f3-ad17-900281358d00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.182 2 DEBUG nova.network.neutron [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Updating instance_info_cache with network_info: [{"id": "eccc469f-f05b-41f3-ad17-900281358d00", "address": "fa:16:3e:e4:65:ad", "network": {"id": "a60d5119-22ed-4506-b21f-c7850a67e1ca", "bridge": "br-int", "label": "tempest-test-network--1153265288", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeccc469f-f0", "ovs_interfaceid": "eccc469f-f05b-41f3-ad17-900281358d00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.210 2 DEBUG oslo_concurrency.lockutils [req-9194993f-2389-4265-b6cb-fddd624bcae5 req-eaaed09c-2761-4d6d-b016-e8e3fba4b0f0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.571 2 DEBUG nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received event network-vif-unplugged-eccc469f-f05b-41f3-ad17-900281358d00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.572 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.572 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.572 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.573 2 DEBUG nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] No waiting events found dispatching network-vif-unplugged-eccc469f-f05b-41f3-ad17-900281358d00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.573 2 WARNING nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received unexpected event network-vif-unplugged-eccc469f-f05b-41f3-ad17-900281358d00 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.573 2 DEBUG nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received event network-vif-plugged-eccc469f-f05b-41f3-ad17-900281358d00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.573 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.574 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.574 2 DEBUG oslo_concurrency.lockutils [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.574 2 DEBUG nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] No waiting events found dispatching network-vif-plugged-eccc469f-f05b-41f3-ad17-900281358d00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:18:50 np0005476733 nova_compute[192580]: 2025-10-08 16:18:50.574 2 WARNING nova.compute.manager [req-68fd9574-61a3-4d15-a760-f4a1b289c644 req-33eaf8c3-4032-41c5-b778-ccd90a88f4f8 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Received unexpected event network-vif-plugged-eccc469f-f05b-41f3-ad17-900281358d00 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:18:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:18:51.529 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:18:52 np0005476733 podman[257464]: 2025-10-08 16:18:52.260533247 +0000 UTC m=+0.075567370 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:18:52 np0005476733 podman[257465]: 2025-10-08 16:18:52.268568002 +0000 UTC m=+0.075006722 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:18:53 np0005476733 nova_compute[192580]: 2025-10-08 16:18:53.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:53 np0005476733 nova_compute[192580]: 2025-10-08 16:18:53.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:53 np0005476733 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  8 12:18:53 np0005476733 nova_compute[192580]: 2025-10-08 16:18:53.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:58 np0005476733 nova_compute[192580]: 2025-10-08 16:18:58.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:18:58 np0005476733 nova_compute[192580]: 2025-10-08 16:18:58.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:00 np0005476733 nova_compute[192580]: 2025-10-08 16:19:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:02 np0005476733 podman[257511]: 2025-10-08 16:19:02.218453057 +0000 UTC m=+0.052613151 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 12:19:03 np0005476733 nova_compute[192580]: 2025-10-08 16:19:03.133 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940328.13094, 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:19:03 np0005476733 nova_compute[192580]: 2025-10-08 16:19:03.133 2 INFO nova.compute.manager [-] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:19:03 np0005476733 nova_compute[192580]: 2025-10-08 16:19:03.171 2 DEBUG nova.compute.manager [None req-ad7b16fb-d5ac-4621-95fe-ab035ff3ca03 - - - - - -] [instance: 9e14e3e5-f2ec-41a4-adfe-e4d5e7823bb3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:19:03 np0005476733 nova_compute[192580]: 2025-10-08 16:19:03.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:03 np0005476733 nova_compute[192580]: 2025-10-08 16:19:03.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.409 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.410 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.439 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.546 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.547 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.555 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.555 2 INFO nova.compute.claims [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.707 2 DEBUG nova.compute.provider_tree [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.730 2 DEBUG nova.scheduler.client.report [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.771 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.772 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.849 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.850 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.879 2 INFO nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:19:05 np0005476733 nova_compute[192580]: 2025-10-08 16:19:05.907 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.024 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.025 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.026 2 INFO nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Creating image(s)#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.026 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.026 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.027 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.039 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.096 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.097 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.098 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.114 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.171 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.172 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.204 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.205 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.206 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.260 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.261 2 DEBUG nova.virt.disk.api [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Checking if we can resize image /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.261 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.318 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.319 2 DEBUG nova.virt.disk.api [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Cannot resize image /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.319 2 DEBUG nova.objects.instance [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lazy-loading 'migration_context' on Instance uuid e9519041-3cf1-40a3-8654-4ed813d2c48a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.343 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.344 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Ensure instance console log exists: /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.344 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.344 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.345 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:06 np0005476733 nova_compute[192580]: 2025-10-08 16:19:06.617 2 DEBUG nova.policy [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:19:07 np0005476733 podman[257547]: 2025-10-08 16:19:07.224126454 +0000 UTC m=+0.055349288 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:19:07 np0005476733 podman[257546]: 2025-10-08 16:19:07.24449377 +0000 UTC m=+0.078467972 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:19:08 np0005476733 nova_compute[192580]: 2025-10-08 16:19:08.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:08 np0005476733 nova_compute[192580]: 2025-10-08 16:19:08.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:08 np0005476733 nova_compute[192580]: 2025-10-08 16:19:08.601 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Successfully created port: 091f9564-aabf-4b53-8693-9d3b05911b76 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:19:09 np0005476733 nova_compute[192580]: 2025-10-08 16:19:09.987 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Successfully updated port: 091f9564-aabf-4b53-8693-9d3b05911b76 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.016 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.016 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.016 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.151 2 DEBUG nova.compute.manager [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.152 2 DEBUG nova.compute.manager [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing instance network info cache due to event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.153 2 DEBUG oslo_concurrency.lockutils [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:19:10 np0005476733 nova_compute[192580]: 2025-10-08 16:19:10.249 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.187 2 DEBUG nova.network.neutron [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.211 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.211 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Instance network_info: |[{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.211 2 DEBUG oslo_concurrency.lockutils [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.212 2 DEBUG nova.network.neutron [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.214 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Start _get_guest_xml network_info=[{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.219 2 WARNING nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.227 2 DEBUG nova.virt.libvirt.host [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.228 2 DEBUG nova.virt.libvirt.host [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.232 2 DEBUG nova.virt.libvirt.host [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.233 2 DEBUG nova.virt.libvirt.host [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.233 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.233 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.234 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.234 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.234 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.234 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.234 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.235 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.235 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.235 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.235 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.235 2 DEBUG nova.virt.hardware [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.239 2 DEBUG nova.virt.libvirt.vif [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-40326179',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-40326179',id=87,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-7499jz6f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:19:05Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=e9519041-3cf1-40a3-8654-4ed813d2c48a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.240 2 DEBUG nova.network.os_vif_util [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converting VIF {"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.240 2 DEBUG nova.network.os_vif_util [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.241 2 DEBUG nova.objects.instance [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9519041-3cf1-40a3-8654-4ed813d2c48a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.261 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <uuid>e9519041-3cf1-40a3-8654-4ed813d2c48a</uuid>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <name>instance-00000057</name>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-40326179</nova:name>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:19:11</nova:creationTime>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:user uuid="2bdd69fe495b499fbadd2e2b8da36c6f">tempest-OvnDvrTest-1354773760-project-member</nova:user>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:project uuid="0433a72056854da48c168f13bcf53e59">tempest-OvnDvrTest-1354773760</nova:project>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        <nova:port uuid="091f9564-aabf-4b53-8693-9d3b05911b76">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.44" ipVersion="4"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="serial">e9519041-3cf1-40a3-8654-4ed813d2c48a</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="uuid">e9519041-3cf1-40a3-8654-4ed813d2c48a</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.config"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:36:43:2f"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <target dev="tap091f9564-aa"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/console.log" append="off"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:19:11 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:19:11 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:19:11 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:19:11 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.262 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Preparing to wait for external event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.263 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.263 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.263 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.264 2 DEBUG nova.virt.libvirt.vif [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-40326179',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-40326179',id=87,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-7499jz6f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='
1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:19:05Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=e9519041-3cf1-40a3-8654-4ed813d2c48a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.264 2 DEBUG nova.network.os_vif_util [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converting VIF {"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.265 2 DEBUG nova.network.os_vif_util [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.265 2 DEBUG os_vif [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap091f9564-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.269 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap091f9564-aa, col_values=(('external_ids', {'iface-id': '091f9564-aabf-4b53-8693-9d3b05911b76', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:43:2f', 'vm-uuid': 'e9519041-3cf1-40a3-8654-4ed813d2c48a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:11 np0005476733 NetworkManager[51699]: <info>  [1759940351.3074] manager: (tap091f9564-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.313 2 INFO os_vif [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa')#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.385 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.386 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.386 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] No VIF found with MAC fa:16:3e:36:43:2f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.387 2 INFO nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Using config drive#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.764 2 INFO nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Creating config drive at /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.config#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.768 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf6_fzxau execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.901 2 DEBUG oslo_concurrency.processutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf6_fzxau" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:11 np0005476733 NetworkManager[51699]: <info>  [1759940351.9735] manager: (tap091f9564-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Oct  8 12:19:11 np0005476733 kernel: tap091f9564-aa: entered promiscuous mode
Oct  8 12:19:11 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:11Z|00840|binding|INFO|Claiming lport 091f9564-aabf-4b53-8693-9d3b05911b76 for this chassis.
Oct  8 12:19:11 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:11Z|00841|binding|INFO|091f9564-aabf-4b53-8693-9d3b05911b76: Claiming fa:16:3e:36:43:2f 10.100.0.44
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.982 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:43:2f 10.100.0.44'], port_security=['fa:16:3e:36:43:2f 10.100.0.44'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.44/28', 'neutron:device_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0433a72056854da48c168f13bcf53e59', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6694f32d-684d-42bf-9b26-d5a5c55c3f82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbb6917-eb54-40f6-a6ab-44006c4799a4, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=091f9564-aabf-4b53-8693-9d3b05911b76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:19:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.983 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 091f9564-aabf-4b53-8693-9d3b05911b76 in datapath ec0b6c7f-9518-401a-b9ba-5148c7d4432d bound to our chassis#033[00m
Oct  8 12:19:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.984 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec0b6c7f-9518-401a-b9ba-5148c7d4432d#033[00m
Oct  8 12:19:11 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:11Z|00842|binding|INFO|Setting lport 091f9564-aabf-4b53-8693-9d3b05911b76 ovn-installed in OVS
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:11Z|00843|binding|INFO|Setting lport 091f9564-aabf-4b53-8693-9d3b05911b76 up in Southbound
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 nova_compute[192580]: 2025-10-08 16:19:11.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.997 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4067628d-ca46-4980-9b18-a799f7d31268]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.998 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec0b6c7f-91 in ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:11.999 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec0b6c7f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.000 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bde90a-2b83-4492-82e4-94b31a8024e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.001 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[10411270-db5f-4608-8f67-86c2caf26d13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 systemd-udevd[257611]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.013 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[eeacb031-cb81-4388-ac6f-160606a50714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 systemd-machined[152624]: New machine qemu-54-instance-00000057.
Oct  8 12:19:12 np0005476733 NetworkManager[51699]: <info>  [1759940352.0240] device (tap091f9564-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:19:12 np0005476733 NetworkManager[51699]: <info>  [1759940352.0248] device (tap091f9564-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:19:12 np0005476733 systemd[1]: Started Virtual Machine qemu-54-instance-00000057.
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.031 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb3374c-415a-44ea-921d-2da396a10f5c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.060 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[afb60e2c-07f3-4312-9ce7-ef55a2eee8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 NetworkManager[51699]: <info>  [1759940352.0676] manager: (tapec0b6c7f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.066 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db5589a7-4dd2-4d79-9fec-224f46f42827]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 systemd-udevd[257616]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.098 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e24e39-4ca6-4ce1-a318-bb7920761515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.102 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ccaa3541-4b8a-47b3-abcd-6c4a537ecf35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 NetworkManager[51699]: <info>  [1759940352.1218] device (tapec0b6c7f-90): carrier: link connected
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.125 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[df3b6497-1717-4c15-a5e8-c06644822f88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.142 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a879d533-1160-4b6f-a5f0-0abef77118cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec0b6c7f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:4d:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725578, 'reachable_time': 19173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257644, 'error': None, 'target': 'ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.155 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1a92826f-47e9-406f-aa24-c50d131faf83]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:4dab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 725578, 'tstamp': 725578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257645, 'error': None, 'target': 'ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.178 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[622dea05-22db-4072-a768-7620d0acb6b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec0b6c7f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:4d:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725578, 'reachable_time': 19173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257646, 'error': None, 'target': 'ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.201 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7f845cc2-6783-4482-9da4-74ae698ea23e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.269 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[86f4bdeb-5a76-4ff3-b84a-664f0ce88af0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.270 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec0b6c7f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.270 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.271 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec0b6c7f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:12 np0005476733 kernel: tapec0b6c7f-90: entered promiscuous mode
Oct  8 12:19:12 np0005476733 NetworkManager[51699]: <info>  [1759940352.2734] manager: (tapec0b6c7f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.277 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec0b6c7f-90, col_values=(('external_ids', {'iface-id': 'e7df1276-2f2e-438e-bf88-93bd833d95ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.273 2 DEBUG nova.compute.manager [req-0cac7b34-f209-431a-9260-613dba200b82 req-26e1150a-1c77-48c2-b97b-a7d8edf1e119 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.274 2 DEBUG oslo_concurrency.lockutils [req-0cac7b34-f209-431a-9260-613dba200b82 req-26e1150a-1c77-48c2-b97b-a7d8edf1e119 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.275 2 DEBUG oslo_concurrency.lockutils [req-0cac7b34-f209-431a-9260-613dba200b82 req-26e1150a-1c77-48c2-b97b-a7d8edf1e119 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.275 2 DEBUG oslo_concurrency.lockutils [req-0cac7b34-f209-431a-9260-613dba200b82 req-26e1150a-1c77-48c2-b97b-a7d8edf1e119 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.275 2 DEBUG nova.compute.manager [req-0cac7b34-f209-431a-9260-613dba200b82 req-26e1150a-1c77-48c2-b97b-a7d8edf1e119 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Processing event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:12 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:12Z|00844|binding|INFO|Releasing lport e7df1276-2f2e-438e-bf88-93bd833d95ea from this chassis (sb_readonly=0)
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.282 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec0b6c7f-9518-401a-b9ba-5148c7d4432d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec0b6c7f-9518-401a-b9ba-5148c7d4432d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:19:12 np0005476733 nova_compute[192580]: 2025-10-08 16:19:12.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.292 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a4266b91-fe68-4dbf-8d9c-71c7aaa2f415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.294 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-ec0b6c7f-9518-401a-b9ba-5148c7d4432d
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/ec0b6c7f-9518-401a-b9ba-5148c7d4432d.pid.haproxy
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID ec0b6c7f-9518-401a-b9ba-5148c7d4432d
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:19:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:12.295 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'env', 'PROCESS_TAG=haproxy-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec0b6c7f-9518-401a-b9ba-5148c7d4432d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:19:12 np0005476733 podman[257678]: 2025-10-08 16:19:12.635350035 +0000 UTC m=+0.039617388 container create ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:19:12 np0005476733 systemd[1]: Started libpod-conmon-ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33.scope.
Oct  8 12:19:12 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:19:12 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca212dd72a75d738c7fdb66c2c61670f0352555d5268e25b0671587380f31c35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:19:12 np0005476733 podman[257678]: 2025-10-08 16:19:12.703139998 +0000 UTC m=+0.107407381 container init ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:19:12 np0005476733 podman[257678]: 2025-10-08 16:19:12.708404815 +0000 UTC m=+0.112672178 container start ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:19:12 np0005476733 podman[257678]: 2025-10-08 16:19:12.616916871 +0000 UTC m=+0.021184234 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:19:12 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [NOTICE]   (257697) : New worker (257699) forked
Oct  8 12:19:12 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [NOTICE]   (257697) : Loading success.
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.234 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940353.2343946, e9519041-3cf1-40a3-8654-4ed813d2c48a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.235 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] VM Started (Lifecycle Event)#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.237 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.240 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.243 2 INFO nova.virt.libvirt.driver [-] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Instance spawned successfully.#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.243 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.283 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.288 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.292 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.292 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.292 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.293 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.293 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.294 2 DEBUG nova.virt.libvirt.driver [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.338 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.339 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940353.2345564, e9519041-3cf1-40a3-8654-4ed813d2c48a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.340 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.386 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.390 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940353.2394304, e9519041-3cf1-40a3-8654-4ed813d2c48a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.391 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.406 2 INFO nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Took 7.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.407 2 DEBUG nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.419 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.422 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.469 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.494 2 INFO nova.compute.manager [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Took 7.99 seconds to build instance.#033[00m
Oct  8 12:19:13 np0005476733 nova_compute[192580]: 2025-10-08 16:19:13.516 2 DEBUG oslo_concurrency.lockutils [None req-e46abf15-00e5-4460-a8f5-015955586cc0 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:14 np0005476733 podman[257716]: 2025-10-08 16:19:14.220753685 +0000 UTC m=+0.048593634 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:19:14 np0005476733 podman[257715]: 2025-10-08 16:19:14.2375816 +0000 UTC m=+0.066321237 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Oct  8 12:19:14 np0005476733 podman[257717]: 2025-10-08 16:19:14.237747285 +0000 UTC m=+0.062940010 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.604 2 DEBUG nova.compute.manager [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.606 2 DEBUG oslo_concurrency.lockutils [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.607 2 DEBUG oslo_concurrency.lockutils [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.607 2 DEBUG oslo_concurrency.lockutils [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.608 2 DEBUG nova.compute.manager [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.608 2 WARNING nova.compute.manager [req-9b8cfde9-ab4c-46e1-897b-a9a1b589ae95 req-a8626278-052c-4879-8858-bc20c5b11c00 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.637 2 DEBUG nova.network.neutron [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updated VIF entry in instance network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.638 2 DEBUG nova.network.neutron [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:19:14 np0005476733 nova_compute[192580]: 2025-10-08 16:19:14.661 2 DEBUG oslo_concurrency.lockutils [req-2a3d80ab-0f81-4a27-9bc3-0ccf3466d428 req-0ec6ff67-5173-4e83-b88f-928bd02c36d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:19:16 np0005476733 nova_compute[192580]: 2025-10-08 16:19:16.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:18 np0005476733 nova_compute[192580]: 2025-10-08 16:19:18.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:20Z|00845|pinctrl|WARN|Dropped 777 log messages in last 55 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 12:19:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:20Z|00846|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.902 2 DEBUG nova.compute.manager [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.902 2 DEBUG nova.compute.manager [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing instance network info cache due to event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.903 2 DEBUG oslo_concurrency.lockutils [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.903 2 DEBUG oslo_concurrency.lockutils [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:19:21 np0005476733 nova_compute[192580]: 2025-10-08 16:19:21.903 2 DEBUG nova.network.neutron [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:19:22 np0005476733 nova_compute[192580]: 2025-10-08 16:19:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:23 np0005476733 podman[257775]: 2025-10-08 16:19:23.215741146 +0000 UTC m=+0.049423159 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:19:23 np0005476733 podman[257776]: 2025-10-08 16:19:23.218745282 +0000 UTC m=+0.049699149 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:19:23 np0005476733 nova_compute[192580]: 2025-10-08 16:19:23.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:23 np0005476733 systemd-logind[827]: New session 107 of user zuul.
Oct  8 12:19:23 np0005476733 systemd[1]: Started Session 107 of User zuul.
Oct  8 12:19:23 np0005476733 systemd-logind[827]: New session 108 of user zuul.
Oct  8 12:19:23 np0005476733 systemd[1]: Started Session 108 of User zuul.
Oct  8 12:19:24 np0005476733 systemd[1]: session-108.scope: Deactivated successfully.
Oct  8 12:19:24 np0005476733 systemd-logind[827]: Session 108 logged out. Waiting for processes to exit.
Oct  8 12:19:24 np0005476733 systemd-logind[827]: Removed session 108.
Oct  8 12:19:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:24Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:43:2f 10.100.0.44
Oct  8 12:19:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:19:24Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:43:2f 10.100.0.44
Oct  8 12:19:25 np0005476733 nova_compute[192580]: 2025-10-08 16:19:25.173 2 DEBUG nova.network.neutron [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updated VIF entry in instance network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:19:25 np0005476733 nova_compute[192580]: 2025-10-08 16:19:25.174 2 DEBUG nova.network.neutron [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:19:25 np0005476733 nova_compute[192580]: 2025-10-08 16:19:25.205 2 DEBUG oslo_concurrency.lockutils [req-81db2c7c-4bcd-49e0-a83a-9a1e580d45fd req-fb3ecbb8-da06-4bb6-b22d-9db3d8fb19cf 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:26.377 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:26.378 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:26.379 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:26 np0005476733 nova_compute[192580]: 2025-10-08 16:19:26.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:28 np0005476733 nova_compute[192580]: 2025-10-08 16:19:28.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:29 np0005476733 nova_compute[192580]: 2025-10-08 16:19:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:31 np0005476733 nova_compute[192580]: 2025-10-08 16:19:31.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:32 np0005476733 nova_compute[192580]: 2025-10-08 16:19:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:33 np0005476733 podman[257897]: 2025-10-08 16:19:33.218158558 +0000 UTC m=+0.051035171 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct  8 12:19:33 np0005476733 nova_compute[192580]: 2025-10-08 16:19:33.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:34 np0005476733 nova_compute[192580]: 2025-10-08 16:19:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:34 np0005476733 nova_compute[192580]: 2025-10-08 16:19:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:34 np0005476733 nova_compute[192580]: 2025-10-08 16:19:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:19:36 np0005476733 nova_compute[192580]: 2025-10-08 16:19:36.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:38 np0005476733 systemd-logind[827]: New session 109 of user zuul.
Oct  8 12:19:38 np0005476733 systemd[1]: Started Session 109 of User zuul.
Oct  8 12:19:38 np0005476733 podman[257923]: 2025-10-08 16:19:38.116994526 +0000 UTC m=+0.093651914 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 12:19:38 np0005476733 podman[257921]: 2025-10-08 16:19:38.128387437 +0000 UTC m=+0.119625708 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:19:38 np0005476733 nova_compute[192580]: 2025-10-08 16:19:38.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:38 np0005476733 nova_compute[192580]: 2025-10-08 16:19:38.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:38 np0005476733 nova_compute[192580]: 2025-10-08 16:19:38.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:19:38 np0005476733 nova_compute[192580]: 2025-10-08 16:19:38.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:19:39 np0005476733 nova_compute[192580]: 2025-10-08 16:19:39.717 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:19:39 np0005476733 nova_compute[192580]: 2025-10-08 16:19:39.718 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:19:39 np0005476733 nova_compute[192580]: 2025-10-08 16:19:39.718 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:19:39 np0005476733 nova_compute[192580]: 2025-10-08 16:19:39.718 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e9519041-3cf1-40a3-8654-4ed813d2c48a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:19:41 np0005476733 nova_compute[192580]: 2025-10-08 16:19:41.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:43 np0005476733 nova_compute[192580]: 2025-10-08 16:19:43.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:44 np0005476733 nova_compute[192580]: 2025-10-08 16:19:44.755 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:19:44 np0005476733 nova_compute[192580]: 2025-10-08 16:19:44.789 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:19:44 np0005476733 nova_compute[192580]: 2025-10-08 16:19:44.790 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:19:44 np0005476733 nova_compute[192580]: 2025-10-08 16:19:44.790 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:45 np0005476733 podman[258001]: 2025-10-08 16:19:45.221561796 +0000 UTC m=+0.052440536 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:19:45 np0005476733 podman[258000]: 2025-10-08 16:19:45.221741032 +0000 UTC m=+0.053458528 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 12:19:45 np0005476733 podman[258002]: 2025-10-08 16:19:45.224885591 +0000 UTC m=+0.052234319 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Oct  8 12:19:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:45.582 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:19:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:45.582 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.641 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.641 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.642 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.642 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.738 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.809 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.810 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:19:45 np0005476733 nova_compute[192580]: 2025-10-08 16:19:45.907 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.059 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.062 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13513MB free_disk=111.28605651855469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.062 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.062 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.210 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e9519041-3cf1-40a3-8654-4ed813d2c48a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.210 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.212 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.254 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.297 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.328 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.329 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:19:46 np0005476733 nova_compute[192580]: 2025-10-08 16:19:46.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:48 np0005476733 nova_compute[192580]: 2025-10-08 16:19:48.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:48 np0005476733 nova_compute[192580]: 2025-10-08 16:19:48.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:49 np0005476733 nova_compute[192580]: 2025-10-08 16:19:49.330 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:19:51 np0005476733 nova_compute[192580]: 2025-10-08 16:19:51.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:53 np0005476733 nova_compute[192580]: 2025-10-08 16:19:53.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:19:54 np0005476733 podman[258074]: 2025-10-08 16:19:54.262790986 +0000 UTC m=+0.056357051 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=iscsid, container_name=iscsid)
Oct  8 12:19:54 np0005476733 podman[258075]: 2025-10-08 16:19:54.262995612 +0000 UTC m=+0.053804229 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:19:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:19:54.584 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:19:56 np0005476733 nova_compute[192580]: 2025-10-08 16:19:56.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:19:58 np0005476733 nova_compute[192580]: 2025-10-08 16:19:58.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:01 np0005476733 nova_compute[192580]: 2025-10-08 16:20:01.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:03 np0005476733 nova_compute[192580]: 2025-10-08 16:20:03.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:04 np0005476733 podman[258124]: 2025-10-08 16:20:04.228028601 +0000 UTC m=+0.052535120 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 12:20:06 np0005476733 nova_compute[192580]: 2025-10-08 16:20:06.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:07 np0005476733 systemd-logind[827]: New session 110 of user zuul.
Oct  8 12:20:07 np0005476733 systemd[1]: Started Session 110 of User zuul.
Oct  8 12:20:07 np0005476733 systemd-logind[827]: New session 111 of user zuul.
Oct  8 12:20:07 np0005476733 systemd[1]: Started Session 111 of User zuul.
Oct  8 12:20:07 np0005476733 systemd[1]: session-111.scope: Deactivated successfully.
Oct  8 12:20:07 np0005476733 systemd-logind[827]: Session 111 logged out. Waiting for processes to exit.
Oct  8 12:20:07 np0005476733 systemd-logind[827]: Removed session 111.
Oct  8 12:20:08 np0005476733 podman[258207]: 2025-10-08 16:20:08.274380644 +0000 UTC m=+0.087658894 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 12:20:08 np0005476733 podman[258206]: 2025-10-08 16:20:08.309331984 +0000 UTC m=+0.119506235 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:20:08 np0005476733 nova_compute[192580]: 2025-10-08 16:20:08.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:11 np0005476733 nova_compute[192580]: 2025-10-08 16:20:11.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:13 np0005476733 nova_compute[192580]: 2025-10-08 16:20:13.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:15 np0005476733 podman[258260]: 2025-10-08 16:20:15.715656494 +0000 UTC m=+0.045829585 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:20:15 np0005476733 podman[258259]: 2025-10-08 16:20:15.726894941 +0000 UTC m=+0.059616274 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:20:15 np0005476733 podman[258261]: 2025-10-08 16:20:15.732808359 +0000 UTC m=+0.060510272 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Oct  8 12:20:16 np0005476733 nova_compute[192580]: 2025-10-08 16:20:16.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:18 np0005476733 nova_compute[192580]: 2025-10-08 16:20:18.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:21 np0005476733 nova_compute[192580]: 2025-10-08 16:20:21.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:23 np0005476733 nova_compute[192580]: 2025-10-08 16:20:23.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:23 np0005476733 systemd-logind[827]: New session 112 of user zuul.
Oct  8 12:20:23 np0005476733 systemd[1]: Started Session 112 of User zuul.
Oct  8 12:20:24 np0005476733 nova_compute[192580]: 2025-10-08 16:20:24.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:25 np0005476733 podman[258353]: 2025-10-08 16:20:25.223017311 +0000 UTC m=+0.052094424 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:20:25 np0005476733 podman[258354]: 2025-10-08 16:20:25.223209047 +0000 UTC m=+0.052351073 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:20:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:25Z|00847|pinctrl|WARN|Dropped 535 log messages in last 65 seconds (most recently, 18 seconds ago) due to excessive rate
Oct  8 12:20:25 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:25Z|00848|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:20:26 np0005476733 systemd-logind[827]: New session 113 of user zuul.
Oct  8 12:20:26 np0005476733 systemd[1]: Started Session 113 of User zuul.
Oct  8 12:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:26.378 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:26.378 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:26.379 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:20:26 np0005476733 nova_compute[192580]: 2025-10-08 16:20:26.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:26 np0005476733 systemd-logind[827]: New session 114 of user zuul.
Oct  8 12:20:26 np0005476733 systemd[1]: Started Session 114 of User zuul.
Oct  8 12:20:26 np0005476733 systemd[1]: session-114.scope: Deactivated successfully.
Oct  8 12:20:26 np0005476733 systemd-logind[827]: Session 114 logged out. Waiting for processes to exit.
Oct  8 12:20:26 np0005476733 systemd-logind[827]: Removed session 114.
Oct  8 12:20:28 np0005476733 nova_compute[192580]: 2025-10-08 16:20:28.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:30 np0005476733 nova_compute[192580]: 2025-10-08 16:20:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:31 np0005476733 nova_compute[192580]: 2025-10-08 16:20:31.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:32 np0005476733 nova_compute[192580]: 2025-10-08 16:20:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:33 np0005476733 nova_compute[192580]: 2025-10-08 16:20:33.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:34 np0005476733 nova_compute[192580]: 2025-10-08 16:20:34.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:35Z|00849|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Oct  8 12:20:35 np0005476733 podman[258470]: 2025-10-08 16:20:35.257976727 +0000 UTC m=+0.085471665 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:20:35 np0005476733 systemd-logind[827]: New session 115 of user zuul.
Oct  8 12:20:35 np0005476733 nova_compute[192580]: 2025-10-08 16:20:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:35 np0005476733 nova_compute[192580]: 2025-10-08 16:20:35.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 12:20:35 np0005476733 systemd[1]: Started Session 115 of User zuul.
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.068 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'name': 'tempest-server-test-40326179', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000057', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0433a72056854da48c168f13bcf53e59', 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'hostId': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.070 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.088 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/memory.usage volume: 42.54296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ae45f97-d76f-47c7-aaa1-5703f26207fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.54296875, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'timestamp': '2025-10-08T16:20:36.070434', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b8b1a95c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.810888005, 'message_signature': '2a06b4fe84e38574549829af5e489f307d5ea614065259a48e1ca2612b80246e'}]}, 'timestamp': '2025-10-08 16:20:36.088887', '_unique_id': 'c14ea1ddb50b490ba9169a1c4d5e4b11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.090 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.110 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.bytes volume: 73105408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.111 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023a6cc1-4e30-49a8-b622-e616315b0982', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73105408, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.091559', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8b50fd4-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '4ace941a04c153f207e533541ebdae0cb45289120fb0580f718585b20216d96c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 
'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.091559', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8b5246a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '153d7d26d5ace232d970d8863d5025d4fe7ac11dba16b3b643971ef09cf34c86'}]}, 'timestamp': '2025-10-08 16:20:36.111681', '_unique_id': '63c13994b820494a8921b21966dd6ea1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.112 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.114 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.114 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-40326179>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-40326179>]
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.117 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e9519041-3cf1-40a3-8654-4ed813d2c48a / tap091f9564-aa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.117 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.outgoing.packets volume: 62 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26a3fa8-4f5e-465d-9e1f-2c57f445320e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 62, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.114628', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8b6248c-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': 'f5caa5a4a70e66ba7ecc63ec8e1d9d981f651f638710ba97a98b6708ab3a33d5'}]}, 'timestamp': '2025-10-08 16:20:36.118287', '_unique_id': 'ad03d40d68e94fcc846e2ed4f47af4b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.119 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.120 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/cpu volume: 11030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '300e6b0b-8823-4440-bcf6-d1d552aa05b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11030000000, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'timestamp': '2025-10-08T16:20:36.120621', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b8b691ce-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.810888005, 'message_signature': 'e6070a907cd6a6ec14d7b8194e7bd9fc922c641a6ebe4d27eaf21983a9de59cf'}]}, 'timestamp': '2025-10-08 16:20:36.120989', '_unique_id': 'c59a3249a71a4413ae8f82ee9d100727'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.122 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.123 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-40326179>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-40326179>]
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.123 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.latency volume: 471550468 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.123 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.latency volume: 45828135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3c1e7b5-2d68-4fa3-8e83-103c8a7cac86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 471550468, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.123377', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8b6fd9e-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '98a79e3143a84cda7e54a00b3a36aebf1eea05bfcdc05ca0b1ecbfd071a79a6a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45828135, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 
'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.123377', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8b70b86-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': 'aa80439ae5079589b9364e2f18fbe63de64f258c8c3ceab0482c8171b6290a4d'}]}, 'timestamp': '2025-10-08 16:20:36.124180', '_unique_id': '5bd6c8ed493d4656aff752459b7c945f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.126 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d23e4da-3476-4251-a646-69d5d4668c2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.126158', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8b7696e-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': 'c0378fd0fb259d8f9888f81e5f0ef4c71128de551f2fe7bf7bce46bef851f7ae'}]}, 'timestamp': '2025-10-08 16:20:36.126530', '_unique_id': 'e1aee026d71e4cb9bde23e89a824ccbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.128 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89f07417-3a1f-4c79-b200-432d35fc6d76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.128311', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8b7bf18-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': 'e45a901d86ea83edfbc721b5e90664f73691fc01799eadfd7165fb6c4275d97c'}]}, 'timestamp': '2025-10-08 16:20:36.128760', '_unique_id': '52a5b69f0eeb461485bbaf969c901622'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.130 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.bytes volume: 30493184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.132 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c15b8720-3cc6-4b1c-a52c-d522398b689a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30493184, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.130697', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8b844a6-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '2f9348aa62e3ea42aca149ba887c6dc8cdbc2d6df97ff9a7f5f5d98956ca4d16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 
'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.130697', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8b858ba-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '5ad3bbfaf71cc22991ece8b8cfa740c38c5b3325850d619bc0e261500dec4cd8'}]}, 'timestamp': '2025-10-08 16:20:36.132587', '_unique_id': 'c3d0b414f754460ea8319da362a7e102'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.outgoing.bytes volume: 8476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '870ebd86-f3e1-493b-a7e6-ce6030803b58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8476, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.134274', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8b8a45a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '6cd3bc772e825ebf97c81ba28fa7e7fd62e8b1eabd0808e65046cf314d97948e'}]}, 'timestamp': '2025-10-08 16:20:36.134517', '_unique_id': '282c14ec250f4a27a9687becf519a192'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.135 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.requests volume: 1098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.135 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '420ebaeb-835b-4927-8cca-217ecf578f1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1098, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.135636', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8b8d8ee-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': 'd5fb8df5b50d324fbb56cb202cef785ab20f5abba2a551b31cb30710230b8cf1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 
'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.135636', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8b8e172-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '375a535fb2cff14e536a0bdfff55f47b832213254ccf88aad1a646b8353b448a'}]}, 'timestamp': '2025-10-08 16:20:36.136065', '_unique_id': '85ac0f0166154a8db492d7e1a65a4936'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.148 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.149 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22fb19ef-ff37-4240-a1da-5dad2d8a1318', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.137402', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8bad838-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': '57926704cb8941b61585d48265967f14b413ba043608130d28b9f11b6f4cb391'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.137402', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8bae6c0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': '4697a652fdfc04b8d2880af43ec50590d155ff8594548fb6d537e61da5935c6e'}]}, 'timestamp': '2025-10-08 16:20:36.149366', '_unique_id': 'da3b9362eb7342ca994f05f233810614'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.151 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18ddc4b9-05a2-424a-957e-41b0bde077c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.151222', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8bb3c06-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '85409fac732bb351310c90c58df86500fb126dea15d3df60feb912297243173f'}]}, 'timestamp': '2025-10-08 16:20:36.151548', '_unique_id': '238ff994ef0143cb829f0fbe4e82f0ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.153 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f912e33-004f-4490-9ea8-40f1c3cfe811', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.153078', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8bb8544-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '83034ad77f0fbdb582033e29f09a650c550ef7284e5830743f661b69754212d3'}]}, 'timestamp': '2025-10-08 16:20:36.153424', '_unique_id': '48b9cdf748cf4175a3737520ffb99720'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.154 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.incoming.packets volume: 54 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3546ca0a-6c96-44a4-9a3c-485af05f4911', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 54, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.154874', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8bbca4a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': 'cf3adb34663a42f3c822b280f11e3ce39dec0e0799c16580874a20a7a8b6106c'}]}, 'timestamp': '2025-10-08 16:20:36.155215', '_unique_id': '04d135ac11184c4c80b22ddb0caaa022'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.156 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.latency volume: 2351901254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44bc6623-609f-423f-a241-df7333665625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2351901254, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.156721', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8bc122a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': 'c159f00208979deaaa554b8a546c5a55cddb6fab3da95be97a17dd66431c394a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.156721', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8bc1e32-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '7cc6c0eb7e5c1a9fa392378054fe0a6a88c679c3bca1fc3b6e962375900f182b'}]}, 'timestamp': '2025-10-08 16:20:36.157320', '_unique_id': 'f85ce3228bfc46f38908a49b5deb9abf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.158 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.159 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50388370-a1b8-4cca-9958-f62e83a00365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.158804', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8bc6388-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': 'abf9b8e57db024d785a1888076f6b9509af4d14cd7f31ac2a894a58bb2428d60'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.158804', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8bc6f90-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': '637a46196f10f771566baa08084921f297abb8923839d7402c880141391a3616'}]}, 'timestamp': '2025-10-08 16:20:36.159401', '_unique_id': '1cc9c2a8bd92428d9152f45a62164c6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.160 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.161 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1eeaed0-c9f5-483f-802b-a73e4fb9f121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.160949', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8bcb860-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': 'bbab11f2345fb29f6327c635704e7206f8a9bcfbe0ee339e86b2a2640d581853'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.160949', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8bcc328-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.860381557, 'message_signature': '49e47454331ceb91e8ae20e3705bb89ab7befda8c9e0574faf0cbcfb15bf2432'}]}, 'timestamp': '2025-10-08 16:20:36.161540', '_unique_id': '4a599f7da2bb4d61a54e06ed21d57287'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cb390dc-7981-4adf-b458-cb741bf938d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.163003', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8bd08b0-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '5a1776f65f67d11a67a34778299936a7c46b2645642232fe450c0cb56035c429'}]}, 'timestamp': '2025-10-08 16:20:36.163337', '_unique_id': 'f73479bf002045838e2e4678136617be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-40326179>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-40326179>]
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.165 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.incoming.bytes volume: 8556 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92883a22-0ca9-4e5c-ad38-f8133636ff95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8556, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.165511', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8bd6954-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '1c01b503e2d9d4a08c3cc55b7ec42c2558d0afece9d01f6370314ca20904a6ec'}]}, 'timestamp': '2025-10-08 16:20:36.165813', '_unique_id': '0445e461b95f4ccdb47f94493873f47e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.167 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.requests volume: 344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.167 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4aad4e7-2e07-4f59-9771-e4dc043f6dc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 344, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-vda', 'timestamp': '2025-10-08T16:20:36.167281', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8bdae96-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': 'd41498a53d3ed038351cbde681f44911e7ecabaaef1e93bfa6c6963de001c848'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a-sda', 'timestamp': '2025-10-08T16:20:36.167281', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'instance-00000057', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8bdb904-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.814539771, 'message_signature': '3a2ab0d0a53d62b3b7a97aa028548ebc4a3c611bbfe87009198d8e1c2480fb46'}]}, 'timestamp': '2025-10-08 16:20:36.167833', '_unique_id': '3b8e98c86c8745e6818147e65748d443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.169 12 DEBUG ceilometer.compute.pollsters [-] e9519041-3cf1-40a3-8654-4ed813d2c48a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50bc3e93-a676-4de1-84bc-8a6b217d6fec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2bdd69fe495b499fbadd2e2b8da36c6f', 'user_name': None, 'project_id': '0433a72056854da48c168f13bcf53e59', 'project_name': None, 'resource_id': 'instance-00000057-e9519041-3cf1-40a3-8654-4ed813d2c48a-tap091f9564-aa', 'timestamp': '2025-10-08T16:20:36.169415', 'resource_metadata': {'display_name': 'tempest-server-test-40326179', 'name': 'tap091f9564-aa', 'instance_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'instance_type': 'm1.nano', 'host': 'a5db891a3327c15f17a1b78b0565f23342c06055d2bdd9642610d6be', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '987b2db7-1d21-4b59-831a-1e8ace40589b', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}, 'image_ref': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:36:43:2f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap091f9564-aa'}, 'message_id': 'b8be021a-a462-11f0-9274-fa163ef67048', 'monotonic_time': 7339.837615624, 'message_signature': '25083b4ca5de2321ead60c7d2b1c5a7c1ee1ea769693242c7f8bcb1d639d4cf8'}]}, 'timestamp': '2025-10-08 16:20:36.169722', '_unique_id': 'e0324eb05f534f4c8a74ff91be9f2d24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.171 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:20:36.171 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-40326179>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-40326179>]
Oct  8 12:20:36 np0005476733 nova_compute[192580]: 2025-10-08 16:20:36.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:38 np0005476733 nova_compute[192580]: 2025-10-08 16:20:38.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:39 np0005476733 podman[258522]: 2025-10-08 16:20:39.25045624 +0000 UTC m=+0.070787189 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:20:39 np0005476733 podman[258521]: 2025-10-08 16:20:39.28765166 +0000 UTC m=+0.113438692 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.611 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.611 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e9519041-3cf1-40a3-8654-4ed813d2c48a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.691 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Check if temp file /var/lib/nova/instances/tmp6d8hw8wx exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Oct  8 12:20:39 np0005476733 nova_compute[192580]: 2025-10-08 16:20:39.692 2 DEBUG nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6d8hw8wx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e9519041-3cf1-40a3-8654-4ed813d2c48a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Oct  8 12:20:40 np0005476733 nova_compute[192580]: 2025-10-08 16:20:40.519 2 DEBUG oslo_concurrency.processutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:20:40 np0005476733 nova_compute[192580]: 2025-10-08 16:20:40.614 2 DEBUG oslo_concurrency.processutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:20:40 np0005476733 nova_compute[192580]: 2025-10-08 16:20:40.617 2 DEBUG oslo_concurrency.processutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:20:40 np0005476733 nova_compute[192580]: 2025-10-08 16:20:40.679 2 DEBUG oslo_concurrency.processutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:20:41 np0005476733 nova_compute[192580]: 2025-10-08 16:20:41.024 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 12:20:41 np0005476733 nova_compute[192580]: 2025-10-08 16:20:41.053 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 12:20:41 np0005476733 nova_compute[192580]: 2025-10-08 16:20:41.053 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  8 12:20:41 np0005476733 nova_compute[192580]: 2025-10-08 16:20:41.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:42 np0005476733 systemd[1]: Created slice User Slice of UID 42436.
Oct  8 12:20:42 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct  8 12:20:42 np0005476733 systemd-logind[827]: New session 116 of user nova.
Oct  8 12:20:42 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct  8 12:20:42 np0005476733 systemd[1]: Starting User Manager for UID 42436...
Oct  8 12:20:42 np0005476733 nova_compute[192580]: 2025-10-08 16:20:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:42 np0005476733 systemd[258582]: Queued start job for default target Main User Target.
Oct  8 12:20:42 np0005476733 systemd[258582]: Created slice User Application Slice.
Oct  8 12:20:42 np0005476733 systemd[258582]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 12:20:42 np0005476733 systemd[258582]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 12:20:42 np0005476733 systemd[258582]: Reached target Paths.
Oct  8 12:20:42 np0005476733 systemd[258582]: Reached target Timers.
Oct  8 12:20:42 np0005476733 systemd[258582]: Starting D-Bus User Message Bus Socket...
Oct  8 12:20:42 np0005476733 systemd[258582]: Starting Create User's Volatile Files and Directories...
Oct  8 12:20:42 np0005476733 systemd[258582]: Listening on D-Bus User Message Bus Socket.
Oct  8 12:20:42 np0005476733 systemd[258582]: Reached target Sockets.
Oct  8 12:20:42 np0005476733 systemd[258582]: Finished Create User's Volatile Files and Directories.
Oct  8 12:20:42 np0005476733 systemd[258582]: Reached target Basic System.
Oct  8 12:20:42 np0005476733 systemd[258582]: Reached target Main User Target.
Oct  8 12:20:42 np0005476733 systemd[258582]: Startup finished in 156ms.
Oct  8 12:20:42 np0005476733 systemd[1]: Started User Manager for UID 42436.
Oct  8 12:20:42 np0005476733 systemd[1]: Started Session 116 of User nova.
Oct  8 12:20:42 np0005476733 systemd[1]: session-116.scope: Deactivated successfully.
Oct  8 12:20:42 np0005476733 systemd-logind[827]: Session 116 logged out. Waiting for processes to exit.
Oct  8 12:20:42 np0005476733 systemd-logind[827]: Removed session 116.
Oct  8 12:20:43 np0005476733 nova_compute[192580]: 2025-10-08 16:20:43.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.124 2 DEBUG nova.compute.manager [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.125 2 DEBUG oslo_concurrency.lockutils [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.125 2 DEBUG oslo_concurrency.lockutils [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.126 2 DEBUG oslo_concurrency.lockutils [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.126 2 DEBUG nova.compute.manager [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.126 2 DEBUG nova.compute.manager [req-e018e1f8-2a9c-4428-bedb-86f3f262f1b1 req-462422ea-5bd0-4545-a804-e7f1816fe1aa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.545 2 INFO nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Took 4.86 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.545 2 DEBUG nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.566 2 DEBUG nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6d8hw8wx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e9519041-3cf1-40a3-8654-4ed813d2c48a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(6788ce4d-736a-4796-98a9-e0a5ffedb48a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.597 2 DEBUG nova.objects.instance [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid e9519041-3cf1-40a3-8654-4ed813d2c48a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.600 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.602 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.603 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.623 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.630 2 DEBUG nova.virt.libvirt.vif [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-40326179',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-40326179',id=87,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:19:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-7499jz6f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='u
sb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:19:13Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=e9519041-3cf1-40a3-8654-4ed813d2c48a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.631 2 DEBUG nova.network.os_vif_util [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.631 2 DEBUG nova.network.os_vif_util [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.632 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating guest XML with vif config: <interface type="ethernet">
Oct  8 12:20:45 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:36:43:2f"/>
Oct  8 12:20:45 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 12:20:45 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:20:45 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 12:20:45 np0005476733 nova_compute[192580]:  <target dev="tap091f9564-aa"/>
Oct  8 12:20:45 np0005476733 nova_compute[192580]: </interface>
Oct  8 12:20:45 np0005476733 nova_compute[192580]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.632 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.707 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.784 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.785 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:20:45 np0005476733 nova_compute[192580]: 2025-10-08 16:20:45.843 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.105 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.106 2 INFO nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Increasing downtime to 50 ms after 0 sec elapsed time
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.107 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.109 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13499MB free_disk=111.28604507446289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.109 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.110 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.186 2 INFO nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating resource usage from migration 6788ce4d-736a-4796-98a9-e0a5ffedb48a
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.216 2 INFO nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct  8 12:20:46 np0005476733 podman[258609]: 2025-10-08 16:20:46.272611312 +0000 UTC m=+0.079707011 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  8 12:20:46 np0005476733 podman[258611]: 2025-10-08 16:20:46.275131531 +0000 UTC m=+0.084592615 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git)
Oct  8 12:20:46 np0005476733 podman[258610]: 2025-10-08 16:20:46.294987942 +0000 UTC m=+0.102459173 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Migration 6788ce4d-736a-4796-98a9-e0a5ffedb48a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.334 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.378 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.396 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.416 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.416 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.719 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:20:46 np0005476733 nova_compute[192580]: 2025-10-08 16:20:46.720 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.197 2 DEBUG nova.compute.manager [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.198 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.198 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.199 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.199 2 DEBUG nova.compute.manager [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.199 2 WARNING nova.compute.manager [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.200 2 DEBUG nova.compute.manager [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.200 2 DEBUG nova.compute.manager [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing instance network info cache due to event network-changed-091f9564-aabf-4b53-8693-9d3b05911b76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.200 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.201 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.201 2 DEBUG nova.network.neutron [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Refreshing network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.224 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.224 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.409 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.728 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Oct  8 12:20:47 np0005476733 nova_compute[192580]: 2025-10-08 16:20:47.729 2 DEBUG nova.virt.libvirt.migration [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.076 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940448.0761738, e9519041-3cf1-40a3-8654-4ed813d2c48a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.078 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.112 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.118 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.141 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Oct  8 12:20:48 np0005476733 kernel: tap091f9564-aa (unregistering): left promiscuous mode
Oct  8 12:20:48 np0005476733 NetworkManager[51699]: <info>  [1759940448.2105] device (tap091f9564-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:48Z|00850|binding|INFO|Releasing lport 091f9564-aabf-4b53-8693-9d3b05911b76 from this chassis (sb_readonly=0)
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:48Z|00851|binding|INFO|Setting lport 091f9564-aabf-4b53-8693-9d3b05911b76 down in Southbound
Oct  8 12:20:48 np0005476733 ovn_controller[94857]: 2025-10-08T16:20:48Z|00852|binding|INFO|Removing iface tap091f9564-aa ovn-installed in OVS
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.237 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:43:2f 10.100.0.44'], port_security=['fa:16:3e:36:43:2f 10.100.0.44'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '329dd4f3-73f4-4bda-955c-e971074e916e'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.44/28', 'neutron:device_id': 'e9519041-3cf1-40a3-8654-4ed813d2c48a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0433a72056854da48c168f13bcf53e59', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6694f32d-684d-42bf-9b26-d5a5c55c3f82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbb6917-eb54-40f6-a6ab-44006c4799a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=091f9564-aabf-4b53-8693-9d3b05911b76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.239 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 091f9564-aabf-4b53-8693-9d3b05911b76 in datapath ec0b6c7f-9518-401a-b9ba-5148c7d4432d unbound from our chassis#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.241 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec0b6c7f-9518-401a-b9ba-5148c7d4432d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.242 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[348fbffb-0400-4611-b0e0-2e7c6f11f254]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.243 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d namespace which is not needed anymore#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:48 np0005476733 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000057.scope: Deactivated successfully.
Oct  8 12:20:48 np0005476733 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000057.scope: Consumed 16.138s CPU time.
Oct  8 12:20:48 np0005476733 systemd-machined[152624]: Machine qemu-54-instance-00000057 terminated.
Oct  8 12:20:48 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [NOTICE]   (257697) : haproxy version is 2.8.14-c23fe91
Oct  8 12:20:48 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [NOTICE]   (257697) : path to executable is /usr/sbin/haproxy
Oct  8 12:20:48 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [WARNING]  (257697) : Exiting Master process...
Oct  8 12:20:48 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [ALERT]    (257697) : Current worker (257699) exited with code 143 (Terminated)
Oct  8 12:20:48 np0005476733 neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d[257693]: [WARNING]  (257697) : All workers exited. Exiting... (0)
Oct  8 12:20:48 np0005476733 systemd[1]: libpod-ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33.scope: Deactivated successfully.
Oct  8 12:20:48 np0005476733 podman[258708]: 2025-10-08 16:20:48.38432362 +0000 UTC m=+0.047459918 container died ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:20:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33-userdata-shm.mount: Deactivated successfully.
Oct  8 12:20:48 np0005476733 systemd[1]: var-lib-containers-storage-overlay-ca212dd72a75d738c7fdb66c2c61670f0352555d5268e25b0671587380f31c35-merged.mount: Deactivated successfully.
Oct  8 12:20:48 np0005476733 podman[258708]: 2025-10-08 16:20:48.444134249 +0000 UTC m=+0.107270587 container cleanup ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:20:48 np0005476733 systemd[1]: libpod-conmon-ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33.scope: Deactivated successfully.
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.464 2 DEBUG nova.virt.libvirt.guest [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.465 2 INFO nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migration operation has completed#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.465 2 INFO nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] _post_live_migration() is started..#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.472 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.472 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.472 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Oct  8 12:20:48 np0005476733 podman[258755]: 2025-10-08 16:20:48.528972941 +0000 UTC m=+0.046439504 container remove ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.535 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5cffa417-be9f-4d1d-9bee-14a7a280f293]: (4, ('Wed Oct  8 04:20:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d (ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33)\nddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33\nWed Oct  8 04:20:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d (ddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33)\nddfc774ab8a88880577d383650880150f5d81e97c208ccc7fd174c6cdb264e33\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.537 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91ab6553-6a1f-4fb3-9144-048f9051fe3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.539 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec0b6c7f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:48 np0005476733 kernel: tapec0b6c7f-90: left promiscuous mode
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.576 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8833ae-cdeb-4983-8bfe-fba7fc571ffc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.603 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3978376e-1b7e-470c-92bc-d4e20e864c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.605 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1caf1d16-1fa3-4709-a4f1-cdc45e621c3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.628 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a4cc5c-27f6-4036-b6d6-405c2570e529]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 725571, 'reachable_time': 31749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258773, 'error': None, 'target': 'ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 systemd[1]: run-netns-ovnmeta\x2dec0b6c7f\x2d9518\x2d401a\x2db9ba\x2d5148c7d4432d.mount: Deactivated successfully.
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.633 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec0b6c7f-9518-401a-b9ba-5148c7d4432d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:20:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:48.633 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[db5c15dd-ead7-4486-9b36-94580329f869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.789 2 DEBUG nova.compute.manager [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.790 2 DEBUG oslo_concurrency.lockutils [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.791 2 DEBUG oslo_concurrency.lockutils [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.791 2 DEBUG oslo_concurrency.lockutils [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.792 2 DEBUG nova.compute.manager [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:48 np0005476733 nova_compute[192580]: 2025-10-08 16:20:48.792 2 DEBUG nova.compute.manager [req-33bcf37a-09a0-48cb-ba42-e03ed91ceaef req-4d95a1c9-9d05-43d3-aecf-cb5a3b6caac3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:49.098 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:49.099 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.185 2 DEBUG nova.network.neutron [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updated VIF entry in instance network info cache for port 091f9564-aabf-4b53-8693-9d3b05911b76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.185 2 DEBUG nova.network.neutron [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Updating instance_info_cache with network_info: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.213 2 DEBUG oslo_concurrency.lockutils [req-eb0a2855-96ed-4541-81e5-307d245086b7 req-04c34b37-862f-4c33-bcd0-37e7a67547b3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e9519041-3cf1-40a3-8654-4ed813d2c48a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.337 2 DEBUG nova.compute.manager [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.337 2 DEBUG oslo_concurrency.lockutils [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.338 2 DEBUG oslo_concurrency.lockutils [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.338 2 DEBUG oslo_concurrency.lockutils [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.338 2 DEBUG nova.compute.manager [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.338 2 DEBUG nova.compute.manager [req-d5a8bf0c-8660-417a-8d56-95b1413ce255 req-36f22fea-c89c-4f14-b463-491d06306eed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-unplugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.339 2 DEBUG nova.network.neutron [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Activated binding for port 091f9564-aabf-4b53-8693-9d3b05911b76 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.339 2 DEBUG nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.340 2 DEBUG nova.virt.libvirt.vif [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-40326179',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-40326179',id=87,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:19:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-7499jz6f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='u
sb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:20:36Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=e9519041-3cf1-40a3-8654-4ed813d2c48a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.340 2 DEBUG nova.network.os_vif_util [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "091f9564-aabf-4b53-8693-9d3b05911b76", "address": "fa:16:3e:36:43:2f", "network": {"id": "ec0b6c7f-9518-401a-b9ba-5148c7d4432d", "bridge": "br-int", "label": "tempest-test-network--685655984", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": "10.100.0.33", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.44", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap091f9564-aa", "ovs_interfaceid": "091f9564-aabf-4b53-8693-9d3b05911b76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.340 2 DEBUG nova.network.os_vif_util [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.341 2 DEBUG os_vif [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap091f9564-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.348 2 INFO os_vif [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:43:2f,bridge_name='br-int',has_traffic_filtering=True,id=091f9564-aabf-4b53-8693-9d3b05911b76,network=Network(ec0b6c7f-9518-401a-b9ba-5148c7d4432d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap091f9564-aa')#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.349 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.349 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.353 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.354 2 DEBUG nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.355 2 INFO nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Deleting instance files /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a_del#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.356 2 INFO nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Deletion of /var/lib/nova/instances/e9519041-3cf1-40a3-8654-4ed813d2c48a_del complete#033[00m
Oct  8 12:20:49 np0005476733 nova_compute[192580]: 2025-10-08 16:20:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.895 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.896 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.896 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.897 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.897 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.897 2 WARNING nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.898 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.898 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.898 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.898 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.899 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.899 2 WARNING nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.899 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.899 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.900 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.900 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.900 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.900 2 WARNING nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.901 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.901 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.901 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.902 2 DEBUG oslo_concurrency.lockutils [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.902 2 DEBUG nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] No waiting events found dispatching network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:20:50 np0005476733 nova_compute[192580]: 2025-10-08 16:20:50.902 2 WARNING nova.compute.manager [req-ebf5fe04-3d0b-45dc-95bb-7ca8670c9b19 req-6cbbee8f-4a57-4176-ab9a-38720648c885 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Received unexpected event network-vif-plugged-091f9564-aabf-4b53-8693-9d3b05911b76 for instance with vm_state active and task_state migrating.#033[00m
Oct  8 12:20:52 np0005476733 systemd[1]: Stopping User Manager for UID 42436...
Oct  8 12:20:52 np0005476733 systemd[258582]: Activating special unit Exit the Session...
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped target Main User Target.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped target Basic System.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped target Paths.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped target Sockets.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped target Timers.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 12:20:52 np0005476733 systemd[258582]: Closed D-Bus User Message Bus Socket.
Oct  8 12:20:52 np0005476733 systemd[258582]: Stopped Create User's Volatile Files and Directories.
Oct  8 12:20:52 np0005476733 systemd[258582]: Removed slice User Application Slice.
Oct  8 12:20:52 np0005476733 systemd[258582]: Reached target Shutdown.
Oct  8 12:20:52 np0005476733 systemd[258582]: Finished Exit the Session.
Oct  8 12:20:52 np0005476733 systemd[258582]: Reached target Exit the Session.
Oct  8 12:20:52 np0005476733 systemd[1]: user@42436.service: Deactivated successfully.
Oct  8 12:20:52 np0005476733 systemd[1]: Stopped User Manager for UID 42436.
Oct  8 12:20:52 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct  8 12:20:52 np0005476733 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct  8 12:20:52 np0005476733 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct  8 12:20:52 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct  8 12:20:52 np0005476733 systemd[1]: Removed slice User Slice of UID 42436.
Oct  8 12:20:53 np0005476733 nova_compute[192580]: 2025-10-08 16:20:53.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:20:54.101 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.657 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.658 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.658 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "e9519041-3cf1-40a3-8654-4ed813d2c48a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.682 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.683 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.683 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.684 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.875 2 WARNING nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.876 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13657MB free_disk=111.31521987915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.877 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.877 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.919 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Migration for instance e9519041-3cf1-40a3-8654-4ed813d2c48a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.945 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.972 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Migration 6788ce4d-736a-4796-98a9-e0a5ffedb48a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.972 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:20:54 np0005476733 nova_compute[192580]: 2025-10-08 16:20:54.972 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.007 2 DEBUG nova.compute.provider_tree [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.033 2 DEBUG nova.scheduler.client.report [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.067 2 DEBUG nova.compute.resource_tracker [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.068 2 DEBUG oslo_concurrency.lockutils [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.072 2 INFO nova.compute.manager [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.174 2 INFO nova.scheduler.client.report [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Deleted allocation for migration 6788ce4d-736a-4796-98a9-e0a5ffedb48a#033[00m
Oct  8 12:20:55 np0005476733 nova_compute[192580]: 2025-10-08 16:20:55.174 2 DEBUG nova.virt.libvirt.driver [None req-f41cbdb2-2a69-4955-8ad5-22157e5fc77b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Oct  8 12:20:56 np0005476733 podman[258784]: 2025-10-08 16:20:56.257232081 +0000 UTC m=+0.070127728 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS)
Oct  8 12:20:56 np0005476733 podman[258785]: 2025-10-08 16:20:56.280147068 +0000 UTC m=+0.083756540 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:20:56 np0005476733 systemd-logind[827]: New session 118 of user zuul.
Oct  8 12:20:56 np0005476733 systemd[1]: Started Session 118 of User zuul.
Oct  8 12:20:56 np0005476733 systemd-logind[827]: New session 119 of user zuul.
Oct  8 12:20:56 np0005476733 systemd[1]: Started Session 119 of User zuul.
Oct  8 12:20:56 np0005476733 systemd[1]: session-119.scope: Deactivated successfully.
Oct  8 12:20:56 np0005476733 systemd-logind[827]: Session 119 logged out. Waiting for processes to exit.
Oct  8 12:20:56 np0005476733 systemd-logind[827]: Removed session 119.
Oct  8 12:20:58 np0005476733 nova_compute[192580]: 2025-10-08 16:20:58.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:20:59 np0005476733 nova_compute[192580]: 2025-10-08 16:20:59.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:03 np0005476733 nova_compute[192580]: 2025-10-08 16:21:03.465 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940448.4635952, e9519041-3cf1-40a3-8654-4ed813d2c48a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:21:03 np0005476733 nova_compute[192580]: 2025-10-08 16:21:03.465 2 INFO nova.compute.manager [-] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:21:03 np0005476733 nova_compute[192580]: 2025-10-08 16:21:03.486 2 DEBUG nova.compute.manager [None req-c27b450d-573c-40a7-a61c-f28a3423c0c8 - - - - - -] [instance: e9519041-3cf1-40a3-8654-4ed813d2c48a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:21:03 np0005476733 nova_compute[192580]: 2025-10-08 16:21:03.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:04 np0005476733 nova_compute[192580]: 2025-10-08 16:21:04.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:05 np0005476733 systemd-logind[827]: New session 120 of user zuul.
Oct  8 12:21:05 np0005476733 systemd[1]: Started Session 120 of User zuul.
Oct  8 12:21:05 np0005476733 podman[258893]: 2025-10-08 16:21:05.733302033 +0000 UTC m=+0.051706182 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:21:07 np0005476733 nova_compute[192580]: 2025-10-08 16:21:07.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:07 np0005476733 nova_compute[192580]: 2025-10-08 16:21:07.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:21:08 np0005476733 nova_compute[192580]: 2025-10-08 16:21:08.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:09 np0005476733 nova_compute[192580]: 2025-10-08 16:21:09.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:09 np0005476733 nova_compute[192580]: 2025-10-08 16:21:09.972 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Creating tmpfile /var/lib/nova/instances/tmp46e7vutd to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Oct  8 12:21:09 np0005476733 nova_compute[192580]: 2025-10-08 16:21:09.973 2 DEBUG nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp46e7vutd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  8 12:21:10 np0005476733 podman[258943]: 2025-10-08 16:21:10.26046265 +0000 UTC m=+0.077617085 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:21:10 np0005476733 podman[258942]: 2025-10-08 16:21:10.304054843 +0000 UTC m=+0.134113638 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:21:11 np0005476733 nova_compute[192580]: 2025-10-08 16:21:11.657 2 DEBUG nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp46e7vutd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3559b47f-102c-43cf-a800-ac09d66e2264',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Oct  8 12:21:11 np0005476733 nova_compute[192580]: 2025-10-08 16:21:11.703 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:21:11 np0005476733 nova_compute[192580]: 2025-10-08 16:21:11.703 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:21:11 np0005476733 nova_compute[192580]: 2025-10-08 16:21:11.704 2 DEBUG nova.network.neutron [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.091 2 DEBUG nova.network.neutron [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updating instance_info_cache with network_info: [{"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.114 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.116 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp46e7vutd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3559b47f-102c-43cf-a800-ac09d66e2264',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.116 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Creating instance directory: /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.117 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Creating disk.info with the contents: {'/var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk': 'qcow2', '/var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.117 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.118 2 DEBUG nova.objects.instance [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3559b47f-102c-43cf-a800-ac09d66e2264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.142 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.196 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.198 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.198 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.208 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.266 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.267 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.304 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.306 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.306 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.360 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.363 2 DEBUG nova.virt.disk.api [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Checking if we can resize image /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.364 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.427 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.428 2 DEBUG nova.virt.disk.api [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Cannot resize image /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.428 2 DEBUG nova.objects.instance [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid 3559b47f-102c-43cf-a800-ac09d66e2264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.451 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.479 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.481 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config to /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.481 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.976 2 DEBUG oslo_concurrency.processutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk.config /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.977 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.978 2 DEBUG nova.virt.libvirt.vif [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-42452491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-test-42452491',id=88,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-3h645p17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='u
sb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:19:59Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=3559b47f-102c-43cf-a800-ac09d66e2264,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.978 2 DEBUG nova.network.os_vif_util [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.979 2 DEBUG nova.network.os_vif_util [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.979 2 DEBUG os_vif [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.981 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.984 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20af3098-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:13 np0005476733 nova_compute[192580]: 2025-10-08 16:21:13.984 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20af3098-83, col_values=(('external_ids', {'iface-id': '20af3098-8387-4af6-82ab-5fb07b335ea0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:7c:88', 'vm-uuid': '3559b47f-102c-43cf-a800-ac09d66e2264'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:14 np0005476733 NetworkManager[51699]: <info>  [1759940474.0136] manager: (tap20af3098-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.019 2 INFO os_vif [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83')#033[00m
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.019 2 DEBUG nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  8 12:21:14 np0005476733 nova_compute[192580]: 2025-10-08 16:21:14.020 2 DEBUG nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp46e7vutd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3559b47f-102c-43cf-a800-ac09d66e2264',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  8 12:21:15 np0005476733 nova_compute[192580]: 2025-10-08 16:21:15.732 2 DEBUG nova.network.neutron [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Port 20af3098-8387-4af6-82ab-5fb07b335ea0 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  8 12:21:15 np0005476733 nova_compute[192580]: 2025-10-08 16:21:15.735 2 DEBUG nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=113664,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp46e7vutd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3559b47f-102c-43cf-a800-ac09d66e2264',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  8 12:21:15 np0005476733 systemd[1]: Starting libvirt proxy daemon...
Oct  8 12:21:15 np0005476733 systemd[1]: Started libvirt proxy daemon.
Oct  8 12:21:16 np0005476733 NetworkManager[51699]: <info>  [1759940476.1564] manager: (tap20af3098-83): new Tun device (/org/freedesktop/NetworkManager/Devices/276)
Oct  8 12:21:16 np0005476733 kernel: tap20af3098-83: entered promiscuous mode
Oct  8 12:21:16 np0005476733 nova_compute[192580]: 2025-10-08 16:21:16.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:16Z|00853|binding|INFO|Claiming lport 20af3098-8387-4af6-82ab-5fb07b335ea0 for this additional chassis.
Oct  8 12:21:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:16Z|00854|binding|INFO|20af3098-8387-4af6-82ab-5fb07b335ea0: Claiming fa:16:3e:74:7c:88 10.100.0.53
Oct  8 12:21:16 np0005476733 nova_compute[192580]: 2025-10-08 16:21:16.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:16 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:16Z|00855|binding|INFO|Setting lport 20af3098-8387-4af6-82ab-5fb07b335ea0 ovn-installed in OVS
Oct  8 12:21:16 np0005476733 nova_compute[192580]: 2025-10-08 16:21:16.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:16 np0005476733 systemd-udevd[259041]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:21:16 np0005476733 systemd-machined[152624]: New machine qemu-55-instance-00000058.
Oct  8 12:21:16 np0005476733 NetworkManager[51699]: <info>  [1759940476.2236] device (tap20af3098-83): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:21:16 np0005476733 NetworkManager[51699]: <info>  [1759940476.2242] device (tap20af3098-83): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:21:16 np0005476733 systemd[1]: Started Virtual Machine qemu-55-instance-00000058.
Oct  8 12:21:16 np0005476733 nova_compute[192580]: 2025-10-08 16:21:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:17 np0005476733 podman[259059]: 2025-10-08 16:21:17.269241727 +0000 UTC m=+0.084102810 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:21:17 np0005476733 podman[259058]: 2025-10-08 16:21:17.271578382 +0000 UTC m=+0.086797286 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:21:17 np0005476733 podman[259060]: 2025-10-08 16:21:17.29010296 +0000 UTC m=+0.094975996 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:21:17 np0005476733 nova_compute[192580]: 2025-10-08 16:21:17.981 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940477.9805324, 3559b47f-102c-43cf-a800-ac09d66e2264 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:21:17 np0005476733 nova_compute[192580]: 2025-10-08 16:21:17.983 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] VM Started (Lifecycle Event)#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.008 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.848 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940478.8486605, 3559b47f-102c-43cf-a800-ac09d66e2264 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.849 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.870 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.874 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:21:18 np0005476733 nova_compute[192580]: 2025-10-08 16:21:18.901 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  8 12:21:19 np0005476733 nova_compute[192580]: 2025-10-08 16:21:19.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:21Z|00856|binding|INFO|Claiming lport 20af3098-8387-4af6-82ab-5fb07b335ea0 for this chassis.
Oct  8 12:21:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:21Z|00857|binding|INFO|20af3098-8387-4af6-82ab-5fb07b335ea0: Claiming fa:16:3e:74:7c:88 10.100.0.53
Oct  8 12:21:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:21Z|00858|pinctrl|WARN|Dropped 599 log messages in last 56 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:21:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:21Z|00859|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:21:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:21Z|00860|binding|INFO|Setting lport 20af3098-8387-4af6-82ab-5fb07b335ea0 up in Southbound
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.721 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:7c:88 10.100.0.53'], port_security=['fa:16:3e:74:7c:88 10.100.0.53'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.53/28', 'neutron:device_id': '3559b47f-102c-43cf-a800-ac09d66e2264', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0433a72056854da48c168f13bcf53e59', 'neutron:revision_number': '11', 'neutron:security_group_ids': '6694f32d-684d-42bf-9b26-d5a5c55c3f82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7b039a4-bbf9-4652-a880-db4a62539b1e, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=20af3098-8387-4af6-82ab-5fb07b335ea0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.724 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 20af3098-8387-4af6-82ab-5fb07b335ea0 in datapath 4f66031b-46d6-4cd4-8153-64f96d4967ce bound to our chassis#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.727 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f66031b-46d6-4cd4-8153-64f96d4967ce#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.748 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[12548517-39e1-4e52-ac04-922ad353c788]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.749 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f66031b-41 in ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.752 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f66031b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.752 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c4382f05-cf20-4cdd-90c3-e9ab9e022ff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.753 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[898a752d-9e3c-48b8-965f-d9a70202b065]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.767 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[8bccbbc8-45f0-4691-9ebd-80e9b3112e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.784 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[49f3f814-4a65-4d1d-9a93-b5a462adf99f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.827 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[da487425-cf61-43a0-8ba6-9a51c4059799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 NetworkManager[51699]: <info>  [1759940481.8381] manager: (tap4f66031b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/277)
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.837 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d6db4436-4a10-4be5-b5de-7e5aa8d2ac57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 systemd-udevd[259142]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.890 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[22484064-845d-4c28-93c2-a3303b18a2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.893 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[25694968-3ef3-40bb-a20e-f21e09f45a26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 NetworkManager[51699]: <info>  [1759940481.9182] device (tap4f66031b-40): carrier: link connected
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.931 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e05530a1-eee7-4eb4-986f-82dda8904f17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.953 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0a93c9e1-71d3-431e-9ffc-af99618ac244]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f66031b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:da:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738558, 'reachable_time': 24342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259161, 'error': None, 'target': 'ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:21.980 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4db51de8-a570-4bf2-8686-2c6bffe7be81]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe31:da1c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 738558, 'tstamp': 738558}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259162, 'error': None, 'target': 'ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.001 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[73d9180a-47ad-4afc-b4f6-1ab3e50bf148]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f66031b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:31:da:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738558, 'reachable_time': 24342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259163, 'error': None, 'target': 'ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.033 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[774dd506-0826-4d4c-b11f-611991accb85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.110 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c52da469-9afb-4edc-8ae6-012b38f49c7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.112 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f66031b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.112 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.112 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f66031b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:22 np0005476733 NetworkManager[51699]: <info>  [1759940482.1147] manager: (tap4f66031b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Oct  8 12:21:22 np0005476733 kernel: tap4f66031b-40: entered promiscuous mode
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.118 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f66031b-40, col_values=(('external_ids', {'iface-id': 'f63630ad-2d15-4cc6-822f-932b82801b85'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:22Z|00861|binding|INFO|Releasing lport f63630ad-2d15-4cc6-822f-932b82801b85 from this chassis (sb_readonly=0)
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.122 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f66031b-46d6-4cd4-8153-64f96d4967ce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f66031b-46d6-4cd4-8153-64f96d4967ce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.124 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8159f967-2387-4318-a102-bdb2c33189ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.125 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-4f66031b-46d6-4cd4-8153-64f96d4967ce
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/4f66031b-46d6-4cd4-8153-64f96d4967ce.pid.haproxy
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 4f66031b-46d6-4cd4-8153-64f96d4967ce
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:21:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:22.126 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'env', 'PROCESS_TAG=haproxy-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f66031b-46d6-4cd4-8153-64f96d4967ce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:22 np0005476733 podman[259196]: 2025-10-08 16:21:22.468180241 +0000 UTC m=+0.043372588 container create a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:21:22 np0005476733 systemd[1]: Started libpod-conmon-a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12.scope.
Oct  8 12:21:22 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:21:22 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae66a7a9d87eb75b563c1e6281c6a365d37fc7af46700025309edf4a1a09ed03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:21:22 np0005476733 podman[259196]: 2025-10-08 16:21:22.53022629 +0000 UTC m=+0.105418647 container init a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:21:22 np0005476733 podman[259196]: 2025-10-08 16:21:22.535375214 +0000 UTC m=+0.110567551 container start a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:21:22 np0005476733 podman[259196]: 2025-10-08 16:21:22.444858351 +0000 UTC m=+0.020050708 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:21:22 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [NOTICE]   (259215) : New worker (259217) forked
Oct  8 12:21:22 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [NOTICE]   (259215) : Loading success.
Oct  8 12:21:22 np0005476733 nova_compute[192580]: 2025-10-08 16:21:22.853 2 INFO nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Post operation of migration started#033[00m
Oct  8 12:21:23 np0005476733 nova_compute[192580]: 2025-10-08 16:21:23.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:24 np0005476733 nova_compute[192580]: 2025-10-08 16:21:24.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:24 np0005476733 nova_compute[192580]: 2025-10-08 16:21:24.677 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:21:24 np0005476733 nova_compute[192580]: 2025-10-08 16:21:24.678 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:21:24 np0005476733 nova_compute[192580]: 2025-10-08 16:21:24.678 2 DEBUG nova.network.neutron [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:26.378 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:26.379 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:21:26.379 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:26 np0005476733 nova_compute[192580]: 2025-10-08 16:21:26.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:27 np0005476733 podman[259226]: 2025-10-08 16:21:27.228247242 +0000 UTC m=+0.056413072 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 12:21:27 np0005476733 podman[259227]: 2025-10-08 16:21:27.236237935 +0000 UTC m=+0.057877998 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.680 2 DEBUG nova.network.neutron [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updating instance_info_cache with network_info: [{"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.710 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.724 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.724 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.725 2 DEBUG oslo_concurrency.lockutils [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:28 np0005476733 nova_compute[192580]: 2025-10-08 16:21:28.728 2 INFO nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  8 12:21:28 np0005476733 virtqemud[192152]: Domain id=55 name='instance-00000058' uuid=3559b47f-102c-43cf-a800-ac09d66e2264 is tainted: custom-monitor
Oct  8 12:21:29 np0005476733 nova_compute[192580]: 2025-10-08 16:21:29.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:29 np0005476733 nova_compute[192580]: 2025-10-08 16:21:29.734 2 INFO nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  8 12:21:30 np0005476733 nova_compute[192580]: 2025-10-08 16:21:30.740 2 INFO nova.virt.libvirt.driver [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  8 12:21:30 np0005476733 nova_compute[192580]: 2025-10-08 16:21:30.744 2 DEBUG nova.compute.manager [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:21:30 np0005476733 nova_compute[192580]: 2025-10-08 16:21:30.776 2 DEBUG nova.objects.instance [None req-bdbbc7e7-bf10-4121-b462-17fe08967c2b 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  8 12:21:32 np0005476733 nova_compute[192580]: 2025-10-08 16:21:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:32 np0005476733 nova_compute[192580]: 2025-10-08 16:21:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:33 np0005476733 systemd-logind[827]: New session 121 of user zuul.
Oct  8 12:21:33 np0005476733 systemd[1]: Started Session 121 of User zuul.
Oct  8 12:21:33 np0005476733 nova_compute[192580]: 2025-10-08 16:21:33.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:33 np0005476733 systemd-logind[827]: New session 122 of user zuul.
Oct  8 12:21:33 np0005476733 systemd[1]: Started Session 122 of User zuul.
Oct  8 12:21:33 np0005476733 systemd[1]: session-122.scope: Deactivated successfully.
Oct  8 12:21:33 np0005476733 systemd-logind[827]: Session 122 logged out. Waiting for processes to exit.
Oct  8 12:21:33 np0005476733 systemd-logind[827]: Removed session 122.
Oct  8 12:21:34 np0005476733 nova_compute[192580]: 2025-10-08 16:21:34.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:34 np0005476733 nova_compute[192580]: 2025-10-08 16:21:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:34 np0005476733 nova_compute[192580]: 2025-10-08 16:21:34.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:21:34 np0005476733 nova_compute[192580]: 2025-10-08 16:21:34.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:21:35 np0005476733 nova_compute[192580]: 2025-10-08 16:21:35.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:35 np0005476733 nova_compute[192580]: 2025-10-08 16:21:35.611 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:35 np0005476733 nova_compute[192580]: 2025-10-08 16:21:35.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:21:36 np0005476733 podman[259335]: 2025-10-08 16:21:36.249282757 +0000 UTC m=+0.068284649 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Oct  8 12:21:38 np0005476733 nova_compute[192580]: 2025-10-08 16:21:38.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:39 np0005476733 nova_compute[192580]: 2025-10-08 16:21:39.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:40 np0005476733 nova_compute[192580]: 2025-10-08 16:21:40.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:40 np0005476733 nova_compute[192580]: 2025-10-08 16:21:40.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:21:40 np0005476733 nova_compute[192580]: 2025-10-08 16:21:40.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:21:41 np0005476733 podman[259359]: 2025-10-08 16:21:41.28595454 +0000 UTC m=+0.096205336 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:21:41 np0005476733 podman[259358]: 2025-10-08 16:21:41.331388851 +0000 UTC m=+0.147329217 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:21:41 np0005476733 nova_compute[192580]: 2025-10-08 16:21:41.662 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:21:41 np0005476733 nova_compute[192580]: 2025-10-08 16:21:41.663 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:21:41 np0005476733 nova_compute[192580]: 2025-10-08 16:21:41.663 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:21:41 np0005476733 nova_compute[192580]: 2025-10-08 16:21:41.664 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3559b47f-102c-43cf-a800-ac09d66e2264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:21:42 np0005476733 systemd-logind[827]: New session 123 of user zuul.
Oct  8 12:21:42 np0005476733 systemd[1]: Started Session 123 of User zuul.
Oct  8 12:21:43 np0005476733 nova_compute[192580]: 2025-10-08 16:21:43.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:44 np0005476733 nova_compute[192580]: 2025-10-08 16:21:44.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:44 np0005476733 systemd-logind[827]: New session 124 of user zuul.
Oct  8 12:21:44 np0005476733 systemd[1]: Started Session 124 of User zuul.
Oct  8 12:21:44 np0005476733 systemd-logind[827]: New session 125 of user zuul.
Oct  8 12:21:44 np0005476733 systemd[1]: Started Session 125 of User zuul.
Oct  8 12:21:44 np0005476733 systemd[1]: session-125.scope: Deactivated successfully.
Oct  8 12:21:44 np0005476733 systemd-logind[827]: Session 125 logged out. Waiting for processes to exit.
Oct  8 12:21:44 np0005476733 systemd-logind[827]: Removed session 125.
Oct  8 12:21:44 np0005476733 nova_compute[192580]: 2025-10-08 16:21:44.988 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updating instance_info_cache with network_info: [{"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:21:45 np0005476733 nova_compute[192580]: 2025-10-08 16:21:45.007 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:21:45 np0005476733 nova_compute[192580]: 2025-10-08 16:21:45.007 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:21:45 np0005476733 nova_compute[192580]: 2025-10-08 16:21:45.008 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.701 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.764 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.765 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.826 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.978 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.979 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13485MB free_disk=111.28634262084961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.980 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:46 np0005476733 nova_compute[192580]: 2025-10-08 16:21:46.980 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.067 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3559b47f-102c-43cf-a800-ac09d66e2264 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.068 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.068 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.085 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.108 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.109 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.124 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.149 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.194 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.209 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.234 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:21:47 np0005476733 nova_compute[192580]: 2025-10-08 16:21:47.235 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:47 np0005476733 podman[259506]: 2025-10-08 16:21:47.615403382 +0000 UTC m=+0.065980386 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:21:47 np0005476733 podman[259505]: 2025-10-08 16:21:47.639045112 +0000 UTC m=+0.085158694 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:21:47 np0005476733 podman[259504]: 2025-10-08 16:21:47.653706628 +0000 UTC m=+0.098315413 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:21:48 np0005476733 nova_compute[192580]: 2025-10-08 16:21:48.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:49 np0005476733 nova_compute[192580]: 2025-10-08 16:21:49.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:51 np0005476733 nova_compute[192580]: 2025-10-08 16:21:51.236 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:52 np0005476733 ovn_controller[94857]: 2025-10-08T16:21:52Z|00862|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 12:21:53 np0005476733 nova_compute[192580]: 2025-10-08 16:21:53.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:53 np0005476733 systemd-logind[827]: New session 126 of user zuul.
Oct  8 12:21:53 np0005476733 systemd[1]: Started Session 126 of User zuul.
Oct  8 12:21:54 np0005476733 nova_compute[192580]: 2025-10-08 16:21:54.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:56 np0005476733 nova_compute[192580]: 2025-10-08 16:21:56.023 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:21:56 np0005476733 nova_compute[192580]: 2025-10-08 16:21:56.049 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 3559b47f-102c-43cf-a800-ac09d66e2264 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 12:21:56 np0005476733 nova_compute[192580]: 2025-10-08 16:21:56.050 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "3559b47f-102c-43cf-a800-ac09d66e2264" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:21:56 np0005476733 nova_compute[192580]: 2025-10-08 16:21:56.050 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "3559b47f-102c-43cf-a800-ac09d66e2264" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:21:56 np0005476733 nova_compute[192580]: 2025-10-08 16:21:56.082 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "3559b47f-102c-43cf-a800-ac09d66e2264" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:21:58 np0005476733 podman[259606]: 2025-10-08 16:21:58.255399832 +0000 UTC m=+0.075934631 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:21:58 np0005476733 podman[259605]: 2025-10-08 16:21:58.263149228 +0000 UTC m=+0.080466705 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:21:58 np0005476733 nova_compute[192580]: 2025-10-08 16:21:58.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:59 np0005476733 systemd-logind[827]: New session 127 of user zuul.
Oct  8 12:21:59 np0005476733 systemd[1]: Started Session 127 of User zuul.
Oct  8 12:21:59 np0005476733 nova_compute[192580]: 2025-10-08 16:21:59.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:21:59 np0005476733 systemd-logind[827]: Session 127 logged out. Waiting for processes to exit.
Oct  8 12:21:59 np0005476733 systemd[1]: session-127.scope: Deactivated successfully.
Oct  8 12:21:59 np0005476733 systemd-logind[827]: Removed session 127.
Oct  8 12:22:03 np0005476733 nova_compute[192580]: 2025-10-08 16:22:03.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:04 np0005476733 nova_compute[192580]: 2025-10-08 16:22:04.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:06 np0005476733 systemd-logind[827]: New session 128 of user zuul.
Oct  8 12:22:06 np0005476733 systemd[1]: Started Session 128 of User zuul.
Oct  8 12:22:06 np0005476733 podman[259684]: 2025-10-08 16:22:06.489722468 +0000 UTC m=+0.073273508 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Oct  8 12:22:06 np0005476733 systemd[1]: session-128.scope: Deactivated successfully.
Oct  8 12:22:06 np0005476733 systemd-logind[827]: Session 128 logged out. Waiting for processes to exit.
Oct  8 12:22:06 np0005476733 systemd-logind[827]: Removed session 128.
Oct  8 12:22:08 np0005476733 nova_compute[192580]: 2025-10-08 16:22:08.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:09 np0005476733 nova_compute[192580]: 2025-10-08 16:22:09.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:12 np0005476733 podman[259732]: 2025-10-08 16:22:12.248945937 +0000 UTC m=+0.074654522 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:22:12 np0005476733 podman[259731]: 2025-10-08 16:22:12.266673999 +0000 UTC m=+0.095303517 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:22:13 np0005476733 nova_compute[192580]: 2025-10-08 16:22:13.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:13 np0005476733 systemd-logind[827]: New session 129 of user zuul.
Oct  8 12:22:13 np0005476733 systemd[1]: Started Session 129 of User zuul.
Oct  8 12:22:13 np0005476733 systemd[1]: session-129.scope: Deactivated successfully.
Oct  8 12:22:13 np0005476733 systemd-logind[827]: Session 129 logged out. Waiting for processes to exit.
Oct  8 12:22:13 np0005476733 systemd-logind[827]: Removed session 129.
Oct  8 12:22:14 np0005476733 nova_compute[192580]: 2025-10-08 16:22:14.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:18 np0005476733 podman[259804]: 2025-10-08 16:22:18.257059099 +0000 UTC m=+0.079847817 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:22:18 np0005476733 podman[259805]: 2025-10-08 16:22:18.258721941 +0000 UTC m=+0.081711985 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:22:18 np0005476733 podman[259806]: 2025-10-08 16:22:18.287231606 +0000 UTC m=+0.096083251 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, version=9.6)
Oct  8 12:22:18 np0005476733 nova_compute[192580]: 2025-10-08 16:22:18.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:19 np0005476733 nova_compute[192580]: 2025-10-08 16:22:19.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:21 np0005476733 systemd-logind[827]: New session 130 of user zuul.
Oct  8 12:22:21 np0005476733 systemd[1]: Started Session 130 of User zuul.
Oct  8 12:22:21 np0005476733 systemd[1]: session-130.scope: Deactivated successfully.
Oct  8 12:22:21 np0005476733 systemd-logind[827]: Session 130 logged out. Waiting for processes to exit.
Oct  8 12:22:21 np0005476733 systemd-logind[827]: Removed session 130.
Oct  8 12:22:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:22:21Z|00863|pinctrl|WARN|Dropped 179 log messages in last 60 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 12:22:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:22:21Z|00864|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:22:23 np0005476733 nova_compute[192580]: 2025-10-08 16:22:23.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:24 np0005476733 nova_compute[192580]: 2025-10-08 16:22:24.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:26.379 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:26.380 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:26.382 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:26 np0005476733 nova_compute[192580]: 2025-10-08 16:22:26.616 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:28 np0005476733 systemd-logind[827]: New session 131 of user zuul.
Oct  8 12:22:28 np0005476733 systemd[1]: Started Session 131 of User zuul.
Oct  8 12:22:28 np0005476733 podman[259902]: 2025-10-08 16:22:28.547231977 +0000 UTC m=+0.066319966 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  8 12:22:28 np0005476733 podman[259904]: 2025-10-08 16:22:28.565847908 +0000 UTC m=+0.094662416 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:22:28 np0005476733 nova_compute[192580]: 2025-10-08 16:22:28.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:28 np0005476733 systemd[1]: session-131.scope: Deactivated successfully.
Oct  8 12:22:28 np0005476733 systemd-logind[827]: Session 131 logged out. Waiting for processes to exit.
Oct  8 12:22:28 np0005476733 systemd-logind[827]: Removed session 131.
Oct  8 12:22:29 np0005476733 nova_compute[192580]: 2025-10-08 16:22:29.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:31.997 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:22:31 np0005476733 nova_compute[192580]: 2025-10-08 16:22:31.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:31.998 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:22:31 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:31.999 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.299 2 DEBUG nova.compute.manager [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received event network-changed-20af3098-8387-4af6-82ab-5fb07b335ea0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.299 2 DEBUG nova.compute.manager [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Refreshing instance network info cache due to event network-changed-20af3098-8387-4af6-82ab-5fb07b335ea0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.300 2 DEBUG oslo_concurrency.lockutils [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.300 2 DEBUG oslo_concurrency.lockutils [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.301 2 DEBUG nova.network.neutron [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Refreshing network info cache for port 20af3098-8387-4af6-82ab-5fb07b335ea0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.346 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "3559b47f-102c-43cf-a800-ac09d66e2264" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.347 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.347 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.348 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.348 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.349 2 INFO nova.compute.manager [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Terminating instance#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.350 2 DEBUG nova.compute.manager [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:22:32 np0005476733 kernel: tap20af3098-83 (unregistering): left promiscuous mode
Oct  8 12:22:32 np0005476733 NetworkManager[51699]: <info>  [1759940552.3784] device (tap20af3098-83): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 ovn_controller[94857]: 2025-10-08T16:22:32Z|00865|binding|INFO|Releasing lport 20af3098-8387-4af6-82ab-5fb07b335ea0 from this chassis (sb_readonly=0)
Oct  8 12:22:32 np0005476733 ovn_controller[94857]: 2025-10-08T16:22:32Z|00866|binding|INFO|Setting lport 20af3098-8387-4af6-82ab-5fb07b335ea0 down in Southbound
Oct  8 12:22:32 np0005476733 ovn_controller[94857]: 2025-10-08T16:22:32Z|00867|binding|INFO|Removing iface tap20af3098-83 ovn-installed in OVS
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.398 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:7c:88 10.100.0.53'], port_security=['fa:16:3e:74:7c:88 10.100.0.53'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.53/28', 'neutron:device_id': '3559b47f-102c-43cf-a800-ac09d66e2264', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0433a72056854da48c168f13bcf53e59', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6694f32d-684d-42bf-9b26-d5a5c55c3f82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7b039a4-bbf9-4652-a880-db4a62539b1e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=20af3098-8387-4af6-82ab-5fb07b335ea0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.399 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 20af3098-8387-4af6-82ab-5fb07b335ea0 in datapath 4f66031b-46d6-4cd4-8153-64f96d4967ce unbound from our chassis#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.401 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f66031b-46d6-4cd4-8153-64f96d4967ce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.404 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[39e197e8-5868-48e8-80a3-d5c8026ceec1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.405 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce namespace which is not needed anymore#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000058.scope: Deactivated successfully.
Oct  8 12:22:32 np0005476733 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000058.scope: Consumed 5.485s CPU time.
Oct  8 12:22:32 np0005476733 systemd-machined[152624]: Machine qemu-55-instance-00000058 terminated.
Oct  8 12:22:32 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [NOTICE]   (259215) : haproxy version is 2.8.14-c23fe91
Oct  8 12:22:32 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [NOTICE]   (259215) : path to executable is /usr/sbin/haproxy
Oct  8 12:22:32 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [WARNING]  (259215) : Exiting Master process...
Oct  8 12:22:32 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [ALERT]    (259215) : Current worker (259217) exited with code 143 (Terminated)
Oct  8 12:22:32 np0005476733 neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce[259211]: [WARNING]  (259215) : All workers exited. Exiting... (0)
Oct  8 12:22:32 np0005476733 systemd[1]: libpod-a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12.scope: Deactivated successfully.
Oct  8 12:22:32 np0005476733 podman[259998]: 2025-10-08 16:22:32.544000057 +0000 UTC m=+0.049374029 container died a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:22:32 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12-userdata-shm.mount: Deactivated successfully.
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:32 np0005476733 systemd[1]: var-lib-containers-storage-overlay-ae66a7a9d87eb75b563c1e6281c6a365d37fc7af46700025309edf4a1a09ed03-merged.mount: Deactivated successfully.
Oct  8 12:22:32 np0005476733 podman[259998]: 2025-10-08 16:22:32.591074052 +0000 UTC m=+0.096448044 container cleanup a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 12:22:32 np0005476733 systemd[1]: libpod-conmon-a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12.scope: Deactivated successfully.
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.639 2 INFO nova.virt.libvirt.driver [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Instance destroyed successfully.#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.639 2 DEBUG nova.objects.instance [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lazy-loading 'resources' on Instance uuid 3559b47f-102c-43cf-a800-ac09d66e2264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.655 2 DEBUG nova.virt.libvirt.vif [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-08T16:19:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-42452491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-42452491',id=88,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJXmR2ZdZ/cPq7lDjgd3tfkKNExE73Ly5Y9+bY6BcqlYpolw2Hd9zh5sY9Gw6YddISE8ZprV1f9n+5/viUjWJD3yCOyl/ugmh/k10MEE7fGoiA6Dg0qDXStOp75SJcpxTQ==',key_name='tempest-keypair-test-542762937',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:19:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0433a72056854da48c168f13bcf53e59',ramdisk_id='',reservation_id='r-3h645p17',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='
usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-1354773760',owner_user_name='tempest-OvnDvrTest-1354773760-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:21:30Z,user_data=None,user_id='2bdd69fe495b499fbadd2e2b8da36c6f',uuid=3559b47f-102c-43cf-a800-ac09d66e2264,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.656 2 DEBUG nova.network.os_vif_util [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converting VIF {"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.656 2 DEBUG nova.network.os_vif_util [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.656 2 DEBUG os_vif [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 podman[260039]: 2025-10-08 16:22:32.658486662 +0000 UTC m=+0.037097559 container remove a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.658 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20af3098-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.662 2 INFO os_vif [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:7c:88,bridge_name='br-int',has_traffic_filtering=True,id=20af3098-8387-4af6-82ab-5fb07b335ea0,network=Network(4f66031b-46d6-4cd4-8153-64f96d4967ce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20af3098-83')#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.663 2 INFO nova.virt.libvirt.driver [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Deleting instance files /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264_del#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.664 2 INFO nova.virt.libvirt.driver [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Deletion of /var/lib/nova/instances/3559b47f-102c-43cf-a800-ac09d66e2264_del complete#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.666 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[330cfb28-d111-4011-a37c-a91f2edcc393]: (4, ('Wed Oct  8 04:22:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce (a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12)\na4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12\nWed Oct  8 04:22:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce (a4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12)\na4ca18236502d0a90d6b97f4e1451d4dd49fd2d7b8691efdc41274b220d21b12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.667 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4dbdb0-e883-42cf-9274-30ae34f485ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.668 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f66031b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 kernel: tap4f66031b-40: left promiscuous mode
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.683 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d05af3e9-7c7f-434f-9a5f-83052d6889d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.706 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9a00408c-cc1c-4e9f-aa43-d1810b5c65ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.707 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9142e5f4-32b9-4d76-b357-3f047f1b2570]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.722 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[646cc97d-ce9c-41eb-aa97-6befe641ced8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 738548, 'reachable_time': 17145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260058, 'error': None, 'target': 'ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 systemd[1]: run-netns-ovnmeta\x2d4f66031b\x2d46d6\x2d4cd4\x2d8153\x2d64f96d4967ce.mount: Deactivated successfully.
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.725 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f66031b-46d6-4cd4-8153-64f96d4967ce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:22:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:22:32.725 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[651c37e6-4427-4f23-ace8-50e84c10affe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.731 2 INFO nova.compute.manager [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.732 2 DEBUG oslo.service.loopingcall [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.733 2 DEBUG nova.compute.manager [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.733 2 DEBUG nova.network.neutron [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.940 2 DEBUG nova.compute.manager [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received event network-vif-unplugged-20af3098-8387-4af6-82ab-5fb07b335ea0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.940 2 DEBUG oslo_concurrency.lockutils [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.941 2 DEBUG oslo_concurrency.lockutils [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.941 2 DEBUG oslo_concurrency.lockutils [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.941 2 DEBUG nova.compute.manager [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] No waiting events found dispatching network-vif-unplugged-20af3098-8387-4af6-82ab-5fb07b335ea0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:22:32 np0005476733 nova_compute[192580]: 2025-10-08 16:22:32.941 2 DEBUG nova.compute.manager [req-70463be3-ae07-4670-8909-a13c7889f8cc req-8d194611-b984-4c9f-b933-ccc2eb2d0f15 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received event network-vif-unplugged-20af3098-8387-4af6-82ab-5fb07b335ea0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.731 2 DEBUG nova.network.neutron [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.754 2 INFO nova.compute.manager [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Took 1.02 seconds to deallocate network for instance.#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.809 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.810 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.885 2 DEBUG nova.compute.provider_tree [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.909 2 DEBUG nova.scheduler.client.report [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.937 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:33 np0005476733 nova_compute[192580]: 2025-10-08 16:22:33.973 2 INFO nova.scheduler.client.report [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Deleted allocations for instance 3559b47f-102c-43cf-a800-ac09d66e2264#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.055 2 DEBUG oslo_concurrency.lockutils [None req-31fe8414-fde4-410b-9d31-6d37114c8356 2bdd69fe495b499fbadd2e2b8da36c6f 0433a72056854da48c168f13bcf53e59 - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.398 2 DEBUG nova.network.neutron [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updated VIF entry in instance network info cache for port 20af3098-8387-4af6-82ab-5fb07b335ea0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.399 2 DEBUG nova.network.neutron [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Updating instance_info_cache with network_info: [{"id": "20af3098-8387-4af6-82ab-5fb07b335ea0", "address": "fa:16:3e:74:7c:88", "network": {"id": "4f66031b-46d6-4cd4-8153-64f96d4967ce", "bridge": "br-int", "label": "tempest-test-network--1402782967", "subnets": [{"cidr": "10.100.0.48/28", "dns": [], "gateway": {"address": "10.100.0.49", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20af3098-83", "ovs_interfaceid": "20af3098-8387-4af6-82ab-5fb07b335ea0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.422 2 DEBUG oslo_concurrency.lockutils [req-30fe9439-bea6-4f93-b289-be4141f79272 req-011635fc-4c2a-4b25-ab2a-51517dcb9b65 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-3559b47f-102c-43cf-a800-ac09d66e2264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.472 2 DEBUG nova.compute.manager [req-ae92c273-07c8-4aaa-84ed-19459d1889b8 req-3f09c4a2-6b10-4e01-a8e6-5bf3a16809bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received event network-vif-deleted-20af3098-8387-4af6-82ab-5fb07b335ea0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:22:34 np0005476733 nova_compute[192580]: 2025-10-08 16:22:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.045 2 DEBUG nova.compute.manager [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received event network-vif-plugged-20af3098-8387-4af6-82ab-5fb07b335ea0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.045 2 DEBUG oslo_concurrency.lockutils [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.046 2 DEBUG oslo_concurrency.lockutils [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.046 2 DEBUG oslo_concurrency.lockutils [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3559b47f-102c-43cf-a800-ac09d66e2264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.046 2 DEBUG nova.compute.manager [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] No waiting events found dispatching network-vif-plugged-20af3098-8387-4af6-82ab-5fb07b335ea0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.047 2 WARNING nova.compute.manager [req-b6701b1e-fadb-4484-a940-6be0e03d0d50 req-b9bf45b2-a4f9-442f-ab50-d915de0802ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Received unexpected event network-vif-plugged-20af3098-8387-4af6-82ab-5fb07b335ea0 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:22:35 np0005476733 nova_compute[192580]: 2025-10-08 16:22:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.066 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:22:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:22:36 np0005476733 nova_compute[192580]: 2025-10-08 16:22:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:36 np0005476733 nova_compute[192580]: 2025-10-08 16:22:36.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:22:37 np0005476733 podman[260063]: 2025-10-08 16:22:37.208874907 +0000 UTC m=+0.069910171 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:22:37 np0005476733 nova_compute[192580]: 2025-10-08 16:22:37.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:38 np0005476733 nova_compute[192580]: 2025-10-08 16:22:38.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:39 np0005476733 systemd-logind[827]: New session 132 of user zuul.
Oct  8 12:22:39 np0005476733 systemd[1]: Started Session 132 of User zuul.
Oct  8 12:22:39 np0005476733 systemd[1]: session-132.scope: Deactivated successfully.
Oct  8 12:22:39 np0005476733 systemd-logind[827]: Session 132 logged out. Waiting for processes to exit.
Oct  8 12:22:39 np0005476733 systemd-logind[827]: Removed session 132.
Oct  8 12:22:42 np0005476733 nova_compute[192580]: 2025-10-08 16:22:42.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:42 np0005476733 nova_compute[192580]: 2025-10-08 16:22:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:22:42 np0005476733 nova_compute[192580]: 2025-10-08 16:22:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:22:42 np0005476733 nova_compute[192580]: 2025-10-08 16:22:42.606 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:22:42 np0005476733 nova_compute[192580]: 2025-10-08 16:22:42.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:43 np0005476733 podman[260115]: 2025-10-08 16:22:43.254986704 +0000 UTC m=+0.079231326 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:22:43 np0005476733 podman[260114]: 2025-10-08 16:22:43.317018473 +0000 UTC m=+0.141834143 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  8 12:22:43 np0005476733 nova_compute[192580]: 2025-10-08 16:22:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:43 np0005476733 nova_compute[192580]: 2025-10-08 16:22:43.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.755 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.755 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13651MB free_disk=111.31509780883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.756 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.756 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.876 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.877 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.948 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.964 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.991 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:22:46 np0005476733 nova_compute[192580]: 2025-10-08 16:22:46.991 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:22:47 np0005476733 nova_compute[192580]: 2025-10-08 16:22:47.639 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940552.637673, 3559b47f-102c-43cf-a800-ac09d66e2264 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:22:47 np0005476733 nova_compute[192580]: 2025-10-08 16:22:47.639 2 INFO nova.compute.manager [-] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:22:47 np0005476733 nova_compute[192580]: 2025-10-08 16:22:47.661 2 DEBUG nova.compute.manager [None req-5a6f3922-f584-4e3d-8da6-1c2278612608 - - - - - -] [instance: 3559b47f-102c-43cf-a800-ac09d66e2264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:22:47 np0005476733 nova_compute[192580]: 2025-10-08 16:22:47.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:48 np0005476733 nova_compute[192580]: 2025-10-08 16:22:48.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:49 np0005476733 podman[260163]: 2025-10-08 16:22:49.232260157 +0000 UTC m=+0.057475625 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:22:49 np0005476733 podman[260164]: 2025-10-08 16:22:49.254014098 +0000 UTC m=+0.075663943 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, config_id=edpm)
Oct  8 12:22:49 np0005476733 podman[260162]: 2025-10-08 16:22:49.254014408 +0000 UTC m=+0.081360234 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:22:50 np0005476733 nova_compute[192580]: 2025-10-08 16:22:50.984 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:51 np0005476733 nova_compute[192580]: 2025-10-08 16:22:51.008 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:22:52 np0005476733 nova_compute[192580]: 2025-10-08 16:22:52.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:52 np0005476733 nova_compute[192580]: 2025-10-08 16:22:52.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:53 np0005476733 nova_compute[192580]: 2025-10-08 16:22:53.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:57 np0005476733 nova_compute[192580]: 2025-10-08 16:22:57.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:58 np0005476733 nova_compute[192580]: 2025-10-08 16:22:58.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:22:59 np0005476733 podman[260222]: 2025-10-08 16:22:59.23022015 +0000 UTC m=+0.054508251 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:22:59 np0005476733 podman[260223]: 2025-10-08 16:22:59.263867598 +0000 UTC m=+0.074115503 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:23:02 np0005476733 nova_compute[192580]: 2025-10-08 16:23:02.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:03 np0005476733 nova_compute[192580]: 2025-10-08 16:23:03.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:04 np0005476733 nova_compute[192580]: 2025-10-08 16:23:04.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:07 np0005476733 nova_compute[192580]: 2025-10-08 16:23:07.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:08 np0005476733 podman[260270]: 2025-10-08 16:23:08.248725208 +0000 UTC m=+0.071342187 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 12:23:08 np0005476733 nova_compute[192580]: 2025-10-08 16:23:08.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:12 np0005476733 nova_compute[192580]: 2025-10-08 16:23:12.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:13 np0005476733 nova_compute[192580]: 2025-10-08 16:23:13.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:14 np0005476733 podman[260294]: 2025-10-08 16:23:14.235313505 +0000 UTC m=+0.060910495 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:23:14 np0005476733 podman[260293]: 2025-10-08 16:23:14.256853599 +0000 UTC m=+0.086648651 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:23:17 np0005476733 nova_compute[192580]: 2025-10-08 16:23:17.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:18 np0005476733 nova_compute[192580]: 2025-10-08 16:23:18.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:19 np0005476733 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  8 12:23:20 np0005476733 podman[260345]: 2025-10-08 16:23:20.071533799 +0000 UTC m=+0.079762464 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:23:20 np0005476733 podman[260346]: 2025-10-08 16:23:20.0779012 +0000 UTC m=+0.091595488 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:23:20 np0005476733 podman[260347]: 2025-10-08 16:23:20.096882164 +0000 UTC m=+0.096558727 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Oct  8 12:23:22 np0005476733 nova_compute[192580]: 2025-10-08 16:23:22.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:23 np0005476733 nova_compute[192580]: 2025-10-08 16:23:23.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:24Z|00868|pinctrl|WARN|Dropped 1013 log messages in last 62 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:23:24 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:24Z|00869|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:26.381 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:26.381 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:26.381 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:26 np0005476733 nova_compute[192580]: 2025-10-08 16:23:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:27 np0005476733 nova_compute[192580]: 2025-10-08 16:23:27.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.796 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.796 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.818 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.897 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.897 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.906 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:23:28 np0005476733 nova_compute[192580]: 2025-10-08 16:23:28.906 2 INFO nova.compute.claims [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.074 2 DEBUG nova.compute.provider_tree [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.092 2 DEBUG nova.scheduler.client.report [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.111 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.111 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.163 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.164 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.195 2 INFO nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.216 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.345 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.347 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.347 2 INFO nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Creating image(s)#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.348 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.348 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.349 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.373 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.433 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.435 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.436 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.465 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.544 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.545 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.583 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.584 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.585 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.660 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.662 2 DEBUG nova.virt.disk.api [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Checking if we can resize image /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.662 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.718 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.719 2 DEBUG nova.virt.disk.api [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Cannot resize image /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.719 2 DEBUG nova.objects.instance [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'migration_context' on Instance uuid abb95c64-245b-4e3a-bb95-02d86e179dbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.751 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.752 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Ensure instance console log exists: /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.752 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.752 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:29 np0005476733 nova_compute[192580]: 2025-10-08 16:23:29.753 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:29 np0005476733 podman[260433]: 2025-10-08 16:23:29.926906822 +0000 UTC m=+0.054936504 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:23:29 np0005476733 podman[260432]: 2025-10-08 16:23:29.944546363 +0000 UTC m=+0.076444799 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 12:23:30 np0005476733 nova_compute[192580]: 2025-10-08 16:23:30.308 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Successfully created port: c3626423-aef0-4457-a1b2-a1d3997f96d2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.365 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Successfully updated port: c3626423-aef0-4457-a1b2-a1d3997f96d2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.385 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.386 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquired lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.386 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.464 2 DEBUG nova.compute.manager [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.465 2 DEBUG nova.compute.manager [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing instance network info cache due to event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.465 2 DEBUG oslo_concurrency.lockutils [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:23:31 np0005476733 nova_compute[192580]: 2025-10-08 16:23:31.684 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.391 2 DEBUG nova.network.neutron [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.470 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Releasing lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.471 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Instance network_info: |[{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.472 2 DEBUG oslo_concurrency.lockutils [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.473 2 DEBUG nova.network.neutron [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.478 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Start _get_guest_xml network_info=[{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.483 2 WARNING nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.490 2 DEBUG nova.virt.libvirt.host [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.491 2 DEBUG nova.virt.libvirt.host [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.495 2 DEBUG nova.virt.libvirt.host [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.496 2 DEBUG nova.virt.libvirt.host [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.497 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.498 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.499 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.499 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.499 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.500 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.500 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.501 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.501 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.502 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.502 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.503 2 DEBUG nova.virt.hardware [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.510 2 DEBUG nova.virt.libvirt.vif [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:23:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-693331515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-693331515',id=90,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4EccSgdKz20vhzCTJVEKl1/Mu3zu8VUU9b5FpQnU5dG12k8nhnJmJgq+Ku3VVX1x/AvGHzkf+c9hXrKFuhBTSr0zWXBTiMXkars7IT0AKScQJV07dnYMXQ1AYcsSb/WA==',key_name='tempest-keypair-1447999901',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-rm4zxywv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:23:29Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=abb95c64-245b-4e3a-bb95-02d86e179dbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.510 2 DEBUG nova.network.os_vif_util [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.511 2 DEBUG nova.network.os_vif_util [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.513 2 DEBUG nova.objects.instance [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'pci_devices' on Instance uuid abb95c64-245b-4e3a-bb95-02d86e179dbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.536 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <uuid>abb95c64-245b-4e3a-bb95-02d86e179dbe</uuid>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <name>instance-0000005a</name>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-693331515</nova:name>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:23:32</nova:creationTime>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:user uuid="81b62a8f3edf4f78aeb0b087fd79ebb7">tempest-OvnDvrTest-313060968-project-admin</nova:user>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:project uuid="9d7b1c6f132443b0abac8495ed44621d">tempest-OvnDvrTest-313060968</nova:project>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        <nova:port uuid="c3626423-aef0-4457-a1b2-a1d3997f96d2">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.69" ipVersion="4"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="serial">abb95c64-245b-4e3a-bb95-02d86e179dbe</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="uuid">abb95c64-245b-4e3a-bb95-02d86e179dbe</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.config"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:95:89:aa"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <target dev="tapc3626423-ae"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/console.log" append="off"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:23:32 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:23:32 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:23:32 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:23:32 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.538 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Preparing to wait for external event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.539 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.539 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.539 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.540 2 DEBUG nova.virt.libvirt.vif [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:23:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-server-test-693331515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-693331515',id=90,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4EccSgdKz20vhzCTJVEKl1/Mu3zu8VUU9b5FpQnU5dG12k8nhnJmJgq+Ku3VVX1x/AvGHzkf+c9hXrKFuhBTSr0zWXBTiMXkars7IT0AKScQJV07dnYMXQ1AYcsSb/WA==',key_name='tempest-keypair-1447999901',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-rm4zxywv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:23:29Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=abb95c64-245b-4e3a-bb95-02d86e179dbe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.540 2 DEBUG nova.network.os_vif_util [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.541 2 DEBUG nova.network.os_vif_util [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.542 2 DEBUG os_vif [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.543 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.544 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3626423-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3626423-ae, col_values=(('external_ids', {'iface-id': 'c3626423-aef0-4457-a1b2-a1d3997f96d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:89:aa', 'vm-uuid': 'abb95c64-245b-4e3a-bb95-02d86e179dbe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:32 np0005476733 NetworkManager[51699]: <info>  [1759940612.5510] manager: (tapc3626423-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.563 2 INFO os_vif [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae')#033[00m
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.632 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.632 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.633 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] No VIF found with MAC fa:16:3e:95:89:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 12:23:32 np0005476733 nova_compute[192580]: 2025-10-08 16:23:32.633 2 INFO nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Using config drive
Oct  8 12:23:33 np0005476733 nova_compute[192580]: 2025-10-08 16:23:33.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.711 2 INFO nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Creating config drive at /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.config
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.716 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpok5kr7dh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.846 2 DEBUG oslo_concurrency.processutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpok5kr7dh" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:23:34 np0005476733 NetworkManager[51699]: <info>  [1759940614.9092] manager: (tapc3626423-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/280)
Oct  8 12:23:34 np0005476733 kernel: tapc3626423-ae: entered promiscuous mode
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:34 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:34Z|00870|binding|INFO|Claiming lport c3626423-aef0-4457-a1b2-a1d3997f96d2 for this chassis.
Oct  8 12:23:34 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:34Z|00871|binding|INFO|c3626423-aef0-4457-a1b2-a1d3997f96d2: Claiming fa:16:3e:95:89:aa 10.100.0.69
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.919 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:89:aa 10.100.0.69'], port_security=['fa:16:3e:95:89:aa 10.100.0.69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.69/28', 'neutron:device_id': 'abb95c64-245b-4e3a-bb95-02d86e179dbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0962c3b7-42c9-4039-adcd-00de2c2d903e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=701186e4-ea38-4beb-a81d-64dc7e4e6aec, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c3626423-aef0-4457-a1b2-a1d3997f96d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.920 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c3626423-aef0-4457-a1b2-a1d3997f96d2 in datapath f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 bound to our chassis
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.921 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f3f8093b-d3a9-4ef1-b524-693e6a6f6c96
Oct  8 12:23:34 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:34Z|00872|binding|INFO|Setting lport c3626423-aef0-4457-a1b2-a1d3997f96d2 ovn-installed in OVS
Oct  8 12:23:34 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:34Z|00873|binding|INFO|Setting lport c3626423-aef0-4457-a1b2-a1d3997f96d2 up in Southbound
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:34 np0005476733 nova_compute[192580]: 2025-10-08 16:23:34.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.934 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[920fc5b6-669f-431b-97bf-707ae93e9168]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.935 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf3f8093b-d1 in ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  8 12:23:34 np0005476733 systemd-udevd[260496]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.938 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf3f8093b-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.938 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[530c73e5-f4a6-40d5-bfcd-c0853df5c056]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.940 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8852129c-e74f-47a2-b230-3e29ec9356b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:34 np0005476733 NetworkManager[51699]: <info>  [1759940614.9526] device (tapc3626423-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:23:34 np0005476733 NetworkManager[51699]: <info>  [1759940614.9535] device (tapc3626423-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:23:34 np0005476733 systemd-machined[152624]: New machine qemu-56-instance-0000005a.
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.957 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[47e0291e-9796-4c32-bd3d-eba078f33175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:34 np0005476733 systemd[1]: Started Virtual Machine qemu-56-instance-0000005a.
Oct  8 12:23:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:34.980 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[50fd9194-907c-4f62-9dc8-94818ffe80bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.010 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[972be547-d132-4473-9c37-b8b61c482d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 NetworkManager[51699]: <info>  [1759940615.0157] manager: (tapf3f8093b-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/281)
Oct  8 12:23:35 np0005476733 systemd-udevd[260500]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.014 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba8162d-b641-4cae-9435-e8348e0ce5ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.049 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5e12593f-3726-40fe-a482-4ed94045792c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.052 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e16980f0-3e10-46b5-b415-1d35d155b3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 NetworkManager[51699]: <info>  [1759940615.0793] device (tapf3f8093b-d0): carrier: link connected
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.087 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[77d91165-a83c-42c9-899f-d9d66fd005ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.111 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[89ca597c-8a5d-4d47-8c8e-3632c5b4fca9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f8093b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:40:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 751874, 'reachable_time': 36145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260529, 'error': None, 'target': 'ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.129 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aedf8003-3e1f-4981-824f-c63bf50a104a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:40c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 751874, 'tstamp': 751874}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260530, 'error': None, 'target': 'ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.149 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[85528857-dcfc-4be2-997b-f32ef8c2e31a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3f8093b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:40:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 751874, 'reachable_time': 36145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260531, 'error': None, 'target': 'ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.185 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ff186500-9da1-4f97-a1c2-86b3f9097abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.271 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[59412fb4-7870-4c77-9c26-a7b31a90bddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.273 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f8093b-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.273 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.274 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3f8093b-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:23:35 np0005476733 kernel: tapf3f8093b-d0: entered promiscuous mode
Oct  8 12:23:35 np0005476733 NetworkManager[51699]: <info>  [1759940615.2870] manager: (tapf3f8093b-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.291 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf3f8093b-d0, col_values=(('external_ids', {'iface-id': '41eec871-bf2d-4ba7-9d74-04f2f9a2085e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:23:35 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:35Z|00874|binding|INFO|Releasing lport 41eec871-bf2d-4ba7-9d74-04f2f9a2085e from this chassis (sb_readonly=0)
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.294 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f3f8093b-d3a9-4ef1-b524-693e6a6f6c96.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f3f8093b-d3a9-4ef1-b524-693e6a6f6c96.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.307 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[54c3dc5b-a5c3-4041-b613-ab022142fd38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.308 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/f3f8093b-d3a9-4ef1-b524-693e6a6f6c96.pid.haproxy
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID f3f8093b-d3a9-4ef1-b524-693e6a6f6c96
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  8 12:23:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:35.309 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'env', 'PROCESS_TAG=haproxy-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f3f8093b-d3a9-4ef1-b524-693e6a6f6c96.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:23:35 np0005476733 podman[260571]: 2025-10-08 16:23:35.700438307 +0000 UTC m=+0.048058367 container create 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:23:35 np0005476733 systemd[1]: Started libpod-conmon-8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8.scope.
Oct  8 12:23:35 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:23:35 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7feab23eac1fc1523fa166e3029706c1b669dd716b65a48545fb9e10960f6f7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:23:35 np0005476733 podman[260571]: 2025-10-08 16:23:35.676554739 +0000 UTC m=+0.024174829 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:23:35 np0005476733 podman[260571]: 2025-10-08 16:23:35.779986603 +0000 UTC m=+0.127606673 container init 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:23:35 np0005476733 podman[260571]: 2025-10-08 16:23:35.786490078 +0000 UTC m=+0.134110138 container start 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:23:35 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [NOTICE]   (260590) : New worker (260592) forked
Oct  8 12:23:35 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [NOTICE]   (260590) : Loading success.
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.874 2 DEBUG nova.compute.manager [req-e47a5925-045b-4584-8cc1-721b2ec83fdb req-46d0c7ee-e083-4ca7-8efc-3279ab8351be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.875 2 DEBUG oslo_concurrency.lockutils [req-e47a5925-045b-4584-8cc1-721b2ec83fdb req-46d0c7ee-e083-4ca7-8efc-3279ab8351be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.875 2 DEBUG oslo_concurrency.lockutils [req-e47a5925-045b-4584-8cc1-721b2ec83fdb req-46d0c7ee-e083-4ca7-8efc-3279ab8351be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.876 2 DEBUG oslo_concurrency.lockutils [req-e47a5925-045b-4584-8cc1-721b2ec83fdb req-46d0c7ee-e083-4ca7-8efc-3279ab8351be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.876 2 DEBUG nova.compute.manager [req-e47a5925-045b-4584-8cc1-721b2ec83fdb req-46d0c7ee-e083-4ca7-8efc-3279ab8351be 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Processing event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.948 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940615.9484997, abb95c64-245b-4e3a-bb95-02d86e179dbe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.949 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] VM Started (Lifecycle Event)#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.950 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.953 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.956 2 INFO nova.virt.libvirt.driver [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Instance spawned successfully.#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.956 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.985 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.988 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.998 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:35 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.999 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:35.999 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.000 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.000 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.001 2 DEBUG nova.virt.libvirt.driver [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.035 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.036 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940615.9486222, abb95c64-245b-4e3a-bb95-02d86e179dbe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.036 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.075 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.078 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940615.9529848, abb95c64-245b-4e3a-bb95-02d86e179dbe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.079 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.091 2 INFO nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Took 6.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.092 2 DEBUG nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.108 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.111 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.139 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.177 2 INFO nova.compute.manager [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Took 7.31 seconds to build instance.#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.196 2 DEBUG oslo_concurrency.lockutils [None req-7c63e497-441d-4de3-ae70-d66bf4f88994 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:36.925 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:23:36 np0005476733 nova_compute[192580]: 2025-10-08 16:23:36.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:36.927 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.747 2 DEBUG nova.network.neutron [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updated VIF entry in instance network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.748 2 DEBUG nova.network.neutron [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.766 2 DEBUG oslo_concurrency.lockutils [req-bf3e291c-ba51-411a-8668-74d81ba8aa2d req-5d914400-7130-4dab-89c3-e19eded1e624 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.953 2 DEBUG nova.compute.manager [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.953 2 DEBUG oslo_concurrency.lockutils [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.953 2 DEBUG oslo_concurrency.lockutils [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.953 2 DEBUG oslo_concurrency.lockutils [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.954 2 DEBUG nova.compute.manager [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] No waiting events found dispatching network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:23:37 np0005476733 nova_compute[192580]: 2025-10-08 16:23:37.954 2 WARNING nova.compute.manager [req-a3679dbe-2619-47f8-87a6-c592fd008cc7 req-c55c6e06-4f7b-49a4-ac0a-811fb5ec1eff 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received unexpected event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:23:38 np0005476733 nova_compute[192580]: 2025-10-08 16:23:38.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:39 np0005476733 podman[260601]: 2025-10-08 16:23:39.220987939 +0000 UTC m=+0.049542434 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.354 2 DEBUG nova.compute.manager [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.355 2 DEBUG nova.compute.manager [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing instance network info cache due to event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.355 2 DEBUG oslo_concurrency.lockutils [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.355 2 DEBUG oslo_concurrency.lockutils [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.356 2 DEBUG nova.network.neutron [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:23:42 np0005476733 nova_compute[192580]: 2025-10-08 16:23:42.885 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:23:43 np0005476733 systemd-logind[827]: New session 133 of user zuul.
Oct  8 12:23:43 np0005476733 systemd[1]: Started Session 133 of User zuul.
Oct  8 12:23:43 np0005476733 nova_compute[192580]: 2025-10-08 16:23:43.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:43 np0005476733 systemd-logind[827]: New session 134 of user zuul.
Oct  8 12:23:43 np0005476733 systemd[1]: Started Session 134 of User zuul.
Oct  8 12:23:44 np0005476733 systemd[1]: session-134.scope: Deactivated successfully.
Oct  8 12:23:44 np0005476733 systemd-logind[827]: Session 134 logged out. Waiting for processes to exit.
Oct  8 12:23:44 np0005476733 systemd-logind[827]: Removed session 134.
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.373 2 DEBUG nova.network.neutron [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updated VIF entry in instance network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.373 2 DEBUG nova.network.neutron [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.414 2 DEBUG oslo_concurrency.lockutils [req-813d189d-571f-4269-a0d6-3d5e79e884fc req-43a9e341-f77f-40d5-abc0-00a47e3b2b6d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.415 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.415 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:23:44 np0005476733 nova_compute[192580]: 2025-10-08 16:23:44.415 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid abb95c64-245b-4e3a-bb95-02d86e179dbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:23:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:23:44.929 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:23:45 np0005476733 podman[260679]: 2025-10-08 16:23:45.243142305 +0000 UTC m=+0.066217193 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:23:45 np0005476733 podman[260678]: 2025-10-08 16:23:45.2643885 +0000 UTC m=+0.089940606 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.737 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.756 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.756 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.756 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.756 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.778 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.779 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.779 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.779 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:23:46 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:46Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:89:aa 10.100.0.69
Oct  8 12:23:46 np0005476733 ovn_controller[94857]: 2025-10-08T16:23:46Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:89:aa 10.100.0.69
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.857 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.919 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.920 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:23:46 np0005476733 nova_compute[192580]: 2025-10-08 16:23:46.976 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.122 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.123 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13444MB free_disk=111.287841796875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.124 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.124 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.219 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance abb95c64-245b-4e3a-bb95-02d86e179dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.220 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.220 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.260 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.287 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.317 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.317 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:23:47 np0005476733 nova_compute[192580]: 2025-10-08 16:23:47.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:48 np0005476733 nova_compute[192580]: 2025-10-08 16:23:48.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:50 np0005476733 podman[260740]: 2025-10-08 16:23:50.233998413 +0000 UTC m=+0.060129680 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:23:50 np0005476733 podman[260741]: 2025-10-08 16:23:50.237543815 +0000 UTC m=+0.060855893 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 12:23:50 np0005476733 podman[260739]: 2025-10-08 16:23:50.252883282 +0000 UTC m=+0.078742341 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:23:52 np0005476733 nova_compute[192580]: 2025-10-08 16:23:52.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:53 np0005476733 nova_compute[192580]: 2025-10-08 16:23:53.150 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:23:53 np0005476733 nova_compute[192580]: 2025-10-08 16:23:53.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:57 np0005476733 nova_compute[192580]: 2025-10-08 16:23:57.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:23:58 np0005476733 nova_compute[192580]: 2025-10-08 16:23:58.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:00 np0005476733 podman[260810]: 2025-10-08 16:24:00.242774048 +0000 UTC m=+0.064876261 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 12:24:00 np0005476733 podman[260811]: 2025-10-08 16:24:00.245163383 +0000 UTC m=+0.061306347 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:24:01 np0005476733 systemd-logind[827]: New session 135 of user zuul.
Oct  8 12:24:01 np0005476733 systemd[1]: Started Session 135 of User zuul.
Oct  8 12:24:02 np0005476733 nova_compute[192580]: 2025-10-08 16:24:02.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:03 np0005476733 nova_compute[192580]: 2025-10-08 16:24:03.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:06 np0005476733 systemd-logind[827]: New session 136 of user zuul.
Oct  8 12:24:06 np0005476733 systemd[1]: Started Session 136 of User zuul.
Oct  8 12:24:07 np0005476733 systemd[1]: session-136.scope: Deactivated successfully.
Oct  8 12:24:07 np0005476733 systemd-logind[827]: Session 136 logged out. Waiting for processes to exit.
Oct  8 12:24:07 np0005476733 systemd-logind[827]: Removed session 136.
Oct  8 12:24:07 np0005476733 nova_compute[192580]: 2025-10-08 16:24:07.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:08 np0005476733 nova_compute[192580]: 2025-10-08 16:24:08.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:09 np0005476733 podman[260920]: 2025-10-08 16:24:09.611250186 +0000 UTC m=+0.069853179 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.450 2 DEBUG nova.compute.manager [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.451 2 DEBUG nova.compute.manager [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing instance network info cache due to event network-changed-c3626423-aef0-4457-a1b2-a1d3997f96d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.451 2 DEBUG oslo_concurrency.lockutils [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.451 2 DEBUG oslo_concurrency.lockutils [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.451 2 DEBUG nova.network.neutron [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Refreshing network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.542 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.542 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.542 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.543 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.543 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.544 2 INFO nova.compute.manager [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Terminating instance#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.545 2 DEBUG nova.compute.manager [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:24:10 np0005476733 kernel: tapc3626423-ae (unregistering): left promiscuous mode
Oct  8 12:24:10 np0005476733 NetworkManager[51699]: <info>  [1759940650.5785] device (tapc3626423-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 ovn_controller[94857]: 2025-10-08T16:24:10Z|00875|binding|INFO|Releasing lport c3626423-aef0-4457-a1b2-a1d3997f96d2 from this chassis (sb_readonly=0)
Oct  8 12:24:10 np0005476733 ovn_controller[94857]: 2025-10-08T16:24:10Z|00876|binding|INFO|Setting lport c3626423-aef0-4457-a1b2-a1d3997f96d2 down in Southbound
Oct  8 12:24:10 np0005476733 ovn_controller[94857]: 2025-10-08T16:24:10Z|00877|binding|INFO|Removing iface tapc3626423-ae ovn-installed in OVS
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.596 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:89:aa 10.100.0.69'], port_security=['fa:16:3e:95:89:aa 10.100.0.69'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.69/28', 'neutron:device_id': 'abb95c64-245b-4e3a-bb95-02d86e179dbe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d7b1c6f132443b0abac8495ed44621d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0962c3b7-42c9-4039-adcd-00de2c2d903e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=701186e4-ea38-4beb-a81d-64dc7e4e6aec, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=c3626423-aef0-4457-a1b2-a1d3997f96d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.597 103739 INFO neutron.agent.ovn.metadata.agent [-] Port c3626423-aef0-4457-a1b2-a1d3997f96d2 in datapath f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 unbound from our chassis#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.598 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.599 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a6c97c-a0f6-47ae-a00c-07b24b39cb21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.600 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 namespace which is not needed anymore#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Oct  8 12:24:10 np0005476733 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000005a.scope: Consumed 12.802s CPU time.
Oct  8 12:24:10 np0005476733 systemd-machined[152624]: Machine qemu-56-instance-0000005a terminated.
Oct  8 12:24:10 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [NOTICE]   (260590) : haproxy version is 2.8.14-c23fe91
Oct  8 12:24:10 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [NOTICE]   (260590) : path to executable is /usr/sbin/haproxy
Oct  8 12:24:10 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [WARNING]  (260590) : Exiting Master process...
Oct  8 12:24:10 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [ALERT]    (260590) : Current worker (260592) exited with code 143 (Terminated)
Oct  8 12:24:10 np0005476733 neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96[260586]: [WARNING]  (260590) : All workers exited. Exiting... (0)
Oct  8 12:24:10 np0005476733 systemd[1]: libpod-8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8.scope: Deactivated successfully.
Oct  8 12:24:10 np0005476733 podman[260966]: 2025-10-08 16:24:10.742662453 +0000 UTC m=+0.044308477 container died 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8-userdata-shm.mount: Deactivated successfully.
Oct  8 12:24:10 np0005476733 systemd[1]: var-lib-containers-storage-overlay-7feab23eac1fc1523fa166e3029706c1b669dd716b65a48545fb9e10960f6f7c-merged.mount: Deactivated successfully.
Oct  8 12:24:10 np0005476733 podman[260966]: 2025-10-08 16:24:10.78982259 +0000 UTC m=+0.091468614 container cleanup 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:24:10 np0005476733 systemd[1]: libpod-conmon-8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8.scope: Deactivated successfully.
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.810 2 INFO nova.virt.libvirt.driver [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Instance destroyed successfully.#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.811 2 DEBUG nova.objects.instance [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lazy-loading 'resources' on Instance uuid abb95c64-245b-4e3a-bb95-02d86e179dbe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.848 2 DEBUG nova.virt.libvirt.vif [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:23:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-server-test-693331515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-693331515',id=90,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO4EccSgdKz20vhzCTJVEKl1/Mu3zu8VUU9b5FpQnU5dG12k8nhnJmJgq+Ku3VVX1x/AvGHzkf+c9hXrKFuhBTSr0zWXBTiMXkars7IT0AKScQJV07dnYMXQ1AYcsSb/WA==',key_name='tempest-keypair-1447999901',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:23:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d7b1c6f132443b0abac8495ed44621d',ramdisk_id='',reservation_id='r-rm4zxywv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_
machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDvrTest-313060968',owner_user_name='tempest-OvnDvrTest-313060968-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:23:36Z,user_data=None,user_id='81b62a8f3edf4f78aeb0b087fd79ebb7',uuid=abb95c64-245b-4e3a-bb95-02d86e179dbe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.849 2 DEBUG nova.network.os_vif_util [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converting VIF {"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.850 2 DEBUG nova.network.os_vif_util [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.850 2 DEBUG os_vif [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.852 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3626423-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.859 2 INFO os_vif [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:89:aa,bridge_name='br-int',has_traffic_filtering=True,id=c3626423-aef0-4457-a1b2-a1d3997f96d2,network=Network(f3f8093b-d3a9-4ef1-b524-693e6a6f6c96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3626423-ae')#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.860 2 INFO nova.virt.libvirt.driver [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Deleting instance files /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe_del#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.861 2 INFO nova.virt.libvirt.driver [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Deletion of /var/lib/nova/instances/abb95c64-245b-4e3a-bb95-02d86e179dbe_del complete#033[00m
Oct  8 12:24:10 np0005476733 podman[261010]: 2025-10-08 16:24:10.866611268 +0000 UTC m=+0.045479495 container remove 8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.871 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a02332-1bb2-4f3a-a6aa-81382c09aad4]: (4, ('Wed Oct  8 04:24:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 (8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8)\n8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8\nWed Oct  8 04:24:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 (8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8)\n8cd7e4937c2ecee6fd37509da14313d3357336a4e7e3c167fc0740cd104730b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.873 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4b53236c-aae1-4902-8a20-e90ef256739b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.874 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f8093b-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 kernel: tapf3f8093b-d0: left promiscuous mode
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.890 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd1fdfb-3dde-4cc1-9857-635702822d37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.916 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7c66da87-ac9d-4cd8-99e2-87119783f29c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.917 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32b83ddc-c774-4b93-9eff-b442d723d61d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.935 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[99df60ab-0276-4adb-949b-08bb1f44e565]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 751866, 'reachable_time': 22834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261023, 'error': None, 'target': 'ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.938 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f3f8093b-d3a9-4ef1-b524-693e6a6f6c96 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:24:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:10.939 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[07abb5b5-c176-4224-9d6f-f57beba45214]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:24:10 np0005476733 systemd[1]: run-netns-ovnmeta\x2df3f8093b\x2dd3a9\x2d4ef1\x2db524\x2d693e6a6f6c96.mount: Deactivated successfully.
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.972 2 INFO nova.compute.manager [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.973 2 DEBUG oslo.service.loopingcall [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.973 2 DEBUG nova.compute.manager [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:24:10 np0005476733 nova_compute[192580]: 2025-10-08 16:24:10.974 2 DEBUG nova.network.neutron [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.801 2 DEBUG nova.network.neutron [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.831 2 INFO nova.compute.manager [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Took 0.86 seconds to deallocate network for instance.#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.886 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.887 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.957 2 DEBUG nova.compute.provider_tree [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:24:11 np0005476733 nova_compute[192580]: 2025-10-08 16:24:11.976 2 DEBUG nova.scheduler.client.report [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.004 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.039 2 INFO nova.scheduler.client.report [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Deleted allocations for instance abb95c64-245b-4e3a-bb95-02d86e179dbe#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.150 2 DEBUG oslo_concurrency.lockutils [None req-4f51c62c-a0d3-4ae6-8125-7334b2c455d6 81b62a8f3edf4f78aeb0b087fd79ebb7 9d7b1c6f132443b0abac8495ed44621d - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.155 2 DEBUG nova.network.neutron [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updated VIF entry in instance network info cache for port c3626423-aef0-4457-a1b2-a1d3997f96d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.156 2 DEBUG nova.network.neutron [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Updating instance_info_cache with network_info: [{"id": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "address": "fa:16:3e:95:89:aa", "network": {"id": "f3f8093b-d3a9-4ef1-b524-693e6a6f6c96", "bridge": "br-int", "label": "tempest-test-network--1786492812", "subnets": [{"cidr": "10.100.0.64/28", "dns": [], "gateway": {"address": "10.100.0.65", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0433a72056854da48c168f13bcf53e59", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3626423-ae", "ovs_interfaceid": "c3626423-aef0-4457-a1b2-a1d3997f96d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.181 2 DEBUG oslo_concurrency.lockutils [req-e75571f2-4555-4fb0-a85c-a183182d7aa2 req-b6ca6876-de11-488e-a808-b8672c9b87db 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-abb95c64-245b-4e3a-bb95-02d86e179dbe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.564 2 DEBUG nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-vif-unplugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.564 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.564 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.564 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.565 2 DEBUG nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] No waiting events found dispatching network-vif-unplugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.565 2 WARNING nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received unexpected event network-vif-unplugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.565 2 DEBUG nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.565 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.565 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.566 2 DEBUG oslo_concurrency.lockutils [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "abb95c64-245b-4e3a-bb95-02d86e179dbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.566 2 DEBUG nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] No waiting events found dispatching network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.566 2 WARNING nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received unexpected event network-vif-plugged-c3626423-aef0-4457-a1b2-a1d3997f96d2 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:24:12 np0005476733 nova_compute[192580]: 2025-10-08 16:24:12.566 2 DEBUG nova.compute.manager [req-3230d062-9561-4bfd-95d2-c13d9f7bf1cf req-17a4cb2e-d656-4a98-874b-95a5631d6a63 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Received event network-vif-deleted-c3626423-aef0-4457-a1b2-a1d3997f96d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:24:13 np0005476733 nova_compute[192580]: 2025-10-08 16:24:13.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:15 np0005476733 nova_compute[192580]: 2025-10-08 16:24:15.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:16 np0005476733 podman[261030]: 2025-10-08 16:24:16.268368399 +0000 UTC m=+0.083486051 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:24:16 np0005476733 podman[261029]: 2025-10-08 16:24:16.295306035 +0000 UTC m=+0.115261780 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:24:18 np0005476733 nova_compute[192580]: 2025-10-08 16:24:18.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:20 np0005476733 nova_compute[192580]: 2025-10-08 16:24:20.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:20 np0005476733 nova_compute[192580]: 2025-10-08 16:24:20.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:24:21Z|00878|pinctrl|WARN|Dropped 1035 log messages in last 57 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:24:21 np0005476733 ovn_controller[94857]: 2025-10-08T16:24:21Z|00879|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:24:21 np0005476733 podman[261076]: 2025-10-08 16:24:21.233513232 +0000 UTC m=+0.057922170 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:24:21 np0005476733 podman[261077]: 2025-10-08 16:24:21.236002481 +0000 UTC m=+0.057113115 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:24:21 np0005476733 podman[261075]: 2025-10-08 16:24:21.263928707 +0000 UTC m=+0.090225075 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  8 12:24:23 np0005476733 nova_compute[192580]: 2025-10-08 16:24:23.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:25 np0005476733 nova_compute[192580]: 2025-10-08 16:24:25.808 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940650.8066685, abb95c64-245b-4e3a-bb95-02d86e179dbe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:24:25 np0005476733 nova_compute[192580]: 2025-10-08 16:24:25.808 2 INFO nova.compute.manager [-] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:24:25 np0005476733 nova_compute[192580]: 2025-10-08 16:24:25.831 2 DEBUG nova.compute.manager [None req-32b5d607-fb82-4da2-af3f-e7700d199829 - - - - - -] [instance: abb95c64-245b-4e3a-bb95-02d86e179dbe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:24:25 np0005476733 nova_compute[192580]: 2025-10-08 16:24:25.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:26.382 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:26.382 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:24:26.382 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:28 np0005476733 nova_compute[192580]: 2025-10-08 16:24:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:28 np0005476733 nova_compute[192580]: 2025-10-08 16:24:28.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:29 np0005476733 nova_compute[192580]: 2025-10-08 16:24:29.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:30 np0005476733 nova_compute[192580]: 2025-10-08 16:24:30.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:31 np0005476733 podman[261137]: 2025-10-08 16:24:31.243877697 +0000 UTC m=+0.064387405 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:24:31 np0005476733 podman[261136]: 2025-10-08 16:24:31.260123122 +0000 UTC m=+0.082530480 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 12:24:33 np0005476733 nova_compute[192580]: 2025-10-08 16:24:33.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:33 np0005476733 nova_compute[192580]: 2025-10-08 16:24:33.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:35 np0005476733 nova_compute[192580]: 2025-10-08 16:24:35.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:24:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:24:36 np0005476733 nova_compute[192580]: 2025-10-08 16:24:36.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:36 np0005476733 nova_compute[192580]: 2025-10-08 16:24:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:37 np0005476733 nova_compute[192580]: 2025-10-08 16:24:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:37 np0005476733 nova_compute[192580]: 2025-10-08 16:24:37.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:24:38 np0005476733 nova_compute[192580]: 2025-10-08 16:24:38.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:40 np0005476733 podman[261179]: 2025-10-08 16:24:40.264816752 +0000 UTC m=+0.086416624 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 12:24:40 np0005476733 nova_compute[192580]: 2025-10-08 16:24:40.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:42 np0005476733 nova_compute[192580]: 2025-10-08 16:24:42.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:42 np0005476733 nova_compute[192580]: 2025-10-08 16:24:42.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:24:42 np0005476733 nova_compute[192580]: 2025-10-08 16:24:42.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:24:42 np0005476733 nova_compute[192580]: 2025-10-08 16:24:42.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:24:43 np0005476733 nova_compute[192580]: 2025-10-08 16:24:43.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:44 np0005476733 nova_compute[192580]: 2025-10-08 16:24:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:45 np0005476733 nova_compute[192580]: 2025-10-08 16:24:45.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.632 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:24:46 np0005476733 podman[261205]: 2025-10-08 16:24:46.747782446 +0000 UTC m=+0.061172163 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:24:46 np0005476733 podman[261203]: 2025-10-08 16:24:46.793205878 +0000 UTC m=+0.100466040 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.822 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.823 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13655MB free_disk=111.31312942504883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.823 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.823 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.929 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:24:46 np0005476733 nova_compute[192580]: 2025-10-08 16:24:46.954 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:24:47 np0005476733 nova_compute[192580]: 2025-10-08 16:24:47.007 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:24:47 np0005476733 nova_compute[192580]: 2025-10-08 16:24:47.008 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:24:47 np0005476733 systemd-logind[827]: New session 137 of user zuul.
Oct  8 12:24:47 np0005476733 systemd[1]: Started Session 137 of User zuul.
Oct  8 12:24:47 np0005476733 systemd-logind[827]: New session 138 of user zuul.
Oct  8 12:24:47 np0005476733 systemd[1]: Started Session 138 of User zuul.
Oct  8 12:24:48 np0005476733 systemd[1]: session-138.scope: Deactivated successfully.
Oct  8 12:24:48 np0005476733 systemd-logind[827]: Session 138 logged out. Waiting for processes to exit.
Oct  8 12:24:48 np0005476733 systemd-logind[827]: Removed session 138.
Oct  8 12:24:48 np0005476733 nova_compute[192580]: 2025-10-08 16:24:48.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:50 np0005476733 nova_compute[192580]: 2025-10-08 16:24:50.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:52 np0005476733 podman[261311]: 2025-10-08 16:24:52.241100985 +0000 UTC m=+0.063804797 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:24:52 np0005476733 podman[261310]: 2025-10-08 16:24:52.245007349 +0000 UTC m=+0.068354571 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 12:24:52 np0005476733 podman[261312]: 2025-10-08 16:24:52.246704333 +0000 UTC m=+0.069848889 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:24:53 np0005476733 nova_compute[192580]: 2025-10-08 16:24:53.001 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:53 np0005476733 nova_compute[192580]: 2025-10-08 16:24:53.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:54 np0005476733 nova_compute[192580]: 2025-10-08 16:24:54.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:24:55 np0005476733 nova_compute[192580]: 2025-10-08 16:24:55.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:24:58 np0005476733 nova_compute[192580]: 2025-10-08 16:24:58.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:00 np0005476733 nova_compute[192580]: 2025-10-08 16:25:00.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:02 np0005476733 podman[261383]: 2025-10-08 16:25:02.233959424 +0000 UTC m=+0.059646565 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:25:02 np0005476733 podman[261382]: 2025-10-08 16:25:02.250352854 +0000 UTC m=+0.066598126 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:25:03 np0005476733 nova_compute[192580]: 2025-10-08 16:25:03.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:04 np0005476733 systemd-logind[827]: New session 139 of user zuul.
Oct  8 12:25:04 np0005476733 systemd[1]: Started Session 139 of User zuul.
Oct  8 12:25:05 np0005476733 nova_compute[192580]: 2025-10-08 16:25:05.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:08 np0005476733 nova_compute[192580]: 2025-10-08 16:25:08.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:09 np0005476733 systemd-logind[827]: New session 140 of user zuul.
Oct  8 12:25:09 np0005476733 systemd[1]: Started Session 140 of User zuul.
Oct  8 12:25:09 np0005476733 systemd[1]: session-140.scope: Deactivated successfully.
Oct  8 12:25:09 np0005476733 systemd-logind[827]: Session 140 logged out. Waiting for processes to exit.
Oct  8 12:25:09 np0005476733 systemd-logind[827]: Removed session 140.
Oct  8 12:25:10 np0005476733 nova_compute[192580]: 2025-10-08 16:25:10.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:11 np0005476733 podman[261489]: 2025-10-08 16:25:11.240874245 +0000 UTC m=+0.061075929 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:25:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:13.044 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:25:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:13.045 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:25:13 np0005476733 nova_compute[192580]: 2025-10-08 16:25:13.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:13 np0005476733 nova_compute[192580]: 2025-10-08 16:25:13.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:15.048 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:25:15 np0005476733 nova_compute[192580]: 2025-10-08 16:25:15.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:17 np0005476733 podman[261508]: 2025-10-08 16:25:17.250177575 +0000 UTC m=+0.067567966 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:25:17 np0005476733 podman[261507]: 2025-10-08 16:25:17.269786507 +0000 UTC m=+0.098113825 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:25:18 np0005476733 nova_compute[192580]: 2025-10-08 16:25:18.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:25:20Z|00880|pinctrl|WARN|Dropped 979 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:25:20 np0005476733 ovn_controller[94857]: 2025-10-08T16:25:20Z|00881|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:25:20 np0005476733 nova_compute[192580]: 2025-10-08 16:25:20.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:23 np0005476733 podman[261556]: 2025-10-08 16:25:23.226985422 +0000 UTC m=+0.051712963 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:25:23 np0005476733 podman[261557]: 2025-10-08 16:25:23.234571172 +0000 UTC m=+0.056201555 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Oct  8 12:25:23 np0005476733 podman[261555]: 2025-10-08 16:25:23.256835399 +0000 UTC m=+0.084438452 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  8 12:25:23 np0005476733 nova_compute[192580]: 2025-10-08 16:25:23.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:25 np0005476733 nova_compute[192580]: 2025-10-08 16:25:25.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:26.383 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:26.383 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:25:26.384 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:25:28 np0005476733 nova_compute[192580]: 2025-10-08 16:25:28.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:29 np0005476733 nova_compute[192580]: 2025-10-08 16:25:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:31 np0005476733 nova_compute[192580]: 2025-10-08 16:25:31.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:32 np0005476733 podman[261625]: 2025-10-08 16:25:32.491119657 +0000 UTC m=+0.078374169 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:25:32 np0005476733 podman[261624]: 2025-10-08 16:25:32.492300504 +0000 UTC m=+0.087909901 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:25:33 np0005476733 nova_compute[192580]: 2025-10-08 16:25:33.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:34 np0005476733 nova_compute[192580]: 2025-10-08 16:25:34.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:36 np0005476733 nova_compute[192580]: 2025-10-08 16:25:36.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:37 np0005476733 nova_compute[192580]: 2025-10-08 16:25:37.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:37 np0005476733 nova_compute[192580]: 2025-10-08 16:25:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:38 np0005476733 nova_compute[192580]: 2025-10-08 16:25:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:38 np0005476733 nova_compute[192580]: 2025-10-08 16:25:38.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:38 np0005476733 nova_compute[192580]: 2025-10-08 16:25:38.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:25:38 np0005476733 nova_compute[192580]: 2025-10-08 16:25:38.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:41 np0005476733 nova_compute[192580]: 2025-10-08 16:25:41.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:42 np0005476733 podman[261670]: 2025-10-08 16:25:42.24342038 +0000 UTC m=+0.069725554 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  8 12:25:43 np0005476733 nova_compute[192580]: 2025-10-08 16:25:43.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:44 np0005476733 nova_compute[192580]: 2025-10-08 16:25:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:44 np0005476733 nova_compute[192580]: 2025-10-08 16:25:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:25:44 np0005476733 nova_compute[192580]: 2025-10-08 16:25:44.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:25:44 np0005476733 nova_compute[192580]: 2025-10-08 16:25:44.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:25:45 np0005476733 nova_compute[192580]: 2025-10-08 16:25:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:46 np0005476733 nova_compute[192580]: 2025-10-08 16:25:46.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:48 np0005476733 podman[261695]: 2025-10-08 16:25:48.270990188 +0000 UTC m=+0.096336299 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:25:48 np0005476733 podman[261696]: 2025-10-08 16:25:48.278709993 +0000 UTC m=+0.093005123 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.622 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.793 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.794 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13642MB free_disk=111.3127212524414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.794 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.794 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.888 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.888 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.926 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.943 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.945 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.945 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:25:48 np0005476733 nova_compute[192580]: 2025-10-08 16:25:48.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:51 np0005476733 nova_compute[192580]: 2025-10-08 16:25:51.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:53 np0005476733 nova_compute[192580]: 2025-10-08 16:25:53.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:54 np0005476733 podman[261744]: 2025-10-08 16:25:54.244051696 +0000 UTC m=+0.073455023 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  8 12:25:54 np0005476733 podman[261746]: 2025-10-08 16:25:54.283184158 +0000 UTC m=+0.090681820 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:25:54 np0005476733 podman[261745]: 2025-10-08 16:25:54.28324144 +0000 UTC m=+0.112807342 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:25:56 np0005476733 nova_compute[192580]: 2025-10-08 16:25:56.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:25:56 np0005476733 nova_compute[192580]: 2025-10-08 16:25:56.947 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:25:58 np0005476733 nova_compute[192580]: 2025-10-08 16:25:58.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:01 np0005476733 nova_compute[192580]: 2025-10-08 16:26:01.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:03 np0005476733 podman[261805]: 2025-10-08 16:26:03.22872942 +0000 UTC m=+0.050020299 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:26:03 np0005476733 podman[261804]: 2025-10-08 16:26:03.238589113 +0000 UTC m=+0.058758956 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct  8 12:26:04 np0005476733 nova_compute[192580]: 2025-10-08 16:26:04.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:06 np0005476733 nova_compute[192580]: 2025-10-08 16:26:06.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:09 np0005476733 nova_compute[192580]: 2025-10-08 16:26:09.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:11 np0005476733 nova_compute[192580]: 2025-10-08 16:26:11.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:11 np0005476733 systemd-logind[827]: New session 141 of user zuul.
Oct  8 12:26:11 np0005476733 systemd[1]: Started Session 141 of User zuul.
Oct  8 12:26:11 np0005476733 systemd-logind[827]: New session 142 of user zuul.
Oct  8 12:26:11 np0005476733 systemd[1]: Started Session 142 of User zuul.
Oct  8 12:26:11 np0005476733 systemd[1]: session-142.scope: Deactivated successfully.
Oct  8 12:26:11 np0005476733 systemd-logind[827]: Session 142 logged out. Waiting for processes to exit.
Oct  8 12:26:11 np0005476733 systemd-logind[827]: Removed session 142.
Oct  8 12:26:12 np0005476733 podman[261911]: 2025-10-08 16:26:12.53114427 +0000 UTC m=+0.057110224 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  8 12:26:14 np0005476733 nova_compute[192580]: 2025-10-08 16:26:14.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:15 np0005476733 nova_compute[192580]: 2025-10-08 16:26:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:15 np0005476733 nova_compute[192580]: 2025-10-08 16:26:15.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:26:16 np0005476733 nova_compute[192580]: 2025-10-08 16:26:16.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:19 np0005476733 nova_compute[192580]: 2025-10-08 16:26:19.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:19 np0005476733 podman[261936]: 2025-10-08 16:26:19.107138469 +0000 UTC m=+0.070084706 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 12:26:19 np0005476733 podman[261935]: 2025-10-08 16:26:19.125542203 +0000 UTC m=+0.091749674 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:26:20 np0005476733 systemd-logind[827]: New session 143 of user zuul.
Oct  8 12:26:20 np0005476733 systemd[1]: Started Session 143 of User zuul.
Oct  8 12:26:21 np0005476733 nova_compute[192580]: 2025-10-08 16:26:21.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:26:22Z|00882|pinctrl|WARN|Dropped 817 log messages in last 61 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 12:26:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:26:22Z|00883|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:26:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:22.208 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:26:22 np0005476733 nova_compute[192580]: 2025-10-08 16:26:22.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:22.209 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:26:23 np0005476733 nova_compute[192580]: 2025-10-08 16:26:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:24 np0005476733 nova_compute[192580]: 2025-10-08 16:26:24.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:25 np0005476733 podman[262014]: 2025-10-08 16:26:25.238500972 +0000 UTC m=+0.059822390 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Oct  8 12:26:25 np0005476733 podman[262012]: 2025-10-08 16:26:25.253449286 +0000 UTC m=+0.074131664 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:26:25 np0005476733 podman[262013]: 2025-10-08 16:26:25.257994311 +0000 UTC m=+0.080175617 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:26.211 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:26:26 np0005476733 nova_compute[192580]: 2025-10-08 16:26:26.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:26.384 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:26.384 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:26:26.384 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:26:29 np0005476733 nova_compute[192580]: 2025-10-08 16:26:29.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:29 np0005476733 nova_compute[192580]: 2025-10-08 16:26:29.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:31 np0005476733 nova_compute[192580]: 2025-10-08 16:26:31.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:31 np0005476733 systemd-logind[827]: New session 144 of user zuul.
Oct  8 12:26:31 np0005476733 systemd[1]: Started Session 144 of User zuul.
Oct  8 12:26:31 np0005476733 systemd-logind[827]: New session 145 of user zuul.
Oct  8 12:26:31 np0005476733 systemd[1]: Started Session 145 of User zuul.
Oct  8 12:26:31 np0005476733 systemd[1]: session-145.scope: Deactivated successfully.
Oct  8 12:26:31 np0005476733 systemd-logind[827]: Session 145 logged out. Waiting for processes to exit.
Oct  8 12:26:31 np0005476733 systemd-logind[827]: Removed session 145.
Oct  8 12:26:34 np0005476733 nova_compute[192580]: 2025-10-08 16:26:34.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:34 np0005476733 podman[262137]: 2025-10-08 16:26:34.239151663 +0000 UTC m=+0.056579807 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:26:34 np0005476733 podman[262136]: 2025-10-08 16:26:34.239151653 +0000 UTC m=+0.058460717 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct  8 12:26:34 np0005476733 nova_compute[192580]: 2025-10-08 16:26:34.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.067 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:26:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:26:36 np0005476733 nova_compute[192580]: 2025-10-08 16:26:36.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:26:38 np0005476733 nova_compute[192580]: 2025-10-08 16:26:38.616 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:26:39 np0005476733 nova_compute[192580]: 2025-10-08 16:26:39.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:39 np0005476733 nova_compute[192580]: 2025-10-08 16:26:39.616 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:40 np0005476733 systemd-logind[827]: New session 146 of user zuul.
Oct  8 12:26:40 np0005476733 systemd[1]: Started Session 146 of User zuul.
Oct  8 12:26:41 np0005476733 nova_compute[192580]: 2025-10-08 16:26:41.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:43 np0005476733 podman[262209]: 2025-10-08 16:26:43.218362363 +0000 UTC m=+0.047642683 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:26:44 np0005476733 nova_compute[192580]: 2025-10-08 16:26:44.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:44 np0005476733 nova_compute[192580]: 2025-10-08 16:26:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:44 np0005476733 nova_compute[192580]: 2025-10-08 16:26:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:26:44 np0005476733 nova_compute[192580]: 2025-10-08 16:26:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:26:44 np0005476733 nova_compute[192580]: 2025-10-08 16:26:44.614 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:26:45 np0005476733 nova_compute[192580]: 2025-10-08 16:26:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:46 np0005476733 nova_compute[192580]: 2025-10-08 16:26:46.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:49 np0005476733 nova_compute[192580]: 2025-10-08 16:26:49.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:49 np0005476733 podman[262229]: 2025-10-08 16:26:49.239117815 +0000 UTC m=+0.062104102 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct  8 12:26:49 np0005476733 podman[262228]: 2025-10-08 16:26:49.261862137 +0000 UTC m=+0.091248798 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.634 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.635 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.792 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.793 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13626MB free_disk=111.31317138671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.793 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.794 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.871 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.871 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.889 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.919 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.920 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.935 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:26:50 np0005476733 nova_compute[192580]: 2025-10-08 16:26:50.970 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:26:51 np0005476733 nova_compute[192580]: 2025-10-08 16:26:51.003 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:26:51 np0005476733 nova_compute[192580]: 2025-10-08 16:26:51.040 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:26:51 np0005476733 nova_compute[192580]: 2025-10-08 16:26:51.042 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:26:51 np0005476733 nova_compute[192580]: 2025-10-08 16:26:51.042 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:26:51 np0005476733 nova_compute[192580]: 2025-10-08 16:26:51.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:53 np0005476733 systemd-logind[827]: New session 147 of user zuul.
Oct  8 12:26:53 np0005476733 systemd[1]: Started Session 147 of User zuul.
Oct  8 12:26:53 np0005476733 systemd-logind[827]: New session 148 of user zuul.
Oct  8 12:26:53 np0005476733 systemd[1]: Started Session 148 of User zuul.
Oct  8 12:26:53 np0005476733 systemd[1]: session-148.scope: Deactivated successfully.
Oct  8 12:26:53 np0005476733 systemd-logind[827]: Session 148 logged out. Waiting for processes to exit.
Oct  8 12:26:53 np0005476733 systemd-logind[827]: Removed session 148.
Oct  8 12:26:54 np0005476733 nova_compute[192580]: 2025-10-08 16:26:54.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:54 np0005476733 ovn_controller[94857]: 2025-10-08T16:26:54Z|00884|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 12:26:55 np0005476733 podman[262339]: 2025-10-08 16:26:55.599161777 +0000 UTC m=+0.087328834 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:26:55 np0005476733 podman[262340]: 2025-10-08 16:26:55.607959826 +0000 UTC m=+0.091475995 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6)
Oct  8 12:26:55 np0005476733 podman[262338]: 2025-10-08 16:26:55.634847679 +0000 UTC m=+0.123214702 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 12:26:56 np0005476733 nova_compute[192580]: 2025-10-08 16:26:56.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:26:57 np0005476733 nova_compute[192580]: 2025-10-08 16:26:57.034 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:57 np0005476733 nova_compute[192580]: 2025-10-08 16:26:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:26:59 np0005476733 nova_compute[192580]: 2025-10-08 16:26:59.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:01 np0005476733 nova_compute[192580]: 2025-10-08 16:27:01.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:04 np0005476733 nova_compute[192580]: 2025-10-08 16:27:04.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:05 np0005476733 podman[262409]: 2025-10-08 16:27:05.246103954 +0000 UTC m=+0.071139710 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:27:05 np0005476733 podman[262408]: 2025-10-08 16:27:05.275316042 +0000 UTC m=+0.103816207 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 12:27:06 np0005476733 nova_compute[192580]: 2025-10-08 16:27:06.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:09 np0005476733 nova_compute[192580]: 2025-10-08 16:27:09.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:11 np0005476733 nova_compute[192580]: 2025-10-08 16:27:11.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:14 np0005476733 nova_compute[192580]: 2025-10-08 16:27:14.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:14 np0005476733 podman[262462]: 2025-10-08 16:27:14.226206111 +0000 UTC m=+0.052718254 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:27:15 np0005476733 systemd-logind[827]: New session 149 of user zuul.
Oct  8 12:27:15 np0005476733 systemd[1]: Started Session 149 of User zuul.
Oct  8 12:27:16 np0005476733 nova_compute[192580]: 2025-10-08 16:27:16.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:19 np0005476733 nova_compute[192580]: 2025-10-08 16:27:19.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:20 np0005476733 podman[262516]: 2025-10-08 16:27:20.255296039 +0000 UTC m=+0.074529097 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:27:20 np0005476733 podman[262515]: 2025-10-08 16:27:20.290829907 +0000 UTC m=+0.108682441 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:27:20 np0005476733 systemd-logind[827]: New session 150 of user zuul.
Oct  8 12:27:20 np0005476733 systemd[1]: Started Session 150 of User zuul.
Oct  8 12:27:20 np0005476733 systemd[1]: session-150.scope: Deactivated successfully.
Oct  8 12:27:20 np0005476733 systemd-logind[827]: Session 150 logged out. Waiting for processes to exit.
Oct  8 12:27:20 np0005476733 systemd-logind[827]: Removed session 150.
Oct  8 12:27:21 np0005476733 nova_compute[192580]: 2025-10-08 16:27:21.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:27:22Z|00885|pinctrl|WARN|Dropped 467 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:27:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:27:22Z|00886|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:27:24 np0005476733 nova_compute[192580]: 2025-10-08 16:27:24.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:26 np0005476733 podman[262593]: 2025-10-08 16:27:26.234700419 +0000 UTC m=+0.064701405 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:27:26 np0005476733 podman[262592]: 2025-10-08 16:27:26.239785511 +0000 UTC m=+0.071313085 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  8 12:27:26 np0005476733 podman[262594]: 2025-10-08 16:27:26.254056444 +0000 UTC m=+0.082906233 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Oct  8 12:27:26 np0005476733 nova_compute[192580]: 2025-10-08 16:27:26.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:26.385 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:26.385 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:26.385 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:27:27 np0005476733 systemd-logind[827]: New session 151 of user zuul.
Oct  8 12:27:27 np0005476733 systemd[1]: Started Session 151 of User zuul.
Oct  8 12:27:28 np0005476733 systemd[1]: session-151.scope: Deactivated successfully.
Oct  8 12:27:28 np0005476733 systemd-logind[827]: Session 151 logged out. Waiting for processes to exit.
Oct  8 12:27:28 np0005476733 systemd-logind[827]: Removed session 151.
Oct  8 12:27:29 np0005476733 nova_compute[192580]: 2025-10-08 16:27:29.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:31 np0005476733 nova_compute[192580]: 2025-10-08 16:27:31.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:31 np0005476733 nova_compute[192580]: 2025-10-08 16:27:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:34 np0005476733 nova_compute[192580]: 2025-10-08 16:27:34.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:34 np0005476733 nova_compute[192580]: 2025-10-08 16:27:34.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:35 np0005476733 systemd-logind[827]: New session 152 of user zuul.
Oct  8 12:27:35 np0005476733 systemd[1]: Started Session 152 of User zuul.
Oct  8 12:27:35 np0005476733 podman[262720]: 2025-10-08 16:27:35.362976872 +0000 UTC m=+0.053313113 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:27:35 np0005476733 podman[262721]: 2025-10-08 16:27:35.374945732 +0000 UTC m=+0.060469350 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  8 12:27:35 np0005476733 systemd[1]: session-152.scope: Deactivated successfully.
Oct  8 12:27:35 np0005476733 systemd-logind[827]: Session 152 logged out. Waiting for processes to exit.
Oct  8 12:27:35 np0005476733 systemd-logind[827]: Removed session 152.
Oct  8 12:27:36 np0005476733 nova_compute[192580]: 2025-10-08 16:27:36.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:38.541 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:27:38 np0005476733 nova_compute[192580]: 2025-10-08 16:27:38.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:38.542 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:27:39 np0005476733 nova_compute[192580]: 2025-10-08 16:27:39.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:39 np0005476733 nova_compute[192580]: 2025-10-08 16:27:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:39 np0005476733 nova_compute[192580]: 2025-10-08 16:27:39.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:27:40 np0005476733 nova_compute[192580]: 2025-10-08 16:27:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:41 np0005476733 nova_compute[192580]: 2025-10-08 16:27:41.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:41 np0005476733 nova_compute[192580]: 2025-10-08 16:27:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:44 np0005476733 nova_compute[192580]: 2025-10-08 16:27:44.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:44 np0005476733 nova_compute[192580]: 2025-10-08 16:27:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:44 np0005476733 nova_compute[192580]: 2025-10-08 16:27:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:27:44 np0005476733 nova_compute[192580]: 2025-10-08 16:27:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:27:44 np0005476733 nova_compute[192580]: 2025-10-08 16:27:44.761 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:27:45 np0005476733 podman[262768]: 2025-10-08 16:27:45.229912345 +0000 UTC m=+0.052982223 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true)
Oct  8 12:27:45 np0005476733 nova_compute[192580]: 2025-10-08 16:27:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:46 np0005476733 nova_compute[192580]: 2025-10-08 16:27:46.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:27:46.544 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:27:48 np0005476733 nova_compute[192580]: 2025-10-08 16:27:48.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:48 np0005476733 nova_compute[192580]: 2025-10-08 16:27:48.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:49 np0005476733 nova_compute[192580]: 2025-10-08 16:27:49.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:51 np0005476733 podman[262794]: 2025-10-08 16:27:51.270395623 +0000 UTC m=+0.091478045 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 12:27:51 np0005476733 podman[262793]: 2025-10-08 16:27:51.272578092 +0000 UTC m=+0.095810842 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:27:51 np0005476733 nova_compute[192580]: 2025-10-08 16:27:51.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.770 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.771 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13638MB free_disk=111.31277084350586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.772 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.772 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.892 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:27:52 np0005476733 nova_compute[192580]: 2025-10-08 16:27:52.892 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:27:53 np0005476733 nova_compute[192580]: 2025-10-08 16:27:53.004 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:27:53 np0005476733 nova_compute[192580]: 2025-10-08 16:27:53.038 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:27:53 np0005476733 nova_compute[192580]: 2025-10-08 16:27:53.040 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:27:53 np0005476733 nova_compute[192580]: 2025-10-08 16:27:53.040 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:27:54 np0005476733 nova_compute[192580]: 2025-10-08 16:27:54.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:56 np0005476733 nova_compute[192580]: 2025-10-08 16:27:56.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:27:57 np0005476733 podman[262845]: 2025-10-08 16:27:57.233740403 +0000 UTC m=+0.053503529 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:27:57 np0005476733 podman[262847]: 2025-10-08 16:27:57.24089045 +0000 UTC m=+0.056439612 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc.)
Oct  8 12:27:57 np0005476733 podman[262846]: 2025-10-08 16:27:57.261870626 +0000 UTC m=+0.078891955 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:27:59 np0005476733 nova_compute[192580]: 2025-10-08 16:27:59.040 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:27:59 np0005476733 nova_compute[192580]: 2025-10-08 16:27:59.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:01 np0005476733 nova_compute[192580]: 2025-10-08 16:28:01.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:04 np0005476733 nova_compute[192580]: 2025-10-08 16:28:04.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:06 np0005476733 podman[262917]: 2025-10-08 16:28:06.243858335 +0000 UTC m=+0.060527643 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:28:06 np0005476733 podman[262916]: 2025-10-08 16:28:06.252757068 +0000 UTC m=+0.073229456 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 12:28:06 np0005476733 nova_compute[192580]: 2025-10-08 16:28:06.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:09 np0005476733 nova_compute[192580]: 2025-10-08 16:28:09.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:11 np0005476733 nova_compute[192580]: 2025-10-08 16:28:11.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:14 np0005476733 nova_compute[192580]: 2025-10-08 16:28:14.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:16 np0005476733 podman[262961]: 2025-10-08 16:28:16.245948428 +0000 UTC m=+0.069425375 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 12:28:16 np0005476733 nova_compute[192580]: 2025-10-08 16:28:16.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:19 np0005476733 nova_compute[192580]: 2025-10-08 16:28:19.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:21 np0005476733 nova_compute[192580]: 2025-10-08 16:28:21.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:28:22Z|00887|pinctrl|WARN|Dropped 1043 log messages in last 60 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:28:22 np0005476733 ovn_controller[94857]: 2025-10-08T16:28:22Z|00888|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:28:22 np0005476733 podman[262984]: 2025-10-08 16:28:22.25090832 +0000 UTC m=+0.064843090 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  8 12:28:22 np0005476733 podman[262983]: 2025-10-08 16:28:22.27895548 +0000 UTC m=+0.094942975 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:28:24 np0005476733 nova_compute[192580]: 2025-10-08 16:28:24.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:26.385 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:26.385 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:26.386 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:28:26 np0005476733 nova_compute[192580]: 2025-10-08 16:28:26.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:28 np0005476733 podman[263029]: 2025-10-08 16:28:28.234287654 +0000 UTC m=+0.058757646 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:28:28 np0005476733 podman[263028]: 2025-10-08 16:28:28.238530148 +0000 UTC m=+0.067397130 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:28:28 np0005476733 podman[263030]: 2025-10-08 16:28:28.240536263 +0000 UTC m=+0.059478860 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350)
Oct  8 12:28:29 np0005476733 nova_compute[192580]: 2025-10-08 16:28:29.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:31 np0005476733 nova_compute[192580]: 2025-10-08 16:28:31.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:32 np0005476733 nova_compute[192580]: 2025-10-08 16:28:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:34 np0005476733 nova_compute[192580]: 2025-10-08 16:28:34.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:34 np0005476733 nova_compute[192580]: 2025-10-08 16:28:34.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.071 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.072 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.072 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.072 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.072 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:28:36.072 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:28:36 np0005476733 nova_compute[192580]: 2025-10-08 16:28:36.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:36 np0005476733 podman[263095]: 2025-10-08 16:28:36.433026407 +0000 UTC m=+0.060553063 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 12:28:36 np0005476733 podman[263096]: 2025-10-08 16:28:36.451463783 +0000 UTC m=+0.073502785 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:28:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:37.937 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:28:37 np0005476733 nova_compute[192580]: 2025-10-08 16:28:37.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:37.938 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:28:39 np0005476733 nova_compute[192580]: 2025-10-08 16:28:39.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:39 np0005476733 nova_compute[192580]: 2025-10-08 16:28:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:39 np0005476733 nova_compute[192580]: 2025-10-08 16:28:39.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:28:41 np0005476733 nova_compute[192580]: 2025-10-08 16:28:41.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:41 np0005476733 nova_compute[192580]: 2025-10-08 16:28:41.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:41 np0005476733 nova_compute[192580]: 2025-10-08 16:28:41.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:28:43.942 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:28:44 np0005476733 nova_compute[192580]: 2025-10-08 16:28:44.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:44 np0005476733 systemd-logind[827]: New session 153 of user zuul.
Oct  8 12:28:44 np0005476733 systemd[1]: Started Session 153 of User zuul.
Oct  8 12:28:44 np0005476733 nova_compute[192580]: 2025-10-08 16:28:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:44 np0005476733 nova_compute[192580]: 2025-10-08 16:28:44.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:28:44 np0005476733 nova_compute[192580]: 2025-10-08 16:28:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:28:44 np0005476733 nova_compute[192580]: 2025-10-08 16:28:44.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:28:44 np0005476733 systemd[1]: session-153.scope: Deactivated successfully.
Oct  8 12:28:44 np0005476733 systemd-logind[827]: Session 153 logged out. Waiting for processes to exit.
Oct  8 12:28:44 np0005476733 systemd-logind[827]: Removed session 153.
Oct  8 12:28:46 np0005476733 podman[263169]: 2025-10-08 16:28:46.404337771 +0000 UTC m=+0.093457507 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:28:46 np0005476733 nova_compute[192580]: 2025-10-08 16:28:46.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:47 np0005476733 nova_compute[192580]: 2025-10-08 16:28:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:49 np0005476733 nova_compute[192580]: 2025-10-08 16:28:49.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:51 np0005476733 nova_compute[192580]: 2025-10-08 16:28:51.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.608 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.609 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.785 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.786 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13636MB free_disk=111.31279754638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.786 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.787 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.859 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.860 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.881 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.902 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.904 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:28:52 np0005476733 nova_compute[192580]: 2025-10-08 16:28:52.904 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:28:53 np0005476733 podman[263194]: 2025-10-08 16:28:53.065853766 +0000 UTC m=+0.070194870 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:28:53 np0005476733 podman[263193]: 2025-10-08 16:28:53.168443632 +0000 UTC m=+0.171156715 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:28:54 np0005476733 nova_compute[192580]: 2025-10-08 16:28:54.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:56 np0005476733 nova_compute[192580]: 2025-10-08 16:28:56.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:28:58 np0005476733 nova_compute[192580]: 2025-10-08 16:28:58.896 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:28:59 np0005476733 podman[263238]: 2025-10-08 16:28:59.236003981 +0000 UTC m=+0.057010641 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:28:59 np0005476733 podman[263237]: 2025-10-08 16:28:59.241314879 +0000 UTC m=+0.070873421 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:28:59 np0005476733 podman[263239]: 2025-10-08 16:28:59.249554061 +0000 UTC m=+0.070664175 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, 
vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:28:59 np0005476733 nova_compute[192580]: 2025-10-08 16:28:59.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:00 np0005476733 nova_compute[192580]: 2025-10-08 16:29:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:01 np0005476733 nova_compute[192580]: 2025-10-08 16:29:01.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:04 np0005476733 nova_compute[192580]: 2025-10-08 16:29:04.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:06 np0005476733 nova_compute[192580]: 2025-10-08 16:29:06.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:07 np0005476733 podman[263295]: 2025-10-08 16:29:07.244835445 +0000 UTC m=+0.057323240 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:29:07 np0005476733 podman[263294]: 2025-10-08 16:29:07.281142928 +0000 UTC m=+0.090444161 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:29:09 np0005476733 nova_compute[192580]: 2025-10-08 16:29:09.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.504 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.505 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.524 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.704 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.705 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.716 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.717 2 INFO nova.compute.claims [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.831 2 DEBUG nova.compute.provider_tree [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.851 2 DEBUG nova.scheduler.client.report [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.886 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.887 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.947 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.948 2 DEBUG nova.network.neutron [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:29:11 np0005476733 nova_compute[192580]: 2025-10-08 16:29:11.988 2 INFO nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.010 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.120 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.122 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.123 2 INFO nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Creating image(s)#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.124 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.125 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.126 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.151 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.216 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.217 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.218 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.232 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.319 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.321 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.727 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk 1073741824" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.729 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.730 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.824 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.826 2 DEBUG nova.virt.disk.api [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Checking if we can resize image /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.826 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.900 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.902 2 DEBUG nova.virt.disk.api [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Cannot resize image /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.903 2 DEBUG nova.objects.instance [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lazy-loading 'migration_context' on Instance uuid e4ae9f32-76e5-4371-be11-8e494000ad01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.927 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.928 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Ensure instance console log exists: /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.929 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.930 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:12 np0005476733 nova_compute[192580]: 2025-10-08 16:29:12.931 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:13 np0005476733 nova_compute[192580]: 2025-10-08 16:29:13.140 2 DEBUG nova.policy [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28332a14b074486eb9f93bd4826a0f49', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f35b4c2f25d5494d84303ae8109d46a7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:29:14 np0005476733 nova_compute[192580]: 2025-10-08 16:29:14.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.131 2 DEBUG nova.network.neutron [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Successfully updated port: 61f010d3-753c-4762-a073-4176feb5bc5b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.151 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.151 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquired lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.152 2 DEBUG nova.network.neutron [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.262 2 DEBUG nova.compute.manager [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-changed-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.263 2 DEBUG nova.compute.manager [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Refreshing instance network info cache due to event network-changed-61f010d3-753c-4762-a073-4176feb5bc5b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.263 2 DEBUG oslo_concurrency.lockutils [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:29:15 np0005476733 nova_compute[192580]: 2025-10-08 16:29:15.581 2 DEBUG nova.network.neutron [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:29:16 np0005476733 nova_compute[192580]: 2025-10-08 16:29:16.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:16 np0005476733 podman[263356]: 2025-10-08 16:29:16.551979367 +0000 UTC m=+0.094587564 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.092 2 DEBUG nova.network.neutron [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updating instance_info_cache with network_info: [{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.112 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Releasing lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.112 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Instance network_info: |[{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.112 2 DEBUG oslo_concurrency.lockutils [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.113 2 DEBUG nova.network.neutron [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Refreshing network info cache for port 61f010d3-753c-4762-a073-4176feb5bc5b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.115 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Start _get_guest_xml network_info=[{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.119 2 WARNING nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.126 2 DEBUG nova.virt.libvirt.host [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.126 2 DEBUG nova.virt.libvirt.host [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.133 2 DEBUG nova.virt.libvirt.host [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.134 2 DEBUG nova.virt.libvirt.host [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.135 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.136 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.136 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.136 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.136 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.137 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.137 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.137 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.137 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.137 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.138 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.138 2 DEBUG nova.virt.hardware [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.141 2 DEBUG nova.virt.libvirt.vif [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1675426651',display_name='tempest-internal-dns-test-vm-1675426651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1675426651',id=93,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP+Dvc/NhHc1Yg2xFGMnIVO5Msb/l5qEVK3EjhnAPqkbQTF2CEAcs953QFvv2Xx9qdZIeKfeoHKk1HHOAMR1TWfljGvYGNI0MmdwpGnPv6KeyBMeTtq8vO1vS//IRY+ekg==',key_name='tempest-internal-dns-test-shared-keypair-1161682905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f35b4c2f25d5494d84303ae8109d46a7',ramdisk_id='',reservation_id='r-9p5i94j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSInterruptionsTestOvn-1696835486',owner_user_name='tempest-InternalDNSInterruptionsTestOvn-1696835486-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:29:12Z,user_data=None,user_id='28332a14b074486eb9f93bd4826a0f49',uuid=e4ae9f32-76e5-4371-be11-8e494000ad01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 
4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.141 2 DEBUG nova.network.os_vif_util [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converting VIF {"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.142 2 DEBUG nova.network.os_vif_util [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.142 2 DEBUG nova.objects.instance [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lazy-loading 'pci_devices' on Instance uuid e4ae9f32-76e5-4371-be11-8e494000ad01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.160 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <uuid>e4ae9f32-76e5-4371-be11-8e494000ad01</uuid>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <name>instance-0000005d</name>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:name>tempest-internal-dns-test-vm-1675426651</nova:name>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:29:17</nova:creationTime>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:user uuid="28332a14b074486eb9f93bd4826a0f49">tempest-InternalDNSInterruptionsTestOvn-1696835486-project-member</nova:user>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:project uuid="f35b4c2f25d5494d84303ae8109d46a7">tempest-InternalDNSInterruptionsTestOvn-1696835486</nova:project>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        <nova:port uuid="61f010d3-753c-4762-a073-4176feb5bc5b">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="serial">e4ae9f32-76e5-4371-be11-8e494000ad01</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="uuid">e4ae9f32-76e5-4371-be11-8e494000ad01</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.config"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:47:6d:74"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <target dev="tap61f010d3-75"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/console.log" append="off"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:29:17 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:29:17 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:29:17 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:29:17 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.161 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Preparing to wait for external event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.162 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.163 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.163 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.164 2 DEBUG nova.virt.libvirt.vif [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1675426651',display_name='tempest-internal-dns-test-vm-1675426651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1675426651',id=93,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP+Dvc/NhHc1Yg2xFGMnIVO5Msb/l5qEVK3EjhnAPqkbQTF2CEAcs953QFvv2Xx9qdZIeKfeoHKk1HHOAMR1TWfljGvYGNI0MmdwpGnPv6KeyBMeTtq8vO1vS//IRY+ekg==',key_name='tempest-internal-dns-test-shared-keypair-1161682905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f35b4c2f25d5494d84303ae8109d46a7',ramdisk_id='',reservation_id='r-9p5i94j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InternalDNSInterruptionsTestOvn-1696835486',owner_user_name='tempest-InternalDNSInterruptionsTestOvn-1696835486-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:29:12Z,user_data=None,user_id='28332a14b074486eb9f93bd4826a0f49',uuid=e4ae9f32-76e5-4371-be11-8e494000ad01,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.165 2 DEBUG nova.network.os_vif_util [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converting VIF {"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.166 2 DEBUG nova.network.os_vif_util [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.166 2 DEBUG os_vif [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.168 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.168 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.171 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61f010d3-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.171 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61f010d3-75, col_values=(('external_ids', {'iface-id': '61f010d3-753c-4762-a073-4176feb5bc5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:6d:74', 'vm-uuid': 'e4ae9f32-76e5-4371-be11-8e494000ad01'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:17 np0005476733 NetworkManager[51699]: <info>  [1759940957.1739] manager: (tap61f010d3-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.181 2 INFO os_vif [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75')#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.303 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.303 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.304 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] No VIF found with MAC fa:16:3e:47:6d:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:29:17 np0005476733 nova_compute[192580]: 2025-10-08 16:29:17.304 2 INFO nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Using config drive#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.196 2 INFO nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Creating config drive at /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.config#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.201 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprn4y10te execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.331 2 DEBUG oslo_concurrency.processutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprn4y10te" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.4265] manager: (tap61f010d3-75): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Oct  8 12:29:18 np0005476733 kernel: tap61f010d3-75: entered promiscuous mode
Oct  8 12:29:18 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:18Z|00889|binding|INFO|Claiming lport 61f010d3-753c-4762-a073-4176feb5bc5b for this chassis.
Oct  8 12:29:18 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:18Z|00890|binding|INFO|61f010d3-753c-4762-a073-4176feb5bc5b: Claiming fa:16:3e:47:6d:74 10.100.0.6
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.440 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:6d:74 10.100.0.6'], port_security=['fa:16:3e:47:6d:74 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1278463350', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e4ae9f32-76e5-4371-be11-8e494000ad01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1278463350', 'neutron:project_id': 'f35b4c2f25d5494d84303ae8109d46a7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9bca2ff-ab2b-4001-b4ff-f7550119f7ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1133ed6c-2c2e-4c2a-93b5-6315b87b788a, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=61f010d3-753c-4762-a073-4176feb5bc5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.441 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 61f010d3-753c-4762-a073-4176feb5bc5b in datapath b7d39a51-3870-42d7-b64b-4e681abcb74b bound to our chassis#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.443 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b7d39a51-3870-42d7-b64b-4e681abcb74b#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.457 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c992e30-14ec-41ab-9f07-7227d3948190]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.459 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb7d39a51-31 in ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.463 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb7d39a51-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.464 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5237ce22-a65f-423e-8c0f-43c6b8968321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.465 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bbff3488-cb90-47af-b76e-2630d2084994]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 systemd-udevd[263395]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.484 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[32ac0e17-f207-43e2-a10b-621ba2fb1412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.4905] device (tap61f010d3-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.4914] device (tap61f010d3-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:29:18 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:18Z|00891|binding|INFO|Setting lport 61f010d3-753c-4762-a073-4176feb5bc5b ovn-installed in OVS
Oct  8 12:29:18 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:18Z|00892|binding|INFO|Setting lport 61f010d3-753c-4762-a073-4176feb5bc5b up in Southbound
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 systemd-machined[152624]: New machine qemu-57-instance-0000005d.
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.507 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[82e75a6e-86d0-4371-bdd0-4280c067d8da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 systemd[1]: Started Virtual Machine qemu-57-instance-0000005d.
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.557 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9dda7a05-db85-4012-bea4-fd65f1f0b0bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.5676] manager: (tapb7d39a51-30): new Veth device (/org/freedesktop/NetworkManager/Devices/285)
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.567 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4f2247-06d2-42e0-b58d-501dd4385bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 systemd-udevd[263399]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.611 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9750bd4a-e2bb-4edc-9b30-9bdc9e1a67a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.615 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d109a7d1-2bd3-46c8-91b2-ffa70fc8da82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.6532] device (tapb7d39a51-30): carrier: link connected
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.663 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c38656-2d20-474d-8481-882bbaf7fd64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.688 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a532f0ef-62ca-4a10-9622-8e260bb96a9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7d39a51-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:e0:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786231, 'reachable_time': 43081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263428, 'error': None, 'target': 'ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.713 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fa16dc0e-82ac-4329-af14-e943bc4cf6b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea5:e0dd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 786231, 'tstamp': 786231}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263429, 'error': None, 'target': 'ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.743 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8be7bd-dc39-4e91-91c3-a345e11108be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7d39a51-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a5:e0:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786231, 'reachable_time': 43081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263430, 'error': None, 'target': 'ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.792 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc60ffd-80d3-4ab3-a1d6-4f67970f7058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.891 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e0344893-c2c6-4e07-9329-368bb608feb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.892 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7d39a51-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.893 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.893 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7d39a51-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 NetworkManager[51699]: <info>  [1759940958.9062] manager: (tapb7d39a51-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Oct  8 12:29:18 np0005476733 kernel: tapb7d39a51-30: entered promiscuous mode
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.909 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb7d39a51-30, col_values=(('external_ids', {'iface-id': '8d72f8aa-58c8-42d9-92dd-e02acc77abc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:18Z|00893|binding|INFO|Releasing lport 8d72f8aa-58c8-42d9-92dd-e02acc77abc4 from this chassis (sb_readonly=0)
Oct  8 12:29:18 np0005476733 nova_compute[192580]: 2025-10-08 16:29:18.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.935 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b7d39a51-3870-42d7-b64b-4e681abcb74b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b7d39a51-3870-42d7-b64b-4e681abcb74b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.936 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa85aa6-b70f-4391-a46b-992e948f0d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.937 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-b7d39a51-3870-42d7-b64b-4e681abcb74b
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/b7d39a51-3870-42d7-b64b-4e681abcb74b.pid.haproxy
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID b7d39a51-3870-42d7-b64b-4e681abcb74b
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:29:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:18.938 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'env', 'PROCESS_TAG=haproxy-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b7d39a51-3870-42d7-b64b-4e681abcb74b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.141 2 DEBUG nova.compute.manager [req-99dbf9ed-70d7-40e3-94d5-9cb280801069 req-6fe7a880-4dd6-41ed-aea4-2ae8a0e97aed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.142 2 DEBUG oslo_concurrency.lockutils [req-99dbf9ed-70d7-40e3-94d5-9cb280801069 req-6fe7a880-4dd6-41ed-aea4-2ae8a0e97aed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.142 2 DEBUG oslo_concurrency.lockutils [req-99dbf9ed-70d7-40e3-94d5-9cb280801069 req-6fe7a880-4dd6-41ed-aea4-2ae8a0e97aed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.142 2 DEBUG oslo_concurrency.lockutils [req-99dbf9ed-70d7-40e3-94d5-9cb280801069 req-6fe7a880-4dd6-41ed-aea4-2ae8a0e97aed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.143 2 DEBUG nova.compute.manager [req-99dbf9ed-70d7-40e3-94d5-9cb280801069 req-6fe7a880-4dd6-41ed-aea4-2ae8a0e97aed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Processing event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.335 2 DEBUG nova.network.neutron [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updated VIF entry in instance network info cache for port 61f010d3-753c-4762-a073-4176feb5bc5b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.336 2 DEBUG nova.network.neutron [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updating instance_info_cache with network_info: [{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.352 2 DEBUG oslo_concurrency.lockutils [req-3edbda67-e2af-4ce6-abb1-e615829ec42e req-b480713b-76fe-48f2-8c2e-e7e57c82b52e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.378 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940959.3781915, e4ae9f32-76e5-4371-be11-8e494000ad01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.379 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] VM Started (Lifecycle Event)#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.382 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.386 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.389 2 INFO nova.virt.libvirt.driver [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Instance spawned successfully.#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.390 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.403 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.409 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.413 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.413 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.414 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.414 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.414 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.415 2 DEBUG nova.virt.libvirt.driver [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:29:19 np0005476733 podman[263469]: 2025-10-08 16:29:19.327954041 +0000 UTC m=+0.023044883 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.449 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.449 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940959.3784754, e4ae9f32-76e5-4371-be11-8e494000ad01 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.450 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.492 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.497 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759940959.3852277, e4ae9f32-76e5-4371-be11-8e494000ad01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.497 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.503 2 INFO nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Took 7.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.503 2 DEBUG nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.555 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.561 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.585 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.597 2 INFO nova.compute.manager [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Took 8.02 seconds to build instance.#033[00m
Oct  8 12:29:19 np0005476733 nova_compute[192580]: 2025-10-08 16:29:19.614 2 DEBUG oslo_concurrency.lockutils [None req-c4162310-dfb3-4be1-bb74-8b21025285d8 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:19 np0005476733 podman[263469]: 2025-10-08 16:29:19.715785274 +0000 UTC m=+0.410876116 container create 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:29:19 np0005476733 systemd[1]: Started libpod-conmon-95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7.scope.
Oct  8 12:29:19 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:29:19 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d69f7af40b30084a995210ccd2a0eb851a5742978e10aaa9e18b267d943dff7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:29:20 np0005476733 podman[263469]: 2025-10-08 16:29:20.022409857 +0000 UTC m=+0.717500729 container init 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:29:20 np0005476733 podman[263469]: 2025-10-08 16:29:20.032990213 +0000 UTC m=+0.728081055 container start 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:29:20 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [NOTICE]   (263491) : New worker (263493) forked
Oct  8 12:29:20 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [NOTICE]   (263491) : Loading success.
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.249 2 DEBUG nova.compute.manager [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.249 2 DEBUG oslo_concurrency.lockutils [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.250 2 DEBUG oslo_concurrency.lockutils [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.250 2 DEBUG oslo_concurrency.lockutils [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.250 2 DEBUG nova.compute.manager [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] No waiting events found dispatching network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.250 2 WARNING nova.compute.manager [req-54e286a1-1777-40bf-93ab-5a8241679530 req-1c8c6d21-0b0c-41da-a6ef-46415c9bb787 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received unexpected event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b for instance with vm_state active and task_state None.#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.383 2 INFO nova.compute.manager [None req-be8632db-9840-480d-98e9-a7ac7a9c1d7c 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Get console output#033[00m
Oct  8 12:29:21 np0005476733 nova_compute[192580]: 2025-10-08 16:29:21.391 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:29:22 np0005476733 nova_compute[192580]: 2025-10-08 16:29:22.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:23 np0005476733 podman[263502]: 2025-10-08 16:29:23.295338748 +0000 UTC m=+0.114404362 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:29:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:23Z|00894|pinctrl|WARN|Dropped 529 log messages in last 62 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 12:29:23 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:23Z|00895|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:29:23 np0005476733 podman[263524]: 2025-10-08 16:29:23.408512582 +0000 UTC m=+0.115230880 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 12:29:24 np0005476733 nova_compute[192580]: 2025-10-08 16:29:24.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:26.386 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:26.387 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:26.388 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:26 np0005476733 nova_compute[192580]: 2025-10-08 16:29:26.535 2 INFO nova.compute.manager [None req-b69a1ab6-a214-45de-afdb-704e1104c941 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Get console output#033[00m
Oct  8 12:29:26 np0005476733 nova_compute[192580]: 2025-10-08 16:29:26.540 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:29:27 np0005476733 nova_compute[192580]: 2025-10-08 16:29:27.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:29 np0005476733 nova_compute[192580]: 2025-10-08 16:29:29.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:30 np0005476733 podman[263558]: 2025-10-08 16:29:30.241326073 +0000 UTC m=+0.066335987 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:29:30 np0005476733 podman[263560]: 2025-10-08 16:29:30.276639384 +0000 UTC m=+0.101262185 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:29:30 np0005476733 podman[263559]: 2025-10-08 16:29:30.291272898 +0000 UTC m=+0.106068168 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:29:31 np0005476733 nova_compute[192580]: 2025-10-08 16:29:31.726 2 INFO nova.compute.manager [None req-19e0a762-dce6-444e-bc7d-37714b26183d 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Get console output#033[00m
Oct  8 12:29:31 np0005476733 nova_compute[192580]: 2025-10-08 16:29:31.733 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:29:32 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:32Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:6d:74 10.100.0.6
Oct  8 12:29:32 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:32Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:6d:74 10.100.0.6
Oct  8 12:29:32 np0005476733 nova_compute[192580]: 2025-10-08 16:29:32.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:33 np0005476733 nova_compute[192580]: 2025-10-08 16:29:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:34 np0005476733 nova_compute[192580]: 2025-10-08 16:29:34.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:34 np0005476733 nova_compute[192580]: 2025-10-08 16:29:34.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:36 np0005476733 nova_compute[192580]: 2025-10-08 16:29:36.891 2 INFO nova.compute.manager [None req-db9485e8-e3f2-4060-a4cb-6c3bb94c312d 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Get console output#033[00m
Oct  8 12:29:36 np0005476733 nova_compute[192580]: 2025-10-08 16:29:36.896 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:29:37 np0005476733 nova_compute[192580]: 2025-10-08 16:29:37.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:38 np0005476733 podman[263647]: 2025-10-08 16:29:38.254938431 +0000 UTC m=+0.078915647 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:29:38 np0005476733 podman[263646]: 2025-10-08 16:29:38.25840134 +0000 UTC m=+0.082423207 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:39 np0005476733 NetworkManager[51699]: <info>  [1759940979.2129] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Oct  8 12:29:39 np0005476733 NetworkManager[51699]: <info>  [1759940979.2143] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Oct  8 12:29:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:39Z|00896|binding|INFO|Releasing lport 8d72f8aa-58c8-42d9-92dd-e02acc77abc4 from this chassis (sb_readonly=0)
Oct  8 12:29:39 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:39Z|00897|binding|INFO|Releasing lport 8d72f8aa-58c8-42d9-92dd-e02acc77abc4 from this chassis (sb_readonly=0)
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:39.801 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:29:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:39.802 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.803 2 DEBUG nova.compute.manager [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-changed-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.803 2 DEBUG nova.compute.manager [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Refreshing instance network info cache due to event network-changed-61f010d3-753c-4762-a073-4176feb5bc5b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.803 2 DEBUG oslo_concurrency.lockutils [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.804 2 DEBUG oslo_concurrency.lockutils [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.804 2 DEBUG nova.network.neutron [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Refreshing network info cache for port 61f010d3-753c-4762-a073-4176feb5bc5b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:29:39 np0005476733 nova_compute[192580]: 2025-10-08 16:29:39.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:40 np0005476733 nova_compute[192580]: 2025-10-08 16:29:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:40 np0005476733 nova_compute[192580]: 2025-10-08 16:29:40.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:29:41 np0005476733 nova_compute[192580]: 2025-10-08 16:29:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:42 np0005476733 nova_compute[192580]: 2025-10-08 16:29:42.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:42 np0005476733 nova_compute[192580]: 2025-10-08 16:29:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:43 np0005476733 nova_compute[192580]: 2025-10-08 16:29:43.254 2 DEBUG nova.network.neutron [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updated VIF entry in instance network info cache for port 61f010d3-753c-4762-a073-4176feb5bc5b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:29:43 np0005476733 nova_compute[192580]: 2025-10-08 16:29:43.255 2 DEBUG nova.network.neutron [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updating instance_info_cache with network_info: [{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:29:43 np0005476733 nova_compute[192580]: 2025-10-08 16:29:43.322 2 DEBUG oslo_concurrency.lockutils [req-30783a38-656d-41ec-bd33-6db945d7ef31 req-cb1f9fc8-ea81-4134-823f-f00724b1404b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:29:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:43.804 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:44 np0005476733 nova_compute[192580]: 2025-10-08 16:29:44.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:44 np0005476733 systemd-logind[827]: New session 154 of user zuul.
Oct  8 12:29:44 np0005476733 systemd[1]: Started Session 154 of User zuul.
Oct  8 12:29:44 np0005476733 systemd[1]: session-154.scope: Deactivated successfully.
Oct  8 12:29:44 np0005476733 systemd-logind[827]: Session 154 logged out. Waiting for processes to exit.
Oct  8 12:29:44 np0005476733 systemd-logind[827]: Removed session 154.
Oct  8 12:29:46 np0005476733 nova_compute[192580]: 2025-10-08 16:29:46.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:46 np0005476733 nova_compute[192580]: 2025-10-08 16:29:46.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:29:46 np0005476733 nova_compute[192580]: 2025-10-08 16:29:46.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:29:47 np0005476733 nova_compute[192580]: 2025-10-08 16:29:47.063 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:29:47 np0005476733 nova_compute[192580]: 2025-10-08 16:29:47.064 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:29:47 np0005476733 nova_compute[192580]: 2025-10-08 16:29:47.065 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:29:47 np0005476733 nova_compute[192580]: 2025-10-08 16:29:47.065 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e4ae9f32-76e5-4371-be11-8e494000ad01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:29:47 np0005476733 nova_compute[192580]: 2025-10-08 16:29:47.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:47 np0005476733 podman[263723]: 2025-10-08 16:29:47.244141138 +0000 UTC m=+0.067466563 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:29:48 np0005476733 systemd-logind[827]: New session 155 of user zuul.
Oct  8 12:29:48 np0005476733 systemd[1]: Started Session 155 of User zuul.
Oct  8 12:29:48 np0005476733 systemd[1]: Stopping ovn_controller container...
Oct  8 12:29:48 np0005476733 nova_compute[192580]: 2025-10-08 16:29:48.687 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updating instance_info_cache with network_info: [{"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:29:48 np0005476733 nova_compute[192580]: 2025-10-08 16:29:48.708 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-e4ae9f32-76e5-4371-be11-8e494000ad01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:29:48 np0005476733 nova_compute[192580]: 2025-10-08 16:29:48.708 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:29:48 np0005476733 nova_compute[192580]: 2025-10-08 16:29:48.708 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:48 np0005476733 ovn_controller[94857]: 2025-10-08T16:29:48Z|00898|fatal_signal|WARN|terminating with signal 15 (Terminated)
Oct  8 12:29:48 np0005476733 systemd[1]: libpod-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope: Deactivated successfully.
Oct  8 12:29:48 np0005476733 systemd[1]: libpod-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.scope: Consumed 25.469s CPU time.
Oct  8 12:29:48 np0005476733 podman[263771]: 2025-10-08 16:29:48.742837444 +0000 UTC m=+0.103519577 container died 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 12:29:48 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-12726c3703adeae8.timer: Deactivated successfully.
Oct  8 12:29:48 np0005476733 systemd[1]: Stopped /usr/bin/podman healthcheck run 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.
Oct  8 12:29:48 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-12726c3703adeae8.service: Failed to open /run/systemd/transient/20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-12726c3703adeae8.service: No such file or directory
Oct  8 12:29:49 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-userdata-shm.mount: Deactivated successfully.
Oct  8 12:29:49 np0005476733 systemd[1]: var-lib-containers-storage-overlay-d013de5f33e0ff5678f4c633117b334df7b293f43f478018dc7fa7467349bfa0-merged.mount: Deactivated successfully.
Oct  8 12:29:49 np0005476733 podman[263771]: 2025-10-08 16:29:49.376430588 +0000 UTC m=+0.737112731 container cleanup 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 12:29:49 np0005476733 podman[263771]: ovn_controller
Oct  8 12:29:49 np0005476733 podman[263801]: ovn_controller
Oct  8 12:29:49 np0005476733 systemd[1]: edpm_ovn_controller.service: Deactivated successfully.
Oct  8 12:29:49 np0005476733 systemd[1]: Stopped ovn_controller container.
Oct  8 12:29:49 np0005476733 systemd[1]: Starting ovn_controller container...
Oct  8 12:29:49 np0005476733 nova_compute[192580]: 2025-10-08 16:29:49.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:49 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:29:49 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d013de5f33e0ff5678f4c633117b334df7b293f43f478018dc7fa7467349bfa0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 12:29:49 np0005476733 systemd[1]: Started /usr/bin/podman healthcheck run 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b.
Oct  8 12:29:49 np0005476733 podman[263816]: 2025-10-08 16:29:49.973673189 +0000 UTC m=+0.497429533 container init 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:29:49 np0005476733 ovn_controller[263831]: + sudo -E kolla_set_configs
Oct  8 12:29:50 np0005476733 podman[263816]: 2025-10-08 16:29:50.005379224 +0000 UTC m=+0.529135548 container start 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 12:29:50 np0005476733 systemd[1]: Created slice User Slice of UID 0.
Oct  8 12:29:50 np0005476733 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  8 12:29:50 np0005476733 edpm-start-podman-container[263816]: ovn_controller
Oct  8 12:29:50 np0005476733 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  8 12:29:50 np0005476733 systemd[1]: Starting User Manager for UID 0...
Oct  8 12:29:50 np0005476733 edpm-start-podman-container[263815]: Creating additional drop-in dependency for "ovn_controller" (20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b)
Oct  8 12:29:50 np0005476733 systemd[1]: Reloading.
Oct  8 12:29:50 np0005476733 podman[263838]: 2025-10-08 16:29:50.160706296 +0000 UTC m=+0.143758115 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:29:50 np0005476733 systemd[263851]: Queued start job for default target Main User Target.
Oct  8 12:29:50 np0005476733 systemd[263851]: Created slice User Application Slice.
Oct  8 12:29:50 np0005476733 systemd[263851]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  8 12:29:50 np0005476733 systemd[263851]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 12:29:50 np0005476733 systemd[263851]: Reached target Paths.
Oct  8 12:29:50 np0005476733 systemd[263851]: Reached target Timers.
Oct  8 12:29:50 np0005476733 systemd[263851]: Starting D-Bus User Message Bus Socket...
Oct  8 12:29:50 np0005476733 systemd[263851]: Starting Create User's Volatile Files and Directories...
Oct  8 12:29:50 np0005476733 systemd[263851]: Finished Create User's Volatile Files and Directories.
Oct  8 12:29:50 np0005476733 systemd[263851]: Listening on D-Bus User Message Bus Socket.
Oct  8 12:29:50 np0005476733 systemd[263851]: Reached target Sockets.
Oct  8 12:29:50 np0005476733 systemd[263851]: Reached target Basic System.
Oct  8 12:29:50 np0005476733 systemd[263851]: Reached target Main User Target.
Oct  8 12:29:50 np0005476733 systemd[263851]: Startup finished in 173ms.
Oct  8 12:29:50 np0005476733 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 12:29:50 np0005476733 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 12:29:50 np0005476733 systemd[1]: Started User Manager for UID 0.
Oct  8 12:29:50 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-2dfd302ccaf5df9f.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 12:29:50 np0005476733 systemd[1]: 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b-2dfd302ccaf5df9f.service: Failed with result 'exit-code'.
Oct  8 12:29:50 np0005476733 systemd[1]: Started Session c5 of User root.
Oct  8 12:29:50 np0005476733 systemd[1]: Started ovn_controller container.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: INFO:__main__:Validating config file
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: INFO:__main__:Writing out command to execute
Oct  8 12:29:50 np0005476733 systemd[1]: session-c5.scope: Deactivated successfully.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: ++ cat /run_command
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + ARGS=
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + sudo kolla_copy_cacerts
Oct  8 12:29:50 np0005476733 systemd[1]: Started Session c6 of User root.
Oct  8 12:29:50 np0005476733 systemd-logind[827]: New session 157 of user zuul.
Oct  8 12:29:50 np0005476733 systemd[1]: Started Session 157 of User zuul.
Oct  8 12:29:50 np0005476733 systemd[1]: session-c6.scope: Deactivated successfully.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + [[ ! -n '' ]]
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + . kolla_extend_start
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + umask 0022
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  8 12:29:50 np0005476733 systemd[1]: session-155.scope: Deactivated successfully.
Oct  8 12:29:50 np0005476733 systemd-logind[827]: Session 155 logged out. Waiting for processes to exit.
Oct  8 12:29:50 np0005476733 systemd-logind[827]: Removed session 155.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  8 12:29:50 np0005476733 nova_compute[192580]: 2025-10-08 16:29:50.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00023|main|INFO|OVS feature set changed, force recompute.
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00024|binding|INFO|Releasing lport 8d72f8aa-58c8-42d9-92dd-e02acc77abc4 from this chassis (sb_readonly=0)
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00025|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00026|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00027|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00028|pinctrl|WARN|IGMP Querier enabled with invalid ETH src address
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00029|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:29:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:50Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:29:50 np0005476733 systemd-logind[827]: New session 158 of user zuul.
Oct  8 12:29:50 np0005476733 systemd[1]: Started Session 158 of User zuul.
Oct  8 12:29:50 np0005476733 systemd[1]: session-157.scope: Deactivated successfully.
Oct  8 12:29:50 np0005476733 systemd-logind[827]: Session 157 logged out. Waiting for processes to exit.
Oct  8 12:29:50 np0005476733 systemd-logind[827]: Removed session 157.
Oct  8 12:29:51 np0005476733 systemd[1]: session-158.scope: Deactivated successfully.
Oct  8 12:29:51 np0005476733 systemd-logind[827]: Session 158 logged out. Waiting for processes to exit.
Oct  8 12:29:51 np0005476733 systemd-logind[827]: Removed session 158.
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.024 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.025 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.025 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.026 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.026 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.027 2 INFO nova.compute.manager [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Terminating instance#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.028 2 DEBUG nova.compute.manager [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:29:52 np0005476733 kernel: tap61f010d3-75 (unregistering): left promiscuous mode
Oct  8 12:29:52 np0005476733 NetworkManager[51699]: <info>  [1759940992.0557] device (tap61f010d3-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:29:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:52Z|00030|binding|INFO|Releasing lport 61f010d3-753c-4762-a073-4176feb5bc5b from this chassis (sb_readonly=0)
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:52Z|00031|if_status|WARN|Trying to release unknown interface 61f010d3-753c-4762-a073-4176feb5bc5b
Oct  8 12:29:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:52Z|00032|binding|INFO|Setting lport 61f010d3-753c-4762-a073-4176feb5bc5b down in Southbound
Oct  8 12:29:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:29:52Z|00033|binding|INFO|Removing iface tap61f010d3-75 ovn-installed in OVS
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Oct  8 12:29:52 np0005476733 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000005d.scope: Consumed 14.194s CPU time.
Oct  8 12:29:52 np0005476733 systemd-machined[152624]: Machine qemu-57-instance-0000005d terminated.
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.152 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:6d:74 10.100.0.6'], port_security=['fa:16:3e:47:6d:74 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-internal-dns-test-port-1278463350', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e4ae9f32-76e5-4371-be11-8e494000ad01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-internal-dns-test-port-1278463350', 'neutron:project_id': 'f35b4c2f25d5494d84303ae8109d46a7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9bca2ff-ab2b-4001-b4ff-f7550119f7ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.213'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1133ed6c-2c2e-4c2a-93b5-6315b87b788a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=61f010d3-753c-4762-a073-4176feb5bc5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.153 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 61f010d3-753c-4762-a073-4176feb5bc5b in datapath b7d39a51-3870-42d7-b64b-4e681abcb74b unbound from our chassis#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.154 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7d39a51-3870-42d7-b64b-4e681abcb74b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.155 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[097767a1-7f82-4897-a6ca-465ca316314b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.156 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b namespace which is not needed anymore#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [NOTICE]   (263491) : haproxy version is 2.8.14-c23fe91
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [NOTICE]   (263491) : path to executable is /usr/sbin/haproxy
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [WARNING]  (263491) : Exiting Master process...
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [WARNING]  (263491) : Exiting Master process...
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [ALERT]    (263491) : Current worker (263493) exited with code 143 (Terminated)
Oct  8 12:29:52 np0005476733 neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b[263487]: [WARNING]  (263491) : All workers exited. Exiting... (0)
Oct  8 12:29:52 np0005476733 systemd[1]: libpod-95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7.scope: Deactivated successfully.
Oct  8 12:29:52 np0005476733 conmon[263487]: conmon 95c69c3ca2fbf57113c6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7.scope/container/memory.events
Oct  8 12:29:52 np0005476733 podman[264022]: 2025-10-08 16:29:52.301691202 +0000 UTC m=+0.048260423 container died 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.306 2 INFO nova.virt.libvirt.driver [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Instance destroyed successfully.#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.307 2 DEBUG nova.objects.instance [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lazy-loading 'resources' on Instance uuid e4ae9f32-76e5-4371-be11-8e494000ad01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:29:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7-userdata-shm.mount: Deactivated successfully.
Oct  8 12:29:52 np0005476733 systemd[1]: var-lib-containers-storage-overlay-8d69f7af40b30084a995210ccd2a0eb851a5742978e10aaa9e18b267d943dff7-merged.mount: Deactivated successfully.
Oct  8 12:29:52 np0005476733 podman[264022]: 2025-10-08 16:29:52.357967759 +0000 UTC m=+0.104536980 container cleanup 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 12:29:52 np0005476733 systemd[1]: libpod-conmon-95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7.scope: Deactivated successfully.
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.409 2 DEBUG nova.virt.libvirt.vif [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-internal-dns-test-vm-1675426651',display_name='tempest-internal-dns-test-vm-1675426651',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-internal-dns-test-vm-1675426651',id=93,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP+Dvc/NhHc1Yg2xFGMnIVO5Msb/l5qEVK3EjhnAPqkbQTF2CEAcs953QFvv2Xx9qdZIeKfeoHKk1HHOAMR1TWfljGvYGNI0MmdwpGnPv6KeyBMeTtq8vO1vS//IRY+ekg==',key_name='tempest-internal-dns-test-shared-keypair-1161682905',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:29:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f35b4c2f25d5494d84303ae8109d46a7',ramdisk_id='',reservation_id='r-9p5i94j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InternalDNSInterruptionsTestOvn-1696835486',owner_user_name='tempest-InternalDNSInterruptionsTestOvn-1696835486-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:29:19Z,user_data=None,user_id='28332a14b074486eb9f93bd4826a0f49',uuid=e4ae9f32-76e5-4371-be11-8e494000ad01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.409 2 DEBUG nova.network.os_vif_util [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converting VIF {"id": "61f010d3-753c-4762-a073-4176feb5bc5b", "address": "fa:16:3e:47:6d:74", "network": {"id": "b7d39a51-3870-42d7-b64b-4e681abcb74b", "bridge": "br-int", "label": "tempest-internal-dns-test-shared-network-1749202466", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f35b4c2f25d5494d84303ae8109d46a7", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f010d3-75", "ovs_interfaceid": "61f010d3-753c-4762-a073-4176feb5bc5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.410 2 DEBUG nova.network.os_vif_util [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.410 2 DEBUG os_vif [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.412 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61f010d3-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.417 2 INFO os_vif [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:6d:74,bridge_name='br-int',has_traffic_filtering=True,id=61f010d3-753c-4762-a073-4176feb5bc5b,network=Network(b7d39a51-3870-42d7-b64b-4e681abcb74b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap61f010d3-75')#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.417 2 INFO nova.virt.libvirt.driver [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Deleting instance files /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01_del#033[00m
Oct  8 12:29:52 np0005476733 podman[264065]: 2025-10-08 16:29:52.418261032 +0000 UTC m=+0.042068485 container remove 95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.418 2 INFO nova.virt.libvirt.driver [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Deletion of /var/lib/nova/instances/e4ae9f32-76e5-4371-be11-8e494000ad01_del complete#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.423 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9818116c-2de4-44ea-b26f-bbcd63c23886]: (4, ('Wed Oct  8 04:29:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b (95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7)\n95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7\nWed Oct  8 04:29:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b (95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7)\n95c69c3ca2fbf57113c6c4ab9edeb20f9042aa1e253c915b8a0d8dadd13192e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.425 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0f029f99-f54f-4830-94de-fbdc1e479e5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.426 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7d39a51-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 kernel: tapb7d39a51-30: left promiscuous mode
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a83839cd-4648-42b5-882c-ba0d1cbb493b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.465 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1aaf16d6-82f9-4445-b5dc-3697d8345149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.467 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb7b8c8-e1b5-4188-bf83-e7921c705f90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.490 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed153f1-37a3-4112-bde7-ef1ac0148ada]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786221, 'reachable_time': 16452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264080, 'error': None, 'target': 'ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 systemd[1]: run-netns-ovnmeta\x2db7d39a51\x2d3870\x2d42d7\x2db64b\x2d4e681abcb74b.mount: Deactivated successfully.
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.494 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b7d39a51-3870-42d7-b64b-4e681abcb74b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:29:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:29:52.495 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[72cd6521-5a34-4915-a516-8835bafb23f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.698 2 INFO nova.compute.manager [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Took 0.67 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.699 2 DEBUG oslo.service.loopingcall [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.700 2 DEBUG nova.compute.manager [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:29:52 np0005476733 nova_compute[192580]: 2025-10-08 16:29:52.700 2 DEBUG nova.network.neutron [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.691 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.691 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.692 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.692 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:29:53 np0005476733 podman[264082]: 2025-10-08 16:29:53.822598404 +0000 UTC m=+0.069957841 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.881 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.882 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13634MB free_disk=111.31269454956055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.882 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:53 np0005476733 nova_compute[192580]: 2025-10-08 16:29:53.882 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.135 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance e4ae9f32-76e5-4371-be11-8e494000ad01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.136 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.136 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=640MB phys_disk=119GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.203 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.262 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.301 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.301 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:54 np0005476733 nova_compute[192580]: 2025-10-08 16:29:54.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.120 2 DEBUG nova.compute.manager [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-vif-unplugged-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.121 2 DEBUG oslo_concurrency.lockutils [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.121 2 DEBUG oslo_concurrency.lockutils [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.121 2 DEBUG oslo_concurrency.lockutils [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.122 2 DEBUG nova.compute.manager [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] No waiting events found dispatching network-vif-unplugged-61f010d3-753c-4762-a073-4176feb5bc5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:29:55 np0005476733 nova_compute[192580]: 2025-10-08 16:29:55.122 2 DEBUG nova.compute.manager [req-9155f9ac-7dcb-4b13-b788-346b0afde199 req-0782a46b-5892-45b5-a35b-a45c8cf33172 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-vif-unplugged-61f010d3-753c-4762-a073-4176feb5bc5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.218 2 DEBUG nova.network.neutron [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.284 2 DEBUG nova.compute.manager [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.285 2 DEBUG oslo_concurrency.lockutils [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.285 2 DEBUG oslo_concurrency.lockutils [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.285 2 DEBUG oslo_concurrency.lockutils [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.286 2 DEBUG nova.compute.manager [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] No waiting events found dispatching network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.286 2 WARNING nova.compute.manager [req-5a2e7cc4-25ea-4b19-9a27-5c910b90b94d req-8323f8ae-6836-48bd-a730-ed14fcc5f4ed 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Received unexpected event network-vif-plugged-61f010d3-753c-4762-a073-4176feb5bc5b for instance with vm_state active and task_state deleting.#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.378 2 INFO nova.compute.manager [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Took 4.68 seconds to deallocate network for instance.#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.652 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.653 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.694 2 DEBUG nova.compute.provider_tree [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.864 2 DEBUG nova.scheduler.client.report [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:29:57 np0005476733 nova_compute[192580]: 2025-10-08 16:29:57.928 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:58 np0005476733 nova_compute[192580]: 2025-10-08 16:29:58.002 2 INFO nova.scheduler.client.report [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Deleted allocations for instance e4ae9f32-76e5-4371-be11-8e494000ad01#033[00m
Oct  8 12:29:58 np0005476733 nova_compute[192580]: 2025-10-08 16:29:58.199 2 DEBUG oslo_concurrency.lockutils [None req-4800142a-4145-4974-895a-b7120a544ddc 28332a14b074486eb9f93bd4826a0f49 f35b4c2f25d5494d84303ae8109d46a7 - - default default] Lock "e4ae9f32-76e5-4371-be11-8e494000ad01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:29:58 np0005476733 nova_compute[192580]: 2025-10-08 16:29:58.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:29:59 np0005476733 nova_compute[192580]: 2025-10-08 16:29:59.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:00 np0005476733 systemd[1]: Stopping User Manager for UID 0...
Oct  8 12:30:00 np0005476733 systemd[263851]: Activating special unit Exit the Session...
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped target Main User Target.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped target Basic System.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped target Paths.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped target Sockets.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped target Timers.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 12:30:00 np0005476733 systemd[263851]: Closed D-Bus User Message Bus Socket.
Oct  8 12:30:00 np0005476733 systemd[263851]: Stopped Create User's Volatile Files and Directories.
Oct  8 12:30:00 np0005476733 systemd[263851]: Removed slice User Application Slice.
Oct  8 12:30:00 np0005476733 systemd[263851]: Reached target Shutdown.
Oct  8 12:30:00 np0005476733 systemd[263851]: Finished Exit the Session.
Oct  8 12:30:00 np0005476733 systemd[263851]: Reached target Exit the Session.
Oct  8 12:30:00 np0005476733 systemd[1]: user@0.service: Deactivated successfully.
Oct  8 12:30:00 np0005476733 systemd[1]: Stopped User Manager for UID 0.
Oct  8 12:30:00 np0005476733 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  8 12:30:00 np0005476733 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  8 12:30:00 np0005476733 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  8 12:30:00 np0005476733 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  8 12:30:00 np0005476733 systemd[1]: Removed slice User Slice of UID 0.
Oct  8 12:30:00 np0005476733 podman[264103]: 2025-10-08 16:30:00.74566972 +0000 UTC m=+0.067114592 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:30:00 np0005476733 podman[264102]: 2025-10-08 16:30:00.751259748 +0000 UTC m=+0.077343226 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:30:00 np0005476733 podman[264104]: 2025-10-08 16:30:00.75794808 +0000 UTC m=+0.077511802 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Oct  8 12:30:02 np0005476733 nova_compute[192580]: 2025-10-08 16:30:02.301 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:02 np0005476733 nova_compute[192580]: 2025-10-08 16:30:02.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:04 np0005476733 ovn_controller[263831]: 2025-10-08T16:30:04Z|00034|memory|INFO|18944 kB peak resident set size after 13.5 seconds
Oct  8 12:30:04 np0005476733 ovn_controller[263831]: 2025-10-08T16:30:04Z|00035|memory|INFO|idl-cells-OVN_Southbound:3922 idl-cells-Open_vSwitch:642 lflow-cache-entries-cache-expr:65 lflow-cache-entries-cache-matches:197 lflow-cache-size-KB:278 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:144 ofctrl_installed_flow_usage-KB:104 ofctrl_sb_flow_ref_usage-KB:63
Oct  8 12:30:04 np0005476733 nova_compute[192580]: 2025-10-08 16:30:04.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:05 np0005476733 nova_compute[192580]: 2025-10-08 16:30:05.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:07 np0005476733 nova_compute[192580]: 2025-10-08 16:30:07.305 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759940992.3036969, e4ae9f32-76e5-4371-be11-8e494000ad01 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:30:07 np0005476733 nova_compute[192580]: 2025-10-08 16:30:07.305 2 INFO nova.compute.manager [-] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:30:07 np0005476733 nova_compute[192580]: 2025-10-08 16:30:07.330 2 DEBUG nova.compute.manager [None req-e68e1f1b-cc92-462e-a037-4c04b7acfc9a - - - - - -] [instance: e4ae9f32-76e5-4371-be11-8e494000ad01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:30:07 np0005476733 nova_compute[192580]: 2025-10-08 16:30:07.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:09 np0005476733 podman[264167]: 2025-10-08 16:30:09.225539769 +0000 UTC m=+0.053440918 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:30:09 np0005476733 podman[264168]: 2025-10-08 16:30:09.230785325 +0000 UTC m=+0.058320042 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:30:09 np0005476733 nova_compute[192580]: 2025-10-08 16:30:09.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:12 np0005476733 nova_compute[192580]: 2025-10-08 16:30:12.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:14 np0005476733 nova_compute[192580]: 2025-10-08 16:30:14.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:14 np0005476733 nova_compute[192580]: 2025-10-08 16:30:14.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:17 np0005476733 nova_compute[192580]: 2025-10-08 16:30:17.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:17 np0005476733 podman[264217]: 2025-10-08 16:30:17.628161935 +0000 UTC m=+0.056092972 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  8 12:30:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:19.060 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:30:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:19.061 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:30:19 np0005476733 nova_compute[192580]: 2025-10-08 16:30:19.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:19 np0005476733 nova_compute[192580]: 2025-10-08 16:30:19.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:20 np0005476733 podman[264238]: 2025-10-08 16:30:20.975399885 +0000 UTC m=+0.094119569 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:30:22 np0005476733 nova_compute[192580]: 2025-10-08 16:30:22.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:24 np0005476733 podman[264264]: 2025-10-08 16:30:24.221182474 +0000 UTC m=+0.054012166 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:30:24 np0005476733 nova_compute[192580]: 2025-10-08 16:30:24.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:26.388 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:26.388 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:26.388 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:30:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:30:27.064 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:30:27 np0005476733 nova_compute[192580]: 2025-10-08 16:30:27.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:29 np0005476733 systemd-logind[827]: New session 159 of user zuul.
Oct  8 12:30:29 np0005476733 systemd[1]: Started Session 159 of User zuul.
Oct  8 12:30:29 np0005476733 nova_compute[192580]: 2025-10-08 16:30:29.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:29 np0005476733 systemd[1]: session-159.scope: Deactivated successfully.
Oct  8 12:30:29 np0005476733 systemd-logind[827]: Session 159 logged out. Waiting for processes to exit.
Oct  8 12:30:29 np0005476733 systemd-logind[827]: Removed session 159.
Oct  8 12:30:31 np0005476733 podman[264310]: 2025-10-08 16:30:31.241855929 +0000 UTC m=+0.061909057 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Oct  8 12:30:31 np0005476733 podman[264311]: 2025-10-08 16:30:31.242113987 +0000 UTC m=+0.055683448 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:30:31 np0005476733 podman[264312]: 2025-10-08 16:30:31.276178469 +0000 UTC m=+0.089248865 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 12:30:32 np0005476733 nova_compute[192580]: 2025-10-08 16:30:32.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:33 np0005476733 nova_compute[192580]: 2025-10-08 16:30:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:34 np0005476733 nova_compute[192580]: 2025-10-08 16:30:34.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.068 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:30:36.070 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:30:36 np0005476733 nova_compute[192580]: 2025-10-08 16:30:36.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:36 np0005476733 ovn_controller[263831]: 2025-10-08T16:30:36Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 12:30:37 np0005476733 nova_compute[192580]: 2025-10-08 16:30:37.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:39 np0005476733 nova_compute[192580]: 2025-10-08 16:30:39.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:40 np0005476733 podman[264373]: 2025-10-08 16:30:40.234026302 +0000 UTC m=+0.064079515 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:30:40 np0005476733 podman[264372]: 2025-10-08 16:30:40.240138286 +0000 UTC m=+0.066123381 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  8 12:30:41 np0005476733 nova_compute[192580]: 2025-10-08 16:30:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:41 np0005476733 nova_compute[192580]: 2025-10-08 16:30:41.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:30:42 np0005476733 nova_compute[192580]: 2025-10-08 16:30:42.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:43 np0005476733 nova_compute[192580]: 2025-10-08 16:30:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:43 np0005476733 nova_compute[192580]: 2025-10-08 16:30:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:44 np0005476733 nova_compute[192580]: 2025-10-08 16:30:44.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:46 np0005476733 nova_compute[192580]: 2025-10-08 16:30:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:46 np0005476733 nova_compute[192580]: 2025-10-08 16:30:46.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:30:46 np0005476733 nova_compute[192580]: 2025-10-08 16:30:46.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:30:46 np0005476733 nova_compute[192580]: 2025-10-08 16:30:46.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:30:47 np0005476733 nova_compute[192580]: 2025-10-08 16:30:47.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:48 np0005476733 podman[264413]: 2025-10-08 16:30:48.242748474 +0000 UTC m=+0.069376994 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:30:48 np0005476733 nova_compute[192580]: 2025-10-08 16:30:48.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:49 np0005476733 nova_compute[192580]: 2025-10-08 16:30:49.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:30:51Z|00037|pinctrl|WARN|Dropped 587 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 12:30:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:30:51Z|00038|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:30:51 np0005476733 podman[264434]: 2025-10-08 16:30:51.265213593 +0000 UTC m=+0.090696029 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:30:52 np0005476733 nova_compute[192580]: 2025-10-08 16:30:52.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:54 np0005476733 nova_compute[192580]: 2025-10-08 16:30:54.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:55 np0005476733 podman[264461]: 2025-10-08 16:30:55.266002491 +0000 UTC m=+0.080011271 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.644 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.645 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.645 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.645 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.791 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.792 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13654MB free_disk=111.31269454956055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.792 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.792 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.849 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.849 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.875 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.889 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.925 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:30:55 np0005476733 nova_compute[192580]: 2025-10-08 16:30:55.925 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:30:57 np0005476733 nova_compute[192580]: 2025-10-08 16:30:57.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:30:59 np0005476733 nova_compute[192580]: 2025-10-08 16:30:59.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:00.264 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:31:00 np0005476733 nova_compute[192580]: 2025-10-08 16:31:00.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:00 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:00.265 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:31:01 np0005476733 nova_compute[192580]: 2025-10-08 16:31:01.924 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:02 np0005476733 podman[264484]: 2025-10-08 16:31:02.219252576 +0000 UTC m=+0.047728576 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:31:02 np0005476733 podman[264483]: 2025-10-08 16:31:02.226184605 +0000 UTC m=+0.057123503 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 12:31:02 np0005476733 podman[264485]: 2025-10-08 16:31:02.235039977 +0000 UTC m=+0.058808288 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct  8 12:31:02 np0005476733 nova_compute[192580]: 2025-10-08 16:31:02.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:03 np0005476733 nova_compute[192580]: 2025-10-08 16:31:03.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:04.267 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:04 np0005476733 nova_compute[192580]: 2025-10-08 16:31:04.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:05 np0005476733 systemd-logind[827]: New session 160 of user zuul.
Oct  8 12:31:05 np0005476733 systemd[1]: Started Session 160 of User zuul.
Oct  8 12:31:05 np0005476733 systemd[1]: session-160.scope: Deactivated successfully.
Oct  8 12:31:05 np0005476733 systemd-logind[827]: Session 160 logged out. Waiting for processes to exit.
Oct  8 12:31:05 np0005476733 systemd-logind[827]: Removed session 160.
Oct  8 12:31:07 np0005476733 nova_compute[192580]: 2025-10-08 16:31:07.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:09 np0005476733 nova_compute[192580]: 2025-10-08 16:31:09.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:11 np0005476733 podman[264573]: 2025-10-08 16:31:11.232875078 +0000 UTC m=+0.055207903 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:31:11 np0005476733 podman[264572]: 2025-10-08 16:31:11.25590301 +0000 UTC m=+0.085011651 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:31:12 np0005476733 nova_compute[192580]: 2025-10-08 16:31:12.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:14 np0005476733 nova_compute[192580]: 2025-10-08 16:31:14.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:15 np0005476733 nova_compute[192580]: 2025-10-08 16:31:15.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:15 np0005476733 nova_compute[192580]: 2025-10-08 16:31:15.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:31:17 np0005476733 nova_compute[192580]: 2025-10-08 16:31:17.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:19 np0005476733 podman[264613]: 2025-10-08 16:31:19.238054268 +0000 UTC m=+0.058860430 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:31:19 np0005476733 nova_compute[192580]: 2025-10-08 16:31:19.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:22 np0005476733 podman[264633]: 2025-10-08 16:31:22.262471679 +0000 UTC m=+0.096497434 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:31:22 np0005476733 nova_compute[192580]: 2025-10-08 16:31:22.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:24 np0005476733 nova_compute[192580]: 2025-10-08 16:31:24.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.609 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.609 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.610 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.611 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.612 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.640 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.641 2 WARNING nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.641 2 WARNING nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.641 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.642 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.642 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.642 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.642 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Oct  8 12:31:25 np0005476733 nova_compute[192580]: 2025-10-08 16:31:25.643 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  8 12:31:26 np0005476733 podman[264660]: 2025-10-08 16:31:26.225358693 +0000 UTC m=+0.053230161 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:26.389 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:26.390 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:26.390 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:26 np0005476733 nova_compute[192580]: 2025-10-08 16:31:26.924 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:26 np0005476733 nova_compute[192580]: 2025-10-08 16:31:26.924 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:26 np0005476733 nova_compute[192580]: 2025-10-08 16:31:26.969 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.069 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.070 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.077 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.077 2 INFO nova.compute.claims [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.215 2 DEBUG nova.compute.provider_tree [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.234 2 DEBUG nova.scheduler.client.report [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.264 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.265 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.320 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.320 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.342 2 INFO nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.362 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.468 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.470 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.470 2 INFO nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Creating image(s)#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.471 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.472 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.473 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.485 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.529 2 DEBUG nova.policy [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.571 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.572 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.573 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.584 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.644 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:27 np0005476733 nova_compute[192580]: 2025-10-08 16:31:27.645 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.138 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk 10737418240" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.139 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.139 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.228 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.229 2 DEBUG nova.objects.instance [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'migration_context' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.246 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.246 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Ensure instance console log exists: /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.247 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.247 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:28 np0005476733 nova_compute[192580]: 2025-10-08 16:31:28.248 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:29 np0005476733 nova_compute[192580]: 2025-10-08 16:31:29.344 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Successfully created port: ae07e4bd-18d5-41c3-ade8-9cad83e6d345 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:31:29 np0005476733 nova_compute[192580]: 2025-10-08 16:31:29.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:31 np0005476733 nova_compute[192580]: 2025-10-08 16:31:31.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.169 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Successfully updated port: ae07e4bd-18d5-41c3-ade8-9cad83e6d345 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.187 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.187 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquired lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.187 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.330 2 DEBUG nova.compute.manager [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-changed-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.331 2 DEBUG nova.compute.manager [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Refreshing instance network info cache due to event network-changed-ae07e4bd-18d5-41c3-ade8-9cad83e6d345. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.331 2 DEBUG oslo_concurrency.lockutils [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.369 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:31:32 np0005476733 nova_compute[192580]: 2025-10-08 16:31:32.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:33 np0005476733 podman[264693]: 2025-10-08 16:31:33.224929859 +0000 UTC m=+0.052195028 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:31:33 np0005476733 podman[264694]: 2025-10-08 16:31:33.238897523 +0000 UTC m=+0.058885501 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible)
Oct  8 12:31:33 np0005476733 podman[264692]: 2025-10-08 16:31:33.253796235 +0000 UTC m=+0.084168353 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.601 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.675 2 DEBUG nova.network.neutron [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.700 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Releasing lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.700 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Instance network_info: |[{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.700 2 DEBUG oslo_concurrency.lockutils [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.701 2 DEBUG nova.network.neutron [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Refreshing network info cache for port ae07e4bd-18d5-41c3-ade8-9cad83e6d345 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.703 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Start _get_guest_xml network_info=[{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.707 2 WARNING nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.712 2 DEBUG nova.virt.libvirt.host [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.712 2 DEBUG nova.virt.libvirt.host [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.720 2 DEBUG nova.virt.libvirt.host [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.720 2 DEBUG nova.virt.libvirt.host [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.721 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.721 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.721 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.722 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.722 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.722 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.722 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.722 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.723 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.723 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.723 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.723 2 DEBUG nova.virt.hardware [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.726 2 DEBUG nova.virt.libvirt.vif [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2100618665',display_name='tempest-server-test-2100618665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2100618665',id=94,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-1cxxa52z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:31:27Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=3dbbed4f-f648-4fb9-ad0e-9c718fbbd525,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.726 2 DEBUG nova.network.os_vif_util [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.727 2 DEBUG nova.network.os_vif_util [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.728 2 DEBUG nova.objects.instance [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'pci_devices' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.743 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <uuid>3dbbed4f-f648-4fb9-ad0e-9c718fbbd525</uuid>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <name>instance-0000005e</name>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-2100618665</nova:name>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:31:33</nova:creationTime>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:user uuid="2a670570cd6d44aeb1a34bf3fac55635">tempest-GatewayMtuTestIcmp-9904054-project-member</nova:user>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:project uuid="3e84a9b599804a5f95722444484bbcee">tempest-GatewayMtuTestIcmp-9904054</nova:project>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        <nova:port uuid="ae07e4bd-18d5-41c3-ade8-9cad83e6d345">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.229" ipVersion="4"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="serial">3dbbed4f-f648-4fb9-ad0e-9c718fbbd525</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="uuid">3dbbed4f-f648-4fb9-ad0e-9c718fbbd525</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.config"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:c8:e6:70"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <mtu size="1312"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <target dev="tapae07e4bd-18"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/console.log" append="off"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:31:33 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:31:33 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:31:33 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:31:33 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.744 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Preparing to wait for external event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.744 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.744 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.745 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.745 2 DEBUG nova.virt.libvirt.vif [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-2100618665',display_name='tempest-server-test-2100618665',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2100618665',id=94,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-1cxxa52z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:31:27Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=3dbbed4f-f648-4fb9-ad0e-9c718fbbd525,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.745 2 DEBUG nova.network.os_vif_util [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.746 2 DEBUG nova.network.os_vif_util [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.746 2 DEBUG os_vif [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.747 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae07e4bd-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae07e4bd-18, col_values=(('external_ids', {'iface-id': 'ae07e4bd-18d5-41c3-ade8-9cad83e6d345', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:e6:70', 'vm-uuid': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:33 np0005476733 NetworkManager[51699]: <info>  [1759941093.7529] manager: (tapae07e4bd-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.763 2 INFO os_vif [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18')#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.872 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.872 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.873 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No VIF found with MAC fa:16:3e:c8:e6:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:31:33 np0005476733 nova_compute[192580]: 2025-10-08 16:31:33.873 2 INFO nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Using config drive#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.397 2 INFO nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Creating config drive at /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.config#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.402 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpngj6iw6c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.530 2 DEBUG oslo_concurrency.processutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpngj6iw6c" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 kernel: tapae07e4bd-18: entered promiscuous mode
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.5934] manager: (tapae07e4bd-18): new Tun device (/org/freedesktop/NetworkManager/Devices/290)
Oct  8 12:31:34 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:34Z|00039|binding|INFO|Claiming lport ae07e4bd-18d5-41c3-ade8-9cad83e6d345 for this chassis.
Oct  8 12:31:34 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:34Z|00040|binding|INFO|ae07e4bd-18d5-41c3-ade8-9cad83e6d345: Claiming fa:16:3e:c8:e6:70 192.168.122.229
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.6091] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.6097] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.611 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:e6:70 192.168.122.229'], port_security=['fa:16:3e:c8:e6:70 192.168.122.229'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.229/24', 'neutron:device_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e84a9b599804a5f95722444484bbcee', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3968eba-c424-466b-8079-cc800da29d95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=ae07e4bd-18d5-41c3-ade8-9cad83e6d345) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.612 103739 INFO neutron.agent.ovn.metadata.agent [-] Port ae07e4bd-18d5-41c3-ade8-9cad83e6d345 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.613 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:31:34 np0005476733 systemd-udevd[264773]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.628 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5604f469-9578-462e-a07f-b20c67dcf6b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.628 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.631 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.631 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3c61f3-6aa9-4cff-9dc7-1b1a75e3114b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.632 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[f2046ced-5306-4120-bf07-a1bb8ae853a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.6345] device (tapae07e4bd-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.6352] device (tapae07e4bd-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:31:34 np0005476733 systemd-machined[152624]: New machine qemu-58-instance-0000005e.
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.644 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0747dd-70ee-4b93-9f7f-7545af2aa2e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.675 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[113dec7d-3567-4cee-b0b7-bdf6ab7970d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 systemd[1]: Started Virtual Machine qemu-58-instance-0000005e.
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:34Z|00041|binding|INFO|Setting lport ae07e4bd-18d5-41c3-ade8-9cad83e6d345 ovn-installed in OVS
Oct  8 12:31:34 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:34Z|00042|binding|INFO|Setting lport ae07e4bd-18d5-41c3-ade8-9cad83e6d345 up in Southbound
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.710 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4115fc-7ca3-461a-9728-61ebcb25e67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.714 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[35fe66fa-8d17-4ec5-898a-81d4840676f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.7160] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/293)
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.747 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f956dc-90a4-4a10-9c01-99e83b358051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.750 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bd52bcb2-9af5-439a-a032-18f92d6687d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.7732] device (tap81c575b5-a0): carrier: link connected
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.780 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ecd7fc-29c9-471e-a06c-467ad392cf70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.803 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[662da7a7-7f64-47e3-b06b-7f43740a3b13]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799843, 'reachable_time': 26739, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264806, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.823 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3aae8d83-d19b-47f3-9853-7638f90dbb8b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 799843, 'tstamp': 799843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 264807, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.839 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6fdd5799-e8b0-4ae7-aec0-87254e0923ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799843, 'reachable_time': 26739, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 264808, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.876 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94170539-2353-457c-9f7d-c3c45ecba923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.944 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[72eebf08-a8c8-4281-960b-fe4747ba3b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.946 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.946 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.946 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 NetworkManager[51699]: <info>  [1759941094.9816] manager: (tap81c575b5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Oct  8 12:31:34 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 12:31:34 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:34.983 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:31:34 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:34Z|00043|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:34 np0005476733 nova_compute[192580]: 2025-10-08 16:31:34.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:35.000 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:35.001 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d36694-2947-42f9-8fe1-031ea35bdd2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:35.002 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:31:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:31:35.005 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.177 2 DEBUG nova.compute.manager [req-02833acf-5d2f-4944-93bf-f78da1eb8126 req-4200b54b-ba0e-4b2c-8a37-a1b8248b048d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.177 2 DEBUG oslo_concurrency.lockutils [req-02833acf-5d2f-4944-93bf-f78da1eb8126 req-4200b54b-ba0e-4b2c-8a37-a1b8248b048d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.178 2 DEBUG oslo_concurrency.lockutils [req-02833acf-5d2f-4944-93bf-f78da1eb8126 req-4200b54b-ba0e-4b2c-8a37-a1b8248b048d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.178 2 DEBUG oslo_concurrency.lockutils [req-02833acf-5d2f-4944-93bf-f78da1eb8126 req-4200b54b-ba0e-4b2c-8a37-a1b8248b048d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.178 2 DEBUG nova.compute.manager [req-02833acf-5d2f-4944-93bf-f78da1eb8126 req-4200b54b-ba0e-4b2c-8a37-a1b8248b048d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Processing event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:31:35 np0005476733 podman[264847]: 2025-10-08 16:31:35.386791167 +0000 UTC m=+0.064509859 container create 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.403 2 DEBUG nova.network.neutron [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updated VIF entry in instance network info cache for port ae07e4bd-18d5-41c3-ade8-9cad83e6d345. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.404 2 DEBUG nova.network.neutron [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.421 2 DEBUG oslo_concurrency.lockutils [req-bb391b7b-bd10-46fe-98d4-904a221add79 req-8b0d2b69-0305-45a7-b761-f090c10c4616 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:31:35 np0005476733 systemd[1]: Started libpod-conmon-94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa.scope.
Oct  8 12:31:35 np0005476733 podman[264847]: 2025-10-08 16:31:35.345173166 +0000 UTC m=+0.022891888 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:31:35 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:31:35 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7caf3f365bceec9287192091917d73e03933bd5842193d7c280843330dd321a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:31:35 np0005476733 podman[264847]: 2025-10-08 16:31:35.495605063 +0000 UTC m=+0.173323775 container init 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:31:35 np0005476733 podman[264847]: 2025-10-08 16:31:35.502892304 +0000 UTC m=+0.180610986 container start 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:31:35 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [NOTICE]   (264866) : New worker (264868) forked
Oct  8 12:31:35 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [NOTICE]   (264866) : Loading success.
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.758 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.760 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941095.7579575, 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.760 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] VM Started (Lifecycle Event)#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.764 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.770 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Instance spawned successfully.#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.770 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.789 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.796 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.799 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.799 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.800 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.800 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.800 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.801 2 DEBUG nova.virt.libvirt.driver [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.835 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.836 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941095.758294, 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.836 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.873 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.876 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941095.7633734, 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.876 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.900 2 INFO nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Took 8.43 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.901 2 DEBUG nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.902 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.907 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.946 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.976 2 INFO nova.compute.manager [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Took 8.94 seconds to build instance.#033[00m
Oct  8 12:31:35 np0005476733 nova_compute[192580]: 2025-10-08 16:31:35.993 2 DEBUG oslo_concurrency.lockutils [None req-0d0509a8-000d-4aa6-a74c-fbc73074bbba 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:36 np0005476733 nova_compute[192580]: 2025-10-08 16:31:36.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.285 2 DEBUG nova.compute.manager [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.286 2 DEBUG oslo_concurrency.lockutils [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.286 2 DEBUG oslo_concurrency.lockutils [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.286 2 DEBUG oslo_concurrency.lockutils [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.287 2 DEBUG nova.compute.manager [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] No waiting events found dispatching network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:31:37 np0005476733 nova_compute[192580]: 2025-10-08 16:31:37.287 2 WARNING nova.compute.manager [req-1b8b28e2-936a-4e5c-8e5e-4899507df367 req-405de4ad-ddec-40f8-a123-cd0897991c19 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received unexpected event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:31:38 np0005476733 nova_compute[192580]: 2025-10-08 16:31:38.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:39 np0005476733 nova_compute[192580]: 2025-10-08 16:31:39.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:42 np0005476733 podman[264877]: 2025-10-08 16:31:42.236999403 +0000 UTC m=+0.058733166 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=iscsid, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:31:42 np0005476733 podman[264878]: 2025-10-08 16:31:42.261016766 +0000 UTC m=+0.073336570 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:31:42 np0005476733 nova_compute[192580]: 2025-10-08 16:31:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:42 np0005476733 nova_compute[192580]: 2025-10-08 16:31:42.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:31:43 np0005476733 nova_compute[192580]: 2025-10-08 16:31:43.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:44 np0005476733 nova_compute[192580]: 2025-10-08 16:31:44.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:44 np0005476733 nova_compute[192580]: 2025-10-08 16:31:44.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:44 np0005476733 nova_compute[192580]: 2025-10-08 16:31:44.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:46 np0005476733 nova_compute[192580]: 2025-10-08 16:31:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:46 np0005476733 nova_compute[192580]: 2025-10-08 16:31:46.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:31:46 np0005476733 nova_compute[192580]: 2025-10-08 16:31:46.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:31:47 np0005476733 nova_compute[192580]: 2025-10-08 16:31:47.214 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:31:47 np0005476733 nova_compute[192580]: 2025-10-08 16:31:47.214 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:31:47 np0005476733 nova_compute[192580]: 2025-10-08 16:31:47.215 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:31:47 np0005476733 nova_compute[192580]: 2025-10-08 16:31:47.215 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:31:48 np0005476733 nova_compute[192580]: 2025-10-08 16:31:48.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.248 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.274 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.274 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.275 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.275 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.275 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.295 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:31:49 np0005476733 nova_compute[192580]: 2025-10-08 16:31:49.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:50 np0005476733 podman[264919]: 2025-10-08 16:31:50.269987095 +0000 UTC m=+0.084819254 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:31:53 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:53Z|00044|pinctrl|WARN|Dropped 317 log messages in last 62 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:31:53 np0005476733 ovn_controller[263831]: 2025-10-08T16:31:53Z|00045|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:31:53 np0005476733 podman[264945]: 2025-10-08 16:31:53.265438367 +0000 UTC m=+0.092488607 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:31:53 np0005476733 nova_compute[192580]: 2025-10-08 16:31:53.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:54 np0005476733 nova_compute[192580]: 2025-10-08 16:31:54.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:57 np0005476733 podman[264982]: 2025-10-08 16:31:57.218875211 +0000 UTC m=+0.052156916 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.609 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.635 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.636 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.636 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.636 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.700 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.778 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.779 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:31:57 np0005476733 nova_compute[192580]: 2025-10-08 16:31:57.838 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.003 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.004 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13199MB free_disk=111.30889892578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.005 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.005 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.107 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.126 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.126 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.142 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.173 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.220 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.236 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.260 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.260 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:31:58 np0005476733 nova_compute[192580]: 2025-10-08 16:31:58.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:31:59 np0005476733 nova_compute[192580]: 2025-10-08 16:31:59.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.535 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.536 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.554 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.627 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.628 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.636 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.636 2 INFO nova.compute.claims [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.761 2 DEBUG nova.compute.provider_tree [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.780 2 DEBUG nova.scheduler.client.report [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.815 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.816 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.878 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.878 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.905 2 INFO nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:32:03 np0005476733 nova_compute[192580]: 2025-10-08 16:32:03.924 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:32:03 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:03Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:e6:70 192.168.122.229
Oct  8 12:32:03 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:03Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:e6:70 192.168.122.229
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.016 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.018 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.018 2 INFO nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Creating image(s)#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.019 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.019 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.019 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.031 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.091 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.092 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.093 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.105 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.181 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.182 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.214 2 DEBUG nova.policy [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:32:04 np0005476733 podman[265013]: 2025-10-08 16:32:04.236024126 +0000 UTC m=+0.061366830 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.240 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:04 np0005476733 podman[265016]: 2025-10-08 16:32:04.242720998 +0000 UTC m=+0.061657598 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:32:04 np0005476733 podman[265014]: 2025-10-08 16:32:04.244238647 +0000 UTC m=+0.068169855 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.411 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk 10737418240" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.412 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.412 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.470 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.471 2 DEBUG nova.objects.instance [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'migration_context' on Instance uuid 955b5604-c52b-4d34-8a59-8e13bc6d2992 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.489 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.489 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Ensure instance console log exists: /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.489 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.490 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.490 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:04 np0005476733 nova_compute[192580]: 2025-10-08 16:32:04.957 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Successfully created port: fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:32:05 np0005476733 nova_compute[192580]: 2025-10-08 16:32:05.918 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Successfully updated port: fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:32:05 np0005476733 nova_compute[192580]: 2025-10-08 16:32:05.948 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:32:05 np0005476733 nova_compute[192580]: 2025-10-08 16:32:05.948 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquired lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:32:05 np0005476733 nova_compute[192580]: 2025-10-08 16:32:05.948 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.055 2 DEBUG nova.compute.manager [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-changed-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.056 2 DEBUG nova.compute.manager [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Refreshing instance network info cache due to event network-changed-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.057 2 DEBUG oslo_concurrency.lockutils [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.183 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.864 2 DEBUG nova.network.neutron [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updating instance_info_cache with network_info: [{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.895 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Releasing lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.895 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Instance network_info: |[{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.896 2 DEBUG oslo_concurrency.lockutils [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.896 2 DEBUG nova.network.neutron [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Refreshing network info cache for port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.901 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Start _get_guest_xml network_info=[{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.906 2 WARNING nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.911 2 DEBUG nova.virt.libvirt.host [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.912 2 DEBUG nova.virt.libvirt.host [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.916 2 DEBUG nova.virt.libvirt.host [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.917 2 DEBUG nova.virt.libvirt.host [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.917 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.917 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.918 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.919 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.919 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.919 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.919 2 DEBUG nova.virt.hardware [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
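The topology derivation above (preferred 0:0:0, maximum 65536:65536:65536, exactly one possible topology for a 1-vCPU flavor) comes down to enumerating factorizations of the vCPU count within per-dimension limits. A minimal stand-in sketch — a hypothetical helper, not Nova's actual `_get_possible_cpu_topologies`, which additionally honors flavor and image preferences:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus,
    within the per-dimension maxima reported in the log (65536 each)."""
    return [
        (s, c, t)
        for s in range(1, min(vcpus, max_sockets) + 1)
        for c in range(1, min(vcpus, max_cores) + 1)
        for t in range(1, min(vcpus, max_threads) + 1)
        if s * c * t == vcpus
    ]

# For the 1-vCPU flavor in this trace there is exactly one candidate,
# matching "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
topos = possible_topologies(1)
```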
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.922 2 DEBUG nova.virt.libvirt.vif [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:32:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1886042372',display_name='tempest-server-test-1886042372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1886042372',id=96,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-2pjrb93y',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',
image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:32:03Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=955b5604-c52b-4d34-8a59-8e13bc6d2992,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.923 2 DEBUG nova.network.os_vif_util [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.923 2 DEBUG nova.network.os_vif_util [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.924 2 DEBUG nova.objects.instance [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'pci_devices' on Instance uuid 955b5604-c52b-4d34-8a59-8e13bc6d2992 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.940 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <uuid>955b5604-c52b-4d34-8a59-8e13bc6d2992</uuid>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <name>instance-00000060</name>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1886042372</nova:name>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:32:06</nova:creationTime>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:user uuid="2a670570cd6d44aeb1a34bf3fac55635">tempest-GatewayMtuTestIcmp-9904054-project-member</nova:user>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:project uuid="3e84a9b599804a5f95722444484bbcee">tempest-GatewayMtuTestIcmp-9904054</nova:project>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        <nova:port uuid="fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="serial">955b5604-c52b-4d34-8a59-8e13bc6d2992</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="uuid">955b5604-c52b-4d34-8a59-8e13bc6d2992</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.config"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:99:eb:59"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <target dev="tapfc3bca7e-b6"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/console.log" append="off"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:32:06 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:32:06 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:32:06 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:32:06 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
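The guest XML dumped by `_get_guest_xml` above can be inspected programmatically. A short standard-library sketch, using a trimmed copy of the logged XML, pulling out the fields relevant to the Neutron side of this trace (the 1342-byte MTU from the network meta and the tap device name):

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the domain XML logged by _get_guest_xml above.
DOMAIN_XML = """\
<domain type="kvm">
  <uuid>955b5604-c52b-4d34-8a59-8e13bc6d2992</uuid>
  <memory>1048576</memory>
  <vcpu>1</vcpu>
  <devices>
    <interface type="ethernet">
      <mac address="fa:16:3e:99:eb:59"/>
      <mtu size="1342"/>
      <target dev="tapfc3bca7e-b6"/>
    </interface>
  </devices>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
iface = root.find("./devices/interface")
mtu = iface.find("mtu").get("size")    # "1342", matching the network meta above
tap = iface.find("target").get("dev")  # "tapfc3bca7e-b6", the OVS port name
```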
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.942 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Preparing to wait for external event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.942 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.942 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.942 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.943 2 DEBUG nova.virt.libvirt.vif [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:32:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1886042372',display_name='tempest-server-test-1886042372',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1886042372',id=96,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-2pjrb93y',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_
disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:32:03Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=955b5604-c52b-4d34-8a59-8e13bc6d2992,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.943 2 DEBUG nova.network.os_vif_util [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.944 2 DEBUG nova.network.os_vif_util [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.944 2 DEBUG os_vif [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.945 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.945 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.949 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc3bca7e-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.949 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc3bca7e-b6, col_values=(('external_ids', {'iface-id': 'fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:eb:59', 'vm-uuid': '955b5604-c52b-4d34-8a59-8e13bc6d2992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:06 np0005476733 NetworkManager[51699]: <info>  [1759941126.9528] manager: (tapfc3bca7e-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:06 np0005476733 nova_compute[192580]: 2025-10-08 16:32:06.963 2 INFO os_vif [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6')#033[00m
Oct  8 12:32:07 np0005476733 nova_compute[192580]: 2025-10-08 16:32:07.012 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:32:07 np0005476733 nova_compute[192580]: 2025-10-08 16:32:07.013 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:32:07 np0005476733 nova_compute[192580]: 2025-10-08 16:32:07.013 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] No VIF found with MAC fa:16:3e:99:eb:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:32:07 np0005476733 nova_compute[192580]: 2025-10-08 16:32:07.014 2 INFO nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Using config drive#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.022 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.050 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.050 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 955b5604-c52b-4d34-8a59-8e13bc6d2992 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.051 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.051 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.051 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.079 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.931 2 INFO nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Creating config drive at /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.config#033[00m
Oct  8 12:32:08 np0005476733 nova_compute[192580]: 2025-10-08 16:32:08.936 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmidxwio execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.064 2 DEBUG oslo_concurrency.processutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmidxwio" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:09 np0005476733 kernel: tapfc3bca7e-b6: entered promiscuous mode
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.1339] manager: (tapfc3bca7e-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:09Z|00046|binding|INFO|Claiming lport fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b for this chassis.
Oct  8 12:32:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:09Z|00047|binding|INFO|fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b: Claiming fa:16:3e:99:eb:59 10.100.0.9
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.148 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:eb:59 10.100.0.9'], port_security=['fa:16:3e:99:eb:59 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e84a9b599804a5f95722444484bbcee', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3968eba-c424-466b-8079-cc800da29d95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29408260-4b02-46cd-a4a1-515da7300ce7, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.150 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b in datapath c62ce41c-4d20-4575-9fd8-d6f63a77e629 bound to our chassis#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.151 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c62ce41c-4d20-4575-9fd8-d6f63a77e629#033[00m
Oct  8 12:32:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:09Z|00048|binding|INFO|Setting lport fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b ovn-installed in OVS
Oct  8 12:32:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:09Z|00049|binding|INFO|Setting lport fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b up in Southbound
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.167 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[db15bc28-73c1-43f8-91ed-cbe00b416a60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.168 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc62ce41c-41 in ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.170 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc62ce41c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.170 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aa07d276-9db9-488d-a35b-70a24b5ecb19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 systemd-udevd[265104]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.173 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1517c8-7444-46ef-b15f-31e31356bf05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.1847] device (tapfc3bca7e-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.1856] device (tapfc3bca7e-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.184 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e7807d-4477-422a-a46e-f1ea3a488b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 systemd-machined[152624]: New machine qemu-59-instance-00000060.
Oct  8 12:32:09 np0005476733 systemd[1]: Started Virtual Machine qemu-59-instance-00000060.
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.212 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[34137fbf-524e-48f5-bac0-b20ee67cd2d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.247 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0085aa57-52bc-4e56-afd5-8e3c18eb15fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.253 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[47326ae4-39e5-49ee-853a-33051fc3b667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.2543] manager: (tapc62ce41c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Oct  8 12:32:09 np0005476733 systemd-udevd[265108]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.289 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[68ebda3b-66da-4559-9859-ec8135212382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.292 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a25178-940f-4ebf-ab3d-ce945cad7171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.3171] device (tapc62ce41c-40): carrier: link connected
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.324 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b44b2efa-65a5-406a-b1c6-10d8838f8ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.343 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eb15d06b-57aa-4a42-8249-b8ec1fc5ed54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc62ce41c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:71:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 803298, 'reachable_time': 39265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265137, 'error': None, 'target': 'ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.358 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[aee627a6-22a8-48b1-9d0e-2ae63a3b7c63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:7160'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 803298, 'tstamp': 803298}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265138, 'error': None, 'target': 'ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.379 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cd816f02-667b-4b2e-8acf-92052e6a852e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc62ce41c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:71:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 803298, 'reachable_time': 39265, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265139, 'error': None, 'target': 'ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.409 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dd595966-e256-491a-8ed5-339a4f4f9c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.483 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a57664ad-0c75-40c3-b373-fd2246d6f7f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.487 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc62ce41c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.487 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.488 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc62ce41c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 kernel: tapc62ce41c-40: entered promiscuous mode
Oct  8 12:32:09 np0005476733 NetworkManager[51699]: <info>  [1759941129.4905] manager: (tapc62ce41c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.497 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc62ce41c-40, col_values=(('external_ids', {'iface-id': 'e09f270f-3f09-4ea0-91f9-58c0fa750a4e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:09Z|00050|binding|INFO|Releasing lport e09f270f-3f09-4ea0-91f9-58c0fa750a4e from this chassis (sb_readonly=0)
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.500 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c62ce41c-4d20-4575-9fd8-d6f63a77e629.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c62ce41c-4d20-4575-9fd8-d6f63a77e629.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.511 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[81d81a7c-2b2f-4b78-90df-25996bbaa871]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.512 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-c62ce41c-4d20-4575-9fd8-d6f63a77e629
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/c62ce41c-4d20-4575-9fd8-d6f63a77e629.pid.haproxy
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID c62ce41c-4d20-4575-9fd8-d6f63a77e629
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:09.514 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'env', 'PROCESS_TAG=haproxy-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c62ce41c-4d20-4575-9fd8-d6f63a77e629.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.872 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941129.8711946, 955b5604-c52b-4d34-8a59-8e13bc6d2992 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.872 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] VM Started (Lifecycle Event)#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.896 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.900 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941129.8725724, 955b5604-c52b-4d34-8a59-8e13bc6d2992 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.901 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.925 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.928 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:32:09 np0005476733 podman[265178]: 2025-10-08 16:32:09.929257603 +0000 UTC m=+0.091655951 container create 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:32:09 np0005476733 nova_compute[192580]: 2025-10-08 16:32:09.949 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:32:09 np0005476733 podman[265178]: 2025-10-08 16:32:09.863735432 +0000 UTC m=+0.026133800 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:32:09 np0005476733 systemd[1]: Started libpod-conmon-609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb.scope.
Oct  8 12:32:09 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:32:10 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fc78afbcbcc0135a383db0e3991f6eeef817018c5d8f9ecc1320bbc1fcd9180/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:32:10 np0005476733 podman[265178]: 2025-10-08 16:32:10.024436404 +0000 UTC m=+0.186834802 container init 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:32:10 np0005476733 podman[265178]: 2025-10-08 16:32:10.03281262 +0000 UTC m=+0.195210968 container start 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 12:32:10 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [NOTICE]   (265198) : New worker (265200) forked
Oct  8 12:32:10 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [NOTICE]   (265198) : Loading success.
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.223 2 DEBUG nova.network.neutron [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updated VIF entry in instance network info cache for port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.225 2 DEBUG nova.network.neutron [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updating instance_info_cache with network_info: [{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.244 2 DEBUG oslo_concurrency.lockutils [req-81987402-743d-4e28-b766-2c30e20e76f0 req-a750fae6-18a5-4b60-aade-7e013746e14e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.328 2 DEBUG nova.compute.manager [req-c77255d0-4415-4bbf-89a4-2f5ab5965638 req-40b4c34a-3fd0-4b00-80aa-eacf9db1ea70 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.329 2 DEBUG oslo_concurrency.lockutils [req-c77255d0-4415-4bbf-89a4-2f5ab5965638 req-40b4c34a-3fd0-4b00-80aa-eacf9db1ea70 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.329 2 DEBUG oslo_concurrency.lockutils [req-c77255d0-4415-4bbf-89a4-2f5ab5965638 req-40b4c34a-3fd0-4b00-80aa-eacf9db1ea70 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.330 2 DEBUG oslo_concurrency.lockutils [req-c77255d0-4415-4bbf-89a4-2f5ab5965638 req-40b4c34a-3fd0-4b00-80aa-eacf9db1ea70 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.330 2 DEBUG nova.compute.manager [req-c77255d0-4415-4bbf-89a4-2f5ab5965638 req-40b4c34a-3fd0-4b00-80aa-eacf9db1ea70 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Processing event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.332 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.337 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941131.3375666, 955b5604-c52b-4d34-8a59-8e13bc6d2992 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.338 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.341 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.346 2 INFO nova.virt.libvirt.driver [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Instance spawned successfully.#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.347 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.363 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.371 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.375 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.376 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.377 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.378 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.378 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.379 2 DEBUG nova.virt.libvirt.driver [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.407 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.437 2 INFO nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Took 7.42 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.438 2 DEBUG nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.509 2 INFO nova.compute.manager [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Took 7.91 seconds to build instance.#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.526 2 DEBUG oslo_concurrency.lockutils [None req-c628aa4a-25a3-4ffe-bc35-dca34b0b0b2f 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.527 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.528 2 INFO nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.528 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:11 np0005476733 nova_compute[192580]: 2025-10-08 16:32:11.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:13 np0005476733 podman[265209]: 2025-10-08 16:32:13.231043941 +0000 UTC m=+0.052150637 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:32:13 np0005476733 podman[265210]: 2025-10-08 16:32:13.232427895 +0000 UTC m=+0.054160441 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:32:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:13.256 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:32:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:13.256 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.431 2 DEBUG nova.compute.manager [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.432 2 DEBUG oslo_concurrency.lockutils [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.433 2 DEBUG oslo_concurrency.lockutils [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.433 2 DEBUG oslo_concurrency.lockutils [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.433 2 DEBUG nova.compute.manager [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] No waiting events found dispatching network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:32:13 np0005476733 nova_compute[192580]: 2025-10-08 16:32:13.434 2 WARNING nova.compute.manager [req-bebb05ed-283e-4cef-98d2-b42099b33101 req-c5c1e8cf-9fb0-4954-9617-3da4cf14f5bc 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received unexpected event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b for instance with vm_state active and task_state None.#033[00m
Oct  8 12:32:14 np0005476733 nova_compute[192580]: 2025-10-08 16:32:14.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:16 np0005476733 nova_compute[192580]: 2025-10-08 16:32:16.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:19.259 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:32:19 np0005476733 nova_compute[192580]: 2025-10-08 16:32:19.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:21 np0005476733 podman[265272]: 2025-10-08 16:32:21.235937291 +0000 UTC m=+0.069682623 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:32:21 np0005476733 nova_compute[192580]: 2025-10-08 16:32:21.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:24 np0005476733 podman[265291]: 2025-10-08 16:32:24.27035049 +0000 UTC m=+0.077745459 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:32:24 np0005476733 nova_compute[192580]: 2025-10-08 16:32:24.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:26.390 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:26.391 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:32:26.391 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:27 np0005476733 nova_compute[192580]: 2025-10-08 16:32:27.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:28 np0005476733 podman[265342]: 2025-10-08 16:32:28.235841497 +0000 UTC m=+0.061422101 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:32:29 np0005476733 nova_compute[192580]: 2025-10-08 16:32:29.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:32 np0005476733 nova_compute[192580]: 2025-10-08 16:32:32.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:33 np0005476733 nova_compute[192580]: 2025-10-08 16:32:33.618 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:34 np0005476733 nova_compute[192580]: 2025-10-08 16:32:34.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:35 np0005476733 podman[265369]: 2025-10-08 16:32:35.22714528 +0000 UTC m=+0.050657640 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:32:35 np0005476733 podman[265370]: 2025-10-08 16:32:35.236066522 +0000 UTC m=+0.056116243 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 12:32:35 np0005476733 podman[265368]: 2025-10-08 16:32:35.236667371 +0000 UTC m=+0.064694645 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.071 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'name': 'tempest-server-test-2100618665', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3e84a9b599804a5f95722444484bbcee', 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'hostId': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.073 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'name': 'tempest-server-test-1886042372', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000060', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3e84a9b599804a5f95722444484bbcee', 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'hostId': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.078 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 / tapae07e4bd-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.078 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.082 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 955b5604-c52b-4d34-8a59-8e13bc6d2992 / tapfc3bca7e-b6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.082 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a1fa4ba-9008-4ccd-9ac9-4f9cdfaec6b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.074602', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65d79834-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '9fd26c25a601105d84e4ed11fec81fdf58549dd98a8eb430d413c37c130ec675'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.074602', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65d80d00-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': 'a6e45f41263e6b27d863a4c9d4a6cceedc88e2fc10af2694a5f38a40a186602c'}]}, 'timestamp': '2025-10-08 16:32:36.082993', '_unique_id': '847e5c185cdd47088cd9dd1bbbf80fcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.084 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.100 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.101 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.112 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.allocation volume: 18030592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.113 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '300b3b37-c9e1-492e-8d65-0c0316eb1dc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.086037', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65dae0b6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': '73187e7dfe34148918f01068831dc054b300c575e2dde9291aed42e7e28ac367'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.086037', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65daf128-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': 'ad86f591fba02b0fe65ce38d7e872882e3fb9650351c9dbeda1100dde78c5c7d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 18030592, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.086037', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65dca356-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': '80d4fb0597dfec2d0e273d5097fdd951f8532d9413af1ef54d3687564ae97228'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.086037', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65dcb918-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': '53925fc9ed58214dec51da92519d50da2fd8de39d0c1d5561ef72534072cabde'}]}, 'timestamp': '2025-10-08 16:32:36.113553', '_unique_id': '017f0f812def483ca080165267de9f25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.132 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.requests volume: 707 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.133 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.152 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.requests volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.153 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40b57c14-e0fa-43de-8409-08c9ce5a0db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 707, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.115968', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65dfc630-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '66642a194023ffe214b4995ad24611f540e9f5f7983a722a51959b967269cc9f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.115968', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65dfd512-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '4e9f48acbd5b056b555dd6e11c61d3f9bf9df847c0109e292c065d6295282fba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 22, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.115968', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65e2cd44-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '5c8112c39e8b102a63c2ea0340ed5c1a92066a104950d573a86a80cc62506329'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.115968', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65e2dd02-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '0be7d4dc6ebabf0e670a3d03df33f2d47ba3053752c22a1e8a5cffe86ccd7dca'}]}, 'timestamp': '2025-10-08 16:32:36.153828', '_unique_id': '4d08a688ae824344a9d189b485c488f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.156 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.156 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6813a6bc-53f6-4375-a7ae-743eadf3a4a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.156533', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65e355de-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '7511e64289bcc8a58fdef008a0ab03f069c3bf90f8dbae315f62438d0a8e16e7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.156533', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65e36114-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': 'ef9079cef29c86c221c11c98fdd1153f80d28b229c3814e4083bfcf9212b5e40'}]}, 'timestamp': '2025-10-08 16:32:36.157184', '_unique_id': '05e2f5a0597a49c79d9d44fe00bd9449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.158 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.159 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>]
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.159 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.159 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd67188d-7c1c-4e58-bbb6-3063140e9cf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.159369', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65e3c172-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '0576458c8e76340e339220f1c4b77575934001098912f5f74518f068c5061cc7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.159369', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65e3cb36-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': '3370441d896a59011d3df7a1a00bd890db550f3183aa096e63f73a19d3bde9f2'}]}, 'timestamp': '2025-10-08 16:32:36.159881', '_unique_id': '181c908f25d04f39b6bb39243990b96e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.162 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.162 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cf1cfd0-1c82-4fb6-adff-d3b16a2ff142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.162006', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65e42aae-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': 'daa4da19f3f9b11c33c1e51a9b22a2f12267ea0fedbe409645d8fba1369c7a92'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.162006', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65e43508-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': '823b011696ab1f2bf0dc9bd6fb6f6f41ef4ce9f105138fdbc5fa51bc8f8b579c'}]}, 'timestamp': '2025-10-08 16:32:36.162589', '_unique_id': '45f6cf400f9a4bd5babbc215b3267c4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.164 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.164 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.164 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.165 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53037164-87ad-4c6f-adaa-50242ad88d2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.164326', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65e483fa-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': '1eec2adc003fa0aa3eeb7b5d17b7f6ccbcfa6e1abdd736aa9c48622dcf7544ba'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 
'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.164326', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65e49016-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': 'b7d47cf8abeeef2b704f52b99e9c5c62a10c58ed893f8465f3b2ed4659b916ef'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.164326', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65e49d2c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': 'ae5c811a44e47fe0f3b71bdbb266a07f0b12c5abec99e74a9f5c97b519a22ecb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.164326', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65e4a682-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': '379c14ef7fbb43a3118672e8ef5d22a643401e7444f19cccb290d2c9d2fd1420'}]}, 'timestamp': '2025-10-08 16:32:36.165513', '_unique_id': 'b89122ff251049f692b6bd00ab239286'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.167 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.167 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a10f969-771b-45fe-9618-c7889cba99a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.167406', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65e4fbf0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '0740fa9b2a6734a455e41a8e900174366c3f10f5ddedf5e9334acd0be06a9281'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.167406', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65e506a4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': '45c0bf4c882fe57be6d964b1f2b9c55cb30b7a0e14b255eab69c68138a887ec5'}]}, 'timestamp': '2025-10-08 16:32:36.167982', '_unique_id': '0ce81ac00ebf46cf9442e4e525ec874f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.169 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.186 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/memory.usage volume: 226.26171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.201 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/memory.usage volume: 168.64453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2f588e-ade5-4f70-8ca8-a7ac1f6341d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 226.26171875, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'timestamp': '2025-10-08T16:32:36.169791', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '65e7eeb4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.909308781, 'message_signature': 'c78eaeb55ba15fb48b9889320fd3f680e0cb6a2794cd5af269d98ffe7fd7ce93'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 168.64453125, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 
'955b5604-c52b-4d34-8a59-8e13bc6d2992', 'timestamp': '2025-10-08T16:32:36.169791', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '65ea4fba-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.924788832, 'message_signature': 'f086074a81d446805b8821f60d04b82221a6ef55b0a55d5684ffb9b7d3857db0'}]}, 'timestamp': '2025-10-08 16:32:36.202698', '_unique_id': '1dc9cbc32ec745238908f88c4d17b3ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>]
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.205 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.205 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/cpu volume: 41210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.206 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/cpu volume: 22930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a5aef7b-e638-491d-a9a4-ae0ef8631226', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41210000000, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'timestamp': '2025-10-08T16:32:36.205729', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '65ead53e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.909308781, 'message_signature': 'b6fc619435b5fd4b774344454b68c995447646a8e5143c76aedf41e70501c753'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22930000000, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 
'955b5604-c52b-4d34-8a59-8e13bc6d2992', 'timestamp': '2025-10-08T16:32:36.205729', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '65eae13c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.924788832, 'message_signature': '943ea784c05455ed18f87ba087b019ca760b1c20ba3b27904b79e17aa4fd0212'}]}, 'timestamp': '2025-10-08 16:32:36.206332', '_unique_id': 'afa9a28300d743489a8d37c79ad30cdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.207 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.208 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.bytes volume: 135451136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.208 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.208 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.bytes volume: 15987712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f819b1a7-7733-459d-b6bf-c7a9fcd5bb35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135451136, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.208269', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65eb38d0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '48155b254c5ee634a117bdfa8a689ff1ed55a3f1e5ccafe428031663a321724d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 
'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.208269', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65eb42c6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '1ebf973884f5365fc82ccda9118f71494c50b21995a666f5107376bf98e2a4f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15987712, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.208269', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65eb4bfe-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': 'd1185b0ec04004c061db08e5f5a5b511dcfa4168c364b6b00403b085e84e4d6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.208269', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65eb55e0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '645df08eb6e67faee7da5fb2e077b6f8677245772ff4a2285d0f34848e8f4c61'}]}, 'timestamp': '2025-10-08 16:32:36.209299', '_unique_id': 'c818f64853124dbc9f2345801a88a822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.211 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.211 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e9fb638-177a-44a4-9354-32a047c44b22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.211134', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65ebaa36-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '59899020bcf6ef5c600ed3124ccfbbd4657d2cb8d71b4a015f78c133b03284d9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.211134', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65ebb51c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': '852a6c1e2e9cbcaa5d2ccfb85d80dd7c3c5cc88c60e4e9a387be386fba96b80c'}]}, 'timestamp': '2025-10-08 16:32:36.211746', '_unique_id': '6896c4bd42df4ce49265835aaaa484a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.213 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.latency volume: 57437573056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.214 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.214 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.latency volume: 814936612 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.214 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46886fbb-486a-4e3a-b4af-b76b725fad34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57437573056, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.213730', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ec0d78-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '4823724696abedd4f5d63552ac4ecff0f83683fcbc96e1ca2661b3edec937282'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.213730', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ec1912-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '350fc82b963647cf0fa45124f3747740bb872e074938dfda7cc444ab1fd5783f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 814936612, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.213730', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ec24c0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '1da5df63cc0a6c43aa1a5ac4731a0ce0df1331621606d273c77aa1f3da165aaa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.213730', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ec2de4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '19143d0b8e1c79d774b5fbed03f16df5730e825ec7448860d8f17f3e1c39694e'}]}, 'timestamp': '2025-10-08 16:32:36.214826', '_unique_id': '3c5db2cb743a4df98678121575b71c2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.216 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.216 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93915be9-6e42-4187-b220-e9e6fedf2d36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.216492', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65ec795c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '58d18ba4572ce7909976bd3a3799cd3483619b88d57bedc854e4a526cb551abf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.216492', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65ec8456-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': '9d38518eeac72b9a60b4dc4743f1479963c6973c97a02d3caa78e56c07109431'}]}, 'timestamp': '2025-10-08 16:32:36.217050', '_unique_id': 'b8e238679c0d4f879f0d6cb53d452e8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.217 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.218 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.bytes volume: 3478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.219 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c535578-4d31-4c64-8ade-178bde82fcb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3478, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.218757', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65ecd1e0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '34cfd8da05a49446da9673aaf500c28d565f9d0f8711070a4fdee418475f3224'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.218757', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65ecdea6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': 'eeabe7aeaf35074969ed4a992cca4284c1ddfba67ccf48af0baca9cb2dd9c093'}]}, 'timestamp': '2025-10-08 16:32:36.219423', '_unique_id': 'd2a3c5fdea4c47389397f10a70be3b61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.221 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.221 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>]
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.221 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.bytes volume: 2552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.222 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2005907f-ae65-4afc-a639-3e3de673849b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2552, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.221832', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65ed49e0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': '064f3f48aaf53955cf75164faf2743c346831e7c6a506aef67d41b62a9f07eb3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 
'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.221832', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65ed552a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': 'b793a3667d46685ae7cdd2be9b58b410d67bac813b050ddb15564ded6f9b2af8'}]}, 'timestamp': '2025-10-08 16:32:36.222395', '_unique_id': '5086329b213a4baab4aa4037b488b081'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.224 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.requests volume: 11556 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.224 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.224 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.requests volume: 8071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.224 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.requests volume: 50 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d885dd5-5cda-42af-9f04-d074f90d1d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11556, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.224246', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65eda804-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '7c8f0773453af9c77cf0e247fedcd779a027d879eeac18ec8bed081c5da34814'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.224246', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65edb164-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '9b00f4e8acba67661d549c2b41aa10ccd61b3d8cefae70a3c9d7fb8d761986bc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 8071, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.224246', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65edba60-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': 'f60bf9c0575645cbfba909a8302b9cbf2c93c34fe3657009411426de96c2faf7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 50, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.224246', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65edc3f2-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': 'c6c0623f1abfbe1391e4e5865ae9cf4c815cb50899c05c96dd1fa06531eb2708'}]}, 'timestamp': '2025-10-08 16:32:36.225255', '_unique_id': '1efc6f23abe14536a1918a55a578e1da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.225 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.227 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.227 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-2100618665>, <NovaLikeServer: tempest-server-test-1886042372>]
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.227 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.228 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b76d9b-e30e-4892-a16f-a72015e143f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:32:36.227721', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': '65ee3134-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.797576054, 'message_signature': 'beebcf44fca1972991f0acf05a73a400748bb9c9b2ecc76841ca6f78dbf1f4b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:32:36.227721', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': '65ee4214-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.803033757, 'message_signature': 'ec0f98805e9db5ed43c857a13a3b0e9f94cfebcc69e7f9f9ae8b18d58176e8f3'}]}, 'timestamp': '2025-10-08 16:32:36.228500', '_unique_id': 'b5e6a253bdea43dabafc1a5f9602f5fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.229 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.230 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.usage volume: 152698880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.230 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.230 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.usage volume: 16908288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.230 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3f113b3-7fa3-4409-bbdc-a7f3322866ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152698880, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.230151', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ee8e86-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': 'b30a43f1568c4cfecd4ed151dc82d846c7767e5498da1efd68ab88894faa5803'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 
'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.230151', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ee97be-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.809039408, 'message_signature': '72d6a127bf11b8d15e58b017c3a9ffc784c733cd243270e6506d1f4ee5c79551'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 16908288, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.230151', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65eea0ba-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': '79e360ae3817ee2a85a0f27998428479a550bf71cf54affdd69f0a41da91547c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.230151', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65eea9e8-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.824881931, 'message_signature': 'fbd235756431014e86a0a7ca5c09a0f9946fd98151803647acd9699a4d660676'}]}, 'timestamp': '2025-10-08 16:32:36.231139', '_unique_id': '1f6e65836a3c4335bef65a1910362a34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.231 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.232 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.latency volume: 15051252375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.232 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.latency volume: 261691827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.233 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.latency volume: 7904926490 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.233 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.latency volume: 170125149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7fd8f50-d979-4e8d-90f4-a758324ed283', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15051252375, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.232716', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65eef2c2-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': 'bfae02ec865581ead01122d6ce603ccd484c9b695a761d3795fb980925572279'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 261691827, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.232716', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65eefe98-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '7c75e4c87de56617e03c61a2d11acf7b3db4ce5665aad56bbc628f35cb131fe1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7904926490, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.232716', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ef09ec-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': 'c1f6a678c5f1ee7725376479c383ad5f81afdccb5c172fafb534eb4a87033235'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170125149, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.232716', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ef1608-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '459198de6b0e5d159c2e68d3fb9469f57fc153207fed7c9cb02053d5f7e9a473'}]}, 'timestamp': '2025-10-08 16:32:36.233918', '_unique_id': '336480e9d3ce492b872f028b59c1ece7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.235 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.235 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.bytes volume: 327759360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.236 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.236 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.bytes volume: 158945280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.236 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.bytes volume: 162020 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e0035bd-e171-4e24-af2d-1cb8472d0738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 327759360, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:32:36.235817', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ef6cfc-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': 'dac111b832648c236fd1b8a57363b239d9e71a417b25146abe9e99a67db42b7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:32:36.235817', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ef77b0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.838960468, 'message_signature': '774d334b9c9181dd360f97d6b2b5dca077def27b525d5527ffc02cb0d41397c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 158945280, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:32:36.235817', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '65ef808e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': 'a33442548974c7c3e261ae1bbdd7419f8f95ec50aaca5386fb3d55883af7b0ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 162020, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:32:36.235817', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '65ef8930-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8059.856918188, 'message_signature': '3f5892b4c5b8b4d66980d0500d64f7d9f503ddb30852a57c0dac1a8bb8ccad75'}]}, 'timestamp': '2025-10-08 16:32:36.236822', '_unique_id': '84450d36f8e3486aa1de428a72fdc562'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:32:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:32:36.237 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:32:36 np0005476733 nova_compute[192580]: 2025-10-08 16:32:36.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:32:37 np0005476733 nova_compute[192580]: 2025-10-08 16:32:37.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:32:39 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:39Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:eb:59 10.100.0.9
Oct  8 12:32:39 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:39Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:eb:59 10.100.0.9
Oct  8 12:32:39 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:39Z|00051|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Oct  8 12:32:39 np0005476733 nova_compute[192580]: 2025-10-08 16:32:39.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:32:42 np0005476733 nova_compute[192580]: 2025-10-08 16:32:42.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:32:43 np0005476733 nova_compute[192580]: 2025-10-08 16:32:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:32:43 np0005476733 nova_compute[192580]: 2025-10-08 16:32:43.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 12:32:44 np0005476733 podman[265427]: 2025-10-08 16:32:44.260586263 +0000 UTC m=+0.070715117 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 12:32:44 np0005476733 podman[265428]: 2025-10-08 16:32:44.27406708 +0000 UTC m=+0.087655994 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:32:44 np0005476733 nova_compute[192580]: 2025-10-08 16:32:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:44 np0005476733 nova_compute[192580]: 2025-10-08 16:32:44.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:46 np0005476733 nova_compute[192580]: 2025-10-08 16:32:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:47 np0005476733 nova_compute[192580]: 2025-10-08 16:32:47.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.886 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.887 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.887 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:32:48 np0005476733 nova_compute[192580]: 2025-10-08 16:32:48.887 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:32:49 np0005476733 nova_compute[192580]: 2025-10-08 16:32:49.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:50 np0005476733 nova_compute[192580]: 2025-10-08 16:32:50.227 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:32:50 np0005476733 nova_compute[192580]: 2025-10-08 16:32:50.243 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:32:50 np0005476733 nova_compute[192580]: 2025-10-08 16:32:50.244 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:32:50 np0005476733 nova_compute[192580]: 2025-10-08 16:32:50.245 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:52 np0005476733 nova_compute[192580]: 2025-10-08 16:32:52.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:52 np0005476733 podman[265490]: 2025-10-08 16:32:52.228106736 +0000 UTC m=+0.054559063 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:32:54 np0005476733 nova_compute[192580]: 2025-10-08 16:32:54.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:55Z|00052|pinctrl|WARN|Dropped 263 log messages in last 62 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 12:32:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:32:55Z|00053|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:32:55 np0005476733 podman[265510]: 2025-10-08 16:32:55.249777261 +0000 UTC m=+0.080315621 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:32:57 np0005476733 nova_compute[192580]: 2025-10-08 16:32:57.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.621 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.715 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:58 np0005476733 podman[265536]: 2025-10-08 16:32:58.732660237 +0000 UTC m=+0.061810222 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.803 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.804 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.872 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.879 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.945 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:58 np0005476733 nova_compute[192580]: 2025-10-08 16:32:58.946 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.000 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.145 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.146 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12103MB free_disk=111.04185104370117GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.147 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.147 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.221 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.222 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 955b5604-c52b-4d34-8a59-8e13bc6d2992 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.222 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.222 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.381 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.398 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.421 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.422 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:32:59 np0005476733 nova_compute[192580]: 2025-10-08 16:32:59.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:02 np0005476733 nova_compute[192580]: 2025-10-08 16:33:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:04 np0005476733 nova_compute[192580]: 2025-10-08 16:33:04.423 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:04 np0005476733 nova_compute[192580]: 2025-10-08 16:33:04.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:06 np0005476733 podman[265606]: 2025-10-08 16:33:06.253267493 +0000 UTC m=+0.066021767 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  8 12:33:06 np0005476733 podman[265607]: 2025-10-08 16:33:06.283198893 +0000 UTC m=+0.093972614 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:33:06 np0005476733 podman[265608]: 2025-10-08 16:33:06.284955809 +0000 UTC m=+0.081794578 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:33:07 np0005476733 nova_compute[192580]: 2025-10-08 16:33:07.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:07 np0005476733 nova_compute[192580]: 2025-10-08 16:33:07.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:09 np0005476733 nova_compute[192580]: 2025-10-08 16:33:09.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:12 np0005476733 nova_compute[192580]: 2025-10-08 16:33:12.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:14 np0005476733 nova_compute[192580]: 2025-10-08 16:33:14.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:15 np0005476733 podman[265669]: 2025-10-08 16:33:15.262274198 +0000 UTC m=+0.089328337 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid)
Oct  8 12:33:15 np0005476733 podman[265670]: 2025-10-08 16:33:15.266263265 +0000 UTC m=+0.084704630 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:33:17 np0005476733 nova_compute[192580]: 2025-10-08 16:33:17.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:19 np0005476733 nova_compute[192580]: 2025-10-08 16:33:19.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:22 np0005476733 nova_compute[192580]: 2025-10-08 16:33:22.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:23 np0005476733 podman[265714]: 2025-10-08 16:33:23.218050538 +0000 UTC m=+0.052977582 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:33:24 np0005476733 nova_compute[192580]: 2025-10-08 16:33:24.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:26 np0005476733 podman[265733]: 2025-10-08 16:33:26.251686593 +0000 UTC m=+0.078302876 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:26.391 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:26.391 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:33:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:26.392 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:33:27 np0005476733 nova_compute[192580]: 2025-10-08 16:33:27.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:29 np0005476733 podman[265766]: 2025-10-08 16:33:29.241346471 +0000 UTC m=+0.068757864 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:33:29 np0005476733 nova_compute[192580]: 2025-10-08 16:33:29.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:32 np0005476733 nova_compute[192580]: 2025-10-08 16:33:32.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:33.286 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:33:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:33.287 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:33:33 np0005476733 nova_compute[192580]: 2025-10-08 16:33:33.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:34 np0005476733 nova_compute[192580]: 2025-10-08 16:33:34.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.451 2 DEBUG nova.compute.manager [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-changed-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.451 2 DEBUG nova.compute.manager [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Refreshing instance network info cache due to event network-changed-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.452 2 DEBUG oslo_concurrency.lockutils [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.452 2 DEBUG oslo_concurrency.lockutils [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.453 2 DEBUG nova.network.neutron [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Refreshing network info cache for port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:33:35 np0005476733 nova_compute[192580]: 2025-10-08 16:33:35.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:37 np0005476733 nova_compute[192580]: 2025-10-08 16:33:37.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:37 np0005476733 podman[265788]: 2025-10-08 16:33:37.244658261 +0000 UTC m=+0.063939321 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350)
Oct  8 12:33:37 np0005476733 podman[265786]: 2025-10-08 16:33:37.248966998 +0000 UTC m=+0.072021008 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:33:37 np0005476733 podman[265787]: 2025-10-08 16:33:37.251832279 +0000 UTC m=+0.072217204 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:33:37 np0005476733 nova_compute[192580]: 2025-10-08 16:33:37.658 2 DEBUG nova.network.neutron [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updated VIF entry in instance network info cache for port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:33:37 np0005476733 nova_compute[192580]: 2025-10-08 16:33:37.658 2 DEBUG nova.network.neutron [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updating instance_info_cache with network_info: [{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:33:37 np0005476733 nova_compute[192580]: 2025-10-08 16:33:37.679 2 DEBUG oslo_concurrency.lockutils [req-2b826d13-a4e9-4f88-8aa8-9a5c3596d675 req-d7c430d3-c564-4de5-85bf-cfb639ffdb6c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:33:38 np0005476733 nova_compute[192580]: 2025-10-08 16:33:38.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:39 np0005476733 nova_compute[192580]: 2025-10-08 16:33:39.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:42 np0005476733 nova_compute[192580]: 2025-10-08 16:33:42.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:43 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:33:43.290 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:33:44 np0005476733 nova_compute[192580]: 2025-10-08 16:33:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:44 np0005476733 nova_compute[192580]: 2025-10-08 16:33:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:33:44 np0005476733 nova_compute[192580]: 2025-10-08 16:33:44.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:46 np0005476733 podman[265849]: 2025-10-08 16:33:46.224020207 +0000 UTC m=+0.054076638 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  8 12:33:46 np0005476733 podman[265850]: 2025-10-08 16:33:46.227228439 +0000 UTC m=+0.052440677 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:33:46 np0005476733 nova_compute[192580]: 2025-10-08 16:33:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:47 np0005476733 nova_compute[192580]: 2025-10-08 16:33:47.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:48 np0005476733 nova_compute[192580]: 2025-10-08 16:33:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:49 np0005476733 nova_compute[192580]: 2025-10-08 16:33:49.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:50 np0005476733 nova_compute[192580]: 2025-10-08 16:33:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:50 np0005476733 nova_compute[192580]: 2025-10-08 16:33:50.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:33:51 np0005476733 nova_compute[192580]: 2025-10-08 16:33:51.252 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:33:51 np0005476733 nova_compute[192580]: 2025-10-08 16:33:51.253 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:33:51 np0005476733 nova_compute[192580]: 2025-10-08 16:33:51.253 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:33:52 np0005476733 nova_compute[192580]: 2025-10-08 16:33:52.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:52 np0005476733 nova_compute[192580]: 2025-10-08 16:33:52.665 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updating instance_info_cache with network_info: [{"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:33:52 np0005476733 nova_compute[192580]: 2025-10-08 16:33:52.680 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-955b5604-c52b-4d34-8a59-8e13bc6d2992" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:33:52 np0005476733 nova_compute[192580]: 2025-10-08 16:33:52.680 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:33:52 np0005476733 nova_compute[192580]: 2025-10-08 16:33:52.681 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:33:54 np0005476733 podman[265893]: 2025-10-08 16:33:54.220950355 +0000 UTC m=+0.052156236 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:33:54 np0005476733 nova_compute[192580]: 2025-10-08 16:33:54.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:57 np0005476733 nova_compute[192580]: 2025-10-08 16:33:57.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:33:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:33:57Z|00054|pinctrl|WARN|Dropped 95 log messages in last 62 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 12:33:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:33:57Z|00055|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:33:57 np0005476733 podman[265912]: 2025-10-08 16:33:57.240189603 +0000 UTC m=+0.072547384 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 12:33:59 np0005476733 nova_compute[192580]: 2025-10-08 16:33:59.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:00 np0005476733 podman[265938]: 2025-10-08 16:34:00.21845889 +0000 UTC m=+0.049827832 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 12:34:00 np0005476733 nova_compute[192580]: 2025-10-08 16:34:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:00 np0005476733 nova_compute[192580]: 2025-10-08 16:34:00.942 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:34:00 np0005476733 nova_compute[192580]: 2025-10-08 16:34:00.942 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:34:00 np0005476733 nova_compute[192580]: 2025-10-08 16:34:00.943 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:34:00 np0005476733 nova_compute[192580]: 2025-10-08 16:34:00.943 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.059 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.120 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.122 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.183 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.191 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.257 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.259 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.318 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.503 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.504 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11977MB free_disk=111.02667617797852GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.505 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.505 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.592 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.593 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 955b5604-c52b-4d34-8a59-8e13bc6d2992 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.593 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.593 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.666 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.687 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.690 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:34:01 np0005476733 nova_compute[192580]: 2025-10-08 16:34:01.691 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:34:02 np0005476733 nova_compute[192580]: 2025-10-08 16:34:02.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:04 np0005476733 nova_compute[192580]: 2025-10-08 16:34:04.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:04 np0005476733 nova_compute[192580]: 2025-10-08 16:34:04.692 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:05 np0005476733 ovn_controller[263831]: 2025-10-08T16:34:05Z|00056|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct  8 12:34:07 np0005476733 nova_compute[192580]: 2025-10-08 16:34:07.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:08 np0005476733 podman[265970]: 2025-10-08 16:34:08.246154473 +0000 UTC m=+0.067622217 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:34:08 np0005476733 podman[265971]: 2025-10-08 16:34:08.261069736 +0000 UTC m=+0.080129944 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:34:08 np0005476733 podman[265972]: 2025-10-08 16:34:08.265839108 +0000 UTC m=+0.067930817 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Oct  8 12:34:09 np0005476733 nova_compute[192580]: 2025-10-08 16:34:09.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:12 np0005476733 nova_compute[192580]: 2025-10-08 16:34:12.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:14 np0005476733 nova_compute[192580]: 2025-10-08 16:34:14.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:17 np0005476733 nova_compute[192580]: 2025-10-08 16:34:17.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:17 np0005476733 podman[266035]: 2025-10-08 16:34:17.224326699 +0000 UTC m=+0.057754025 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  8 12:34:17 np0005476733 podman[266036]: 2025-10-08 16:34:17.23215043 +0000 UTC m=+0.060494134 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:34:19 np0005476733 nova_compute[192580]: 2025-10-08 16:34:19.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:22 np0005476733 nova_compute[192580]: 2025-10-08 16:34:22.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:24 np0005476733 nova_compute[192580]: 2025-10-08 16:34:24.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:25 np0005476733 podman[266076]: 2025-10-08 16:34:25.263356435 +0000 UTC m=+0.077801166 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:34:26.392 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:34:26.393 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:34:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:34:26.393 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:34:27 np0005476733 nova_compute[192580]: 2025-10-08 16:34:27.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:28 np0005476733 podman[266096]: 2025-10-08 16:34:28.259277234 +0000 UTC m=+0.089001685 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:34:29 np0005476733 nova_compute[192580]: 2025-10-08 16:34:29.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:31 np0005476733 podman[266122]: 2025-10-08 16:34:31.25362637 +0000 UTC m=+0.078284062 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm)
Oct  8 12:34:32 np0005476733 nova_compute[192580]: 2025-10-08 16:34:32.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:34 np0005476733 nova_compute[192580]: 2025-10-08 16:34:34.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:35 np0005476733 nova_compute[192580]: 2025-10-08 16:34:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.071 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'name': 'tempest-server-test-2100618665', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005e', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3e84a9b599804a5f95722444484bbcee', 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'hostId': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.073 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'name': 'tempest-server-test-1886042372', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000060', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3e84a9b599804a5f95722444484bbcee', 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'hostId': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.075 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.bytes.delta volume: 27618 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.077 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.bytes.delta volume: 117280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2188759a-c38d-4cf6-a9f4-58fcd99db14a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 27618, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.073485', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad5d87ae-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': '6765987daacbddc7a95331dd2cb51616dcbbb1a876ac6662e7a8de3ce6dab52f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 
117280, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.073485', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad5dd39e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': '5efb862a9a3974787784b087451e9b50d75257a61929b3e7ca816b45c4d6793f'}]}, 'timestamp': '2025-10-08 16:34:36.077897', '_unique_id': 'c0008d5050a941bda07a03f2db885f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.078 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.103 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.latency volume: 57860267109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.104 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.123 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.latency volume: 38140996219 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.124 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1e33ca1-9427-4405-bd5d-33eaa4c4ef43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57860267109, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.079726', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad61e326-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'af834561825374a45e02934883adc29eeb18e6627e34c022ee8107e109f03a05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.079726', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad61f118-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'b58c12de7c46a1a61403a096ccd08e21a3457a47d94c0015d005513a8a069e34'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38140996219, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.079726', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad64e724-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'dd2ffa8859edab4f54f31bcf6b2516cbb331e4cbcf97fd9b7d12ffcad71f00f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.079726', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad64f494-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': '94775b95590b060d21456a7925fd1b124850844c4f3e6c8286753968684989ff'}]}, 'timestamp': '2025-10-08 16:34:36.124631', '_unique_id': '78ec8f4df78a4098ab3ec04e95dbd556'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.126 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.126 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '678860b3-5fc2-4ebd-84ff-e1be00f379fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.126384', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6546ba-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': '935d211f3c9258d0362c01213b37ad58a752149419ad83ff67e0256cf4bd1bb8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.126384', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad654f7a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': '2d44e1e574764af9a510808bdfcf39d344902a2769fa4c839c23c98c0ee50d89'}]}, 'timestamp': '2025-10-08 16:34:36.126924', '_unique_id': 'a65976d25087494a85cfd588f5921cb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.140 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/memory.usage volume: 226.2734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.151 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/memory.usage volume: 283.09375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6c15fa0-a052-4200-9a92-c25807ed7441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 226.2734375, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'timestamp': '2025-10-08T16:34:36.128123', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'ad676378-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.863116235, 'message_signature': 'a9274a6be36932ec75e54adf192720bfff41207f26eb376617364ec074f24c69'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 283.09375, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 
'955b5604-c52b-4d34-8a59-8e13bc6d2992', 'timestamp': '2025-10-08T16:34:36.128123', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'ad692f28-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.874810219, 'message_signature': 'b6c0e3b964d16234db694ae229a6e62cce89442a84429ed3586290e3ddbd4aa7'}]}, 'timestamp': '2025-10-08 16:34:36.152349', '_unique_id': '4fe86182a31146dc8c9dd4d1b6619c70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.154 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.bytes volume: 135782912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.154 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.155 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.bytes volume: 136644096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.155 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bc1ab0f-a54e-424b-a906-d9503c1aed9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135782912, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.154584', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad69930a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'bf08d7465a72e08ddf3e2880dff6a39962485f204055e22a64c91da04b91a937'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 
'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.154584', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad699e72-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': '9daa2cf125966bc3a8670a240244c261de9f7572b95f0bb4e4c945ed9d46177f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136644096, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.154584', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad69a84a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': '157f70fb3b1036ff46c7e61c26e19ce53ac37502698edcc6109264313e2686c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.154584', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad69b13c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': '628e01e63dad6ff186d63cbebe567952fff9db545d00dd6356b043902426a777'}]}, 'timestamp': '2025-10-08 16:34:36.155641', '_unique_id': '7facea21c823403daf332984f0100fc9'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.166 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.167 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.176 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.177 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '303ca73a-ddff-455e-866c-f99b2ada3d3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.156904', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6b6b76-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': 'f5ce5d76cc14713919d8aef98fee80f442523ac9ba8c3d4c23c89696d5d51121'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 
'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.156904', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6b7c1a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': '8cc2212fe2d65d0df5bb4e6d2e3e9cdcff07356bc770605c6888b0535c175d42'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.156904', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6cf3b0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': 'd2b418b40025b1053172c306664110b7a035862951f7dae6a8e4614a6cfa02c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.156904', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6d01ac-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': '6472cc904c434c161e62b2a8debda5c2076f16ca162d2f05a5170705edcbee42'}]}, 'timestamp': '2025-10-08 16:34:36.177363', '_unique_id': '4fa9fbf72241428fa1b6c0118e5e4139'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.179 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.bytes.delta volume: 28858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.179 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.bytes.delta volume: 149130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7de8698b-b278-44b9-8d9c-111ff8bc0999', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 28858, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.179143', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6d5184-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': '39f4b56a1a5d838fdcdd7fbe62150e01b698dc7a7cbed224a3f2cef459486df7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 
149130, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.179143', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad6d5b52-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': 'c719ea020a560818f857552d132817ce9ef3a48cd94267354d03d7a607c8d8f2'}]}, 'timestamp': '2025-10-08 16:34:36.179648', '_unique_id': 'd249f83d2b824bb097b13cad8601fdda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92183bb0-6b3b-4449-a208-841b8206a30b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.181034', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6d9c2a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': '387d02c45d0a6e27d2ebac76d4b68626c8ee4b38d996017df1d5df99465eac77'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.181034', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad6da4e0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': '964b444bed522481f38bcb9a4d5446fd6d0c4ef3cd14da8d907d2d99c84e70cc'}]}, 'timestamp': '2025-10-08 16:34:36.181523', '_unique_id': '0279f46c9f8142cd8aa1ed1fae60b48c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.182 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.bytes volume: 32336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.182 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.bytes volume: 149130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13a81f17-f095-4a4d-a605-90e8a0170a16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32336, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.182613', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6dd816-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': '138ec1fde6d7a3466fa90581d8410ac283daa03a9dfb1c0ea0e9a1198284fe3d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 
149130, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.182613', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad6de036-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': '8ef0218e61569095eb98dda3a67c3206744a95c051ff021654b4096c1f8f98d7'}]}, 'timestamp': '2025-10-08 16:34:36.183042', '_unique_id': 'c10d76257d864583aadd10124d01515c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.184 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.184 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92b0071f-64ab-42fc-8d9b-113fa8a24134', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.184216', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6e16b4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': 'db244034dd8c1acf7bbf4914fd543bc522b6b7b234bd8d1db74b8925a8422988'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.184216', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad6e1ec0-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': 'a2d75548f9321c2e6983689ca8bebae5bccc22fdf4d0de68696349269c7627e8'}]}, 'timestamp': '2025-10-08 16:34:36.184642', '_unique_id': '047e4ef1431c4620afbc32fedf026b18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.185 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.186 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.186 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.186 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf3cfcb3-b1ad-4e57-81a8-f3e2f59f305e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.185820', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6e55b6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': 'ee64ae8d2a3040dfc30182e8605417160b5909a00735bf6168aaed8a3a9d9a57'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 
'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.185820', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6e604c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': 'b80b4415ff268068d3a3f251de54704064e154253c191f8c693e2d5cd0999ce2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.185820', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6e6826-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': '6956ffbb2fd56843f3b6cbac93f57a877bfb885d37820ec48a033c0a31a18c3d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.185820', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6e70b4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': 'a760623e0bd74e836519942e353ea933ff54b205c54018ddca06539a9c63acb3'}]}, 'timestamp': '2025-10-08 16:34:36.186732', '_unique_id': '57fb3c968a6c454390db921aa4e04463'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.187 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/cpu volume: 42100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/cpu volume: 47240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6893ab21-c68f-40fd-9645-1bb52b182f08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42100000000, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'timestamp': '2025-10-08T16:34:36.187958', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'ad6ea9c6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.863116235, 'message_signature': '41b1836d1561f381e55a359a73e39a59a31e056a1513afbfe147d5a6effc119f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47240000000, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'timestamp': '2025-10-08T16:34:36.187958', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'ad6eb308-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.874810219, 'message_signature': '94588d3fb7df567f164fe08be273f0c66e4f3c69b8965e1636d46fe5a30a0c6b'}]}, 'timestamp': '2025-10-08 16:34:36.188455', '_unique_id': 'd3d35a8ceb2c4e06b08a412eb24bc0bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.189 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.requests volume: 743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.189 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.190 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.requests volume: 810 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.190 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26dcdb2-16a4-4ad1-8c64-e5a5c3ebc0f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 743, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.189686', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6eebfc-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': '77a0158a42713f3b1410643dcbc88c95d46c88e3afa67fa5f8bb2af7df97e935'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.189686', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6ef462-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'ed8c007c32a5d3d7899f358a5d3daf70dd96a0fd6b36d8e8275ca74289c35bf0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 810, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.189686', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6efcfa-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': '7b8641a0db11e5a5c954e042efe0982ea3ffb277120031ef3087077356ff4356'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.189686', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6f061e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'ec01b87e10b7db0dc97d7336142c5803df88bbc64804a0f4028a81e6317c0d62'}]}, 'timestamp': '2025-10-08 16:34:36.190575', '_unique_id': '57872b487f5d4f8f8e768b0d74ea71b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.requests volume: 11556 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.191 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.requests volume: 11689 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.192 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c70b991a-043f-492e-80df-cec0658fdda2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11556, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.191717', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6f3b98-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': '788afa591879d728cf9114612cfec382bf098a82240241a0ff3d112b9af864cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.191717', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6f4322-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'a9841fb794b81f2dcdd53432a0f4c83eb1db77c896c708d192b9cdb1aae2cd7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11689, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.191717', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6f4b88-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'af8635f1800a13c0b340e3f2802dfe2d85ed4168919b0b7dfbdb832eb0b24f00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.191717', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6f5524-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'eca3fd8868a28d8bc7a75bc47fa6843072e980241026d0245939075a0c857a7c'}]}, 'timestamp': '2025-10-08 16:34:36.192582', '_unique_id': '270b45b8b724411c99e145258d4b968a'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.latency volume: 15051252375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.193 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.latency volume: 261691827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.194 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.latency volume: 11606053496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.194 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.latency volume: 206473136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd545811-1e06-4595-9967-7e51bbbc14df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15051252375, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.193741', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6f8a80-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': '8770f6f64edc68b6bc3ab2395d82c0f8a101b64ec0cf949123c983e1d00d7df9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 261691827, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 
'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.193741', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6f9318-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'c5e0a2d5d13fca61cd462d0a5052c331f7ab391f30ec5db39101a3c40de4ac29'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11606053496, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.193741', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad6f9c96-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'a9bfb4c15338ef20177ca11c889ea9e39911fd3b2eca104fa9ab2d06a02d3c50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 206473136, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.193741', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad6fa452-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'bb924c0af5d4f249fe590ce2858826893957ea73368a8956d5fe58cdc6bf2098'}]}, 'timestamp': '2025-10-08 16:34:36.194618', '_unique_id': 'd14114f7a6224d9bbb278aeab258fbd0'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.195 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.packets volume: 73 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.196 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.packets volume: 584 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a400b2f-54af-454a-a205-16295acb44d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 73, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.195851', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad6fdec2-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': 'df26059ac31467d1feb4dfc0542da4f584cf360bcb5cece140536a53d5be650a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 584, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.195851', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad6feb38-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': 'e4263176566d04eb63eab3468e7ff0e5eff9b3f2f4c0b3cc262cf8352eb6345e'}]}, 'timestamp': '2025-10-08 16:34:36.196475', '_unique_id': '1395b75b770c42908ba243209bfaa813'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.197 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.usage volume: 152698880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.198 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.198 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.usage volume: 152633344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.198 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b711b462-ef97-4fca-be75-2657906279a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152698880, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.197946', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad702ecc-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': 'e849d0947f3945751cf07d818e4de300591063b504ed2500a38fb2ed0f9f0142'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 
'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.197946', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad70389a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.87987118, 'message_signature': '559d531ba85814d22726c9aa5f3f0873e43468bffef5ba7944eeb21d3d612ccb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152633344, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.197946', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad704074-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': 'edba891ef3a75f2886d05d233b506499efba5cafc66f50c049224bfe2a21c098'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.197946', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad704826-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.890385746, 'message_signature': '3aa7903670ba7bda40696b95822ab5a357d97575be5271c0a2d9fd0eae1f8568'}]}, 'timestamp': '2025-10-08 16:34:36.198801', '_unique_id': 'ea17ea81cd334b2d867a34950c2953f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.199 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.bytes volume: 327759360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.200 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.200 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.bytes volume: 331064832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.200 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2afa4b67-076b-430e-8326-21610357d344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 327759360, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-vda', 'timestamp': '2025-10-08T16:34:36.199978', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad707ef4-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'd37179c4469fd8901b00c0ce16ad8f4f09e4a685dcaf26818986df45733e0b96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-sda', 'timestamp': '2025-10-08T16:34:36.199978', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'instance-0000005e', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad708700-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.802695894, 'message_signature': 'dfd0ce5e8f071652a1393f11cf79dc455251cbc86832ed9277963fa34efcaea2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331064832, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-vda', 'timestamp': '2025-10-08T16:34:36.199978', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'ad70907e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': 'a7b9a554ab1b40d21c6d85b58e9b524443d26a17a33a737719800b84ea53a374'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992-sda', 'timestamp': '2025-10-08T16:34:36.199978', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'instance-00000060', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'ad709a2e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.827858248, 'message_signature': '602b3a90adf29cf21cc0962e8c55674467f27de3b945c1a46629fe32fb9e499a'}]}, 'timestamp': '2025-10-08 16:34:36.200917', '_unique_id': 'e5dca7adf92d416db20bd9c21b329821'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.incoming.bytes volume: 30170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.incoming.bytes volume: 117390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9256c75-5512-4ffa-85f0-b7eb6eebc43e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30170, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.202071', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad70d156-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': 'adcdd82900c5237328b166abcb0525903aef6bb663ef5203ce47f279fd04214c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 117390, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.202071', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad70d9c6-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': '44a963f1658fa1db695bf3f3e8c4a8a590c36ea9aacdfcfea0fdf2e9d1143c4c'}]}, 'timestamp': '2025-10-08 16:34:36.202540', '_unique_id': 'a2844e5cbb9a410088e49c05190a2405'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.202 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.203 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.203 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets volume: 620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24340fb4-621e-4c9e-8724-1da195d6cd00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.203610', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad710cca-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': 'ac039e00e733cb7c68a56e1ce8d76f921090e906d27372f93fa9a2a8b5688f2d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 620, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.203610', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad71158a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': 'e98fe772c17f3123ab6358a41999cd1a8d1cebe79731a727ef940e4624558fe8'}]}, 'timestamp': '2025-10-08 16:34:36.204071', '_unique_id': '1b96a1a5e79e443996e6779f4d9b23e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.204 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.205 12 DEBUG ceilometer.compute.pollsters [-] 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.205 12 DEBUG ceilometer.compute.pollsters [-] 955b5604-c52b-4d34-8a59-8e13bc6d2992/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98cd9e7d-bfa8-4f56-996d-b0eeb6649765', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-0000005e-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-tapae07e4bd-18', 'timestamp': '2025-10-08T16:34:36.205221', 'resource_metadata': {'display_name': 'tempest-server-test-2100618665', 'name': 'tapae07e4bd-18', 'instance_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:c8:e6:70', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapae07e4bd-18'}, 'message_id': 'ad714b04-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.796454695, 'message_signature': 'd08fc7bcc4c3082b52a213b889b3b5d5e7bcff7246bea428c1cfb974993de83b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '2a670570cd6d44aeb1a34bf3fac55635', 'user_name': None, 'project_id': '3e84a9b599804a5f95722444484bbcee', 'project_name': None, 'resource_id': 'instance-00000060-955b5604-c52b-4d34-8a59-8e13bc6d2992-tapfc3bca7e-b6', 'timestamp': '2025-10-08T16:34:36.205221', 'resource_metadata': {'display_name': 'tempest-server-test-1886042372', 'name': 'tapfc3bca7e-b6', 'instance_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'instance_type': 'custom_neutron_guest', 'host': 'ef33d17403963d541866a306c6d99578292e4a06de226364b139e911', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:99:eb:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfc3bca7e-b6'}, 'message_id': 'ad715342-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8179.798905194, 'message_signature': 'b6839dadde4288ca5305d8251e8aaafc1f335635604eb6eed3b60b1a9d0f3381'}]}, 'timestamp': '2025-10-08 16:34:36.205647', '_unique_id': '03fcd808f8f74a5480e78295b4482c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:34:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:34:36.206 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:34:37 np0005476733 nova_compute[192580]: 2025-10-08 16:34:37.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:39 np0005476733 podman[266146]: 2025-10-08 16:34:39.236868163 +0000 UTC m=+0.060147453 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:34:39 np0005476733 podman[266147]: 2025-10-08 16:34:39.256002664 +0000 UTC m=+0.065843215 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:34:39 np0005476733 podman[266148]: 2025-10-08 16:34:39.257281894 +0000 UTC m=+0.061882108 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 12:34:39 np0005476733 nova_compute[192580]: 2025-10-08 16:34:39.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:40 np0005476733 nova_compute[192580]: 2025-10-08 16:34:40.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:42 np0005476733 nova_compute[192580]: 2025-10-08 16:34:42.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:44 np0005476733 nova_compute[192580]: 2025-10-08 16:34:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:44 np0005476733 nova_compute[192580]: 2025-10-08 16:34:44.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:34:44 np0005476733 nova_compute[192580]: 2025-10-08 16:34:44.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:47 np0005476733 nova_compute[192580]: 2025-10-08 16:34:47.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:48 np0005476733 podman[266207]: 2025-10-08 16:34:48.224047628 +0000 UTC m=+0.051034550 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:34:48 np0005476733 podman[266206]: 2025-10-08 16:34:48.232278681 +0000 UTC m=+0.058579322 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 12:34:48 np0005476733 nova_compute[192580]: 2025-10-08 16:34:48.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:49 np0005476733 nova_compute[192580]: 2025-10-08 16:34:49.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:50 np0005476733 nova_compute[192580]: 2025-10-08 16:34:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:51 np0005476733 nova_compute[192580]: 2025-10-08 16:34:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:52 np0005476733 nova_compute[192580]: 2025-10-08 16:34:52.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:52 np0005476733 nova_compute[192580]: 2025-10-08 16:34:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:34:52 np0005476733 nova_compute[192580]: 2025-10-08 16:34:52.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:34:52 np0005476733 nova_compute[192580]: 2025-10-08 16:34:52.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:34:53 np0005476733 nova_compute[192580]: 2025-10-08 16:34:53.696 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:34:53 np0005476733 nova_compute[192580]: 2025-10-08 16:34:53.697 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:34:53 np0005476733 nova_compute[192580]: 2025-10-08 16:34:53.697 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:34:53 np0005476733 nova_compute[192580]: 2025-10-08 16:34:53.697 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:34:54 np0005476733 nova_compute[192580]: 2025-10-08 16:34:54.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:55 np0005476733 nova_compute[192580]: 2025-10-08 16:34:55.416 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [{"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:34:55 np0005476733 nova_compute[192580]: 2025-10-08 16:34:55.435 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:34:55 np0005476733 nova_compute[192580]: 2025-10-08 16:34:55.436 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:34:56 np0005476733 podman[266251]: 2025-10-08 16:34:56.218982574 +0000 UTC m=+0.053072726 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 12:34:57 np0005476733 nova_compute[192580]: 2025-10-08 16:34:57.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:34:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:34:59Z|00057|pinctrl|WARN|Dropped 53 log messages in last 62 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 12:34:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:34:59Z|00058|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:34:59 np0005476733 podman[266271]: 2025-10-08 16:34:59.243395222 +0000 UTC m=+0.071579127 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:34:59 np0005476733 nova_compute[192580]: 2025-10-08 16:34:59.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:02 np0005476733 podman[266297]: 2025-10-08 16:35:02.221603525 +0000 UTC m=+0.048727578 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.623 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.694 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.755 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.756 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.809 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.816 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.869 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.870 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:35:02 np0005476733 nova_compute[192580]: 2025-10-08 16:35:02.929 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.071 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.072 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11962MB free_disk=111.02676773071289GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.164 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.165 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 955b5604-c52b-4d34-8a59-8e13bc6d2992 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.165 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.165 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.262 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.280 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.282 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:35:03 np0005476733 nova_compute[192580]: 2025-10-08 16:35:03.282 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:04 np0005476733 nova_compute[192580]: 2025-10-08 16:35:04.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:06 np0005476733 nova_compute[192580]: 2025-10-08 16:35:06.283 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:07 np0005476733 nova_compute[192580]: 2025-10-08 16:35:07.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:09 np0005476733 nova_compute[192580]: 2025-10-08 16:35:09.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:10 np0005476733 podman[266331]: 2025-10-08 16:35:10.224331889 +0000 UTC m=+0.054729409 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:35:10 np0005476733 podman[266330]: 2025-10-08 16:35:10.227655485 +0000 UTC m=+0.058676495 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:35:10 np0005476733 podman[266332]: 2025-10-08 16:35:10.230250048 +0000 UTC m=+0.054554804 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6)
Oct  8 12:35:10 np0005476733 nova_compute[192580]: 2025-10-08 16:35:10.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.635 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.635 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.635 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.636 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.636 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.637 2 INFO nova.compute.manager [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Terminating instance#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.638 2 DEBUG nova.compute.manager [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:35:11 np0005476733 kernel: tapfc3bca7e-b6 (unregistering): left promiscuous mode
Oct  8 12:35:11 np0005476733 NetworkManager[51699]: <info>  [1759941311.6661] device (tapfc3bca7e-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:35:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:11Z|00059|binding|INFO|Releasing lport fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b from this chassis (sb_readonly=0)
Oct  8 12:35:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:11Z|00060|binding|INFO|Setting lport fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b down in Southbound
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:11Z|00061|binding|INFO|Removing iface tapfc3bca7e-b6 ovn-installed in OVS
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.683 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:eb:59 10.100.0.9'], port_security=['fa:16:3e:99:eb:59 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '955b5604-c52b-4d34-8a59-8e13bc6d2992', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e84a9b599804a5f95722444484bbcee', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3968eba-c424-466b-8079-cc800da29d95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29408260-4b02-46cd-a4a1-515da7300ce7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.684 103739 INFO neutron.agent.ovn.metadata.agent [-] Port fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b in datapath c62ce41c-4d20-4575-9fd8-d6f63a77e629 unbound from our chassis#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.685 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c62ce41c-4d20-4575-9fd8-d6f63a77e629, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.687 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d27bdf-dc02-494b-86de-b694cc709b67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.687 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629 namespace which is not needed anymore#033[00m
Oct  8 12:35:11 np0005476733 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000060.scope: Deactivated successfully.
Oct  8 12:35:11 np0005476733 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000060.scope: Consumed 51.128s CPU time.
Oct  8 12:35:11 np0005476733 systemd-machined[152624]: Machine qemu-59-instance-00000060 terminated.
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [NOTICE]   (265198) : haproxy version is 2.8.14-c23fe91
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [NOTICE]   (265198) : path to executable is /usr/sbin/haproxy
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [WARNING]  (265198) : Exiting Master process...
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [WARNING]  (265198) : Exiting Master process...
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [ALERT]    (265198) : Current worker (265200) exited with code 143 (Terminated)
Oct  8 12:35:11 np0005476733 neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629[265194]: [WARNING]  (265198) : All workers exited. Exiting... (0)
Oct  8 12:35:11 np0005476733 systemd[1]: libpod-609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb.scope: Deactivated successfully.
Oct  8 12:35:11 np0005476733 podman[266415]: 2025-10-08 16:35:11.813831863 +0000 UTC m=+0.047868701 container died 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:35:11 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb-userdata-shm.mount: Deactivated successfully.
Oct  8 12:35:11 np0005476733 systemd[1]: var-lib-containers-storage-overlay-4fc78afbcbcc0135a383db0e3991f6eeef817018c5d8f9ecc1320bbc1fcd9180-merged.mount: Deactivated successfully.
Oct  8 12:35:11 np0005476733 podman[266415]: 2025-10-08 16:35:11.849955767 +0000 UTC m=+0.083992575 container cleanup 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 12:35:11 np0005476733 systemd[1]: libpod-conmon-609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb.scope: Deactivated successfully.
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.900 2 INFO nova.virt.libvirt.driver [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Instance destroyed successfully.#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.902 2 DEBUG nova.objects.instance [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'resources' on Instance uuid 955b5604-c52b-4d34-8a59-8e13bc6d2992 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:35:11 np0005476733 podman[266451]: 2025-10-08 16:35:11.914184889 +0000 UTC m=+0.042136227 container remove 609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.920 2 DEBUG nova.virt.libvirt.vif [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:32:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1886042372',display_name='tempest-server-test-1886042372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1886042372',id=96,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:32:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-2pjrb93y',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',imag
e_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:32:11Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=955b5604-c52b-4d34-8a59-8e13bc6d2992,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.920 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[def66c79-f009-472c-8922-5b360710e84d]: (4, ('Wed Oct  8 04:35:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629 (609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb)\n609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb\nWed Oct  8 04:35:11 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629 (609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb)\n609e0f74b2d941139b30e6a8cfacfee115185bb9b282f7c8a889d5f278425eeb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.921 2 DEBUG nova.network.os_vif_util [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "address": "fa:16:3e:99:eb:59", "network": {"id": "c62ce41c-4d20-4575-9fd8-d6f63a77e629", "bridge": "br-int", "label": "tempest-test-network--1825912703", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e84a9b599804a5f95722444484bbcee", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc3bca7e-b6", "ovs_interfaceid": "fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.922 2 DEBUG nova.network.os_vif_util [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.922 2 DEBUG os_vif [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.922 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df8640c3-82d1-45a9-9823-1c59fc56cd0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.923 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc62ce41c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc3bca7e-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:35:11 np0005476733 kernel: tapc62ce41c-40: left promiscuous mode
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.940 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fd115ad7-c271-4f66-9454-d62765f3421b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.940 2 INFO os_vif [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:eb:59,bridge_name='br-int',has_traffic_filtering=True,id=fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b,network=Network(c62ce41c-4d20-4575-9fd8-d6f63a77e629),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc3bca7e-b6')#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.941 2 INFO nova.virt.libvirt.driver [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Deleting instance files /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992_del#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.942 2 INFO nova.virt.libvirt.driver [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Deletion of /var/lib/nova/instances/955b5604-c52b-4d34-8a59-8e13bc6d2992_del complete#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.971 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef16fef-eb80-4fb3-aa2b-7e697364bbe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.973 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[de0e9007-d814-473d-9f9e-df0d8b45d9e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.988 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a88f3166-cbb2-4b07-8932-a10605132420]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 803290, 'reachable_time': 17317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266476, 'error': None, 'target': 'ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 systemd[1]: run-netns-ovnmeta\x2dc62ce41c\x2d4d20\x2d4575\x2d9fd8\x2dd6f63a77e629.mount: Deactivated successfully.
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.991 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c62ce41c-4d20-4575-9fd8-d6f63a77e629 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:35:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:11.991 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[ab637544-71ce-4e5e-bb74-6de26679f662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.994 2 INFO nova.compute.manager [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.995 2 DEBUG oslo.service.loopingcall [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.995 2 DEBUG nova.compute.manager [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:35:11 np0005476733 nova_compute[192580]: 2025-10-08 16:35:11.995 2 DEBUG nova.network.neutron [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.767 2 DEBUG nova.compute.manager [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-unplugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.768 2 DEBUG oslo_concurrency.lockutils [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.768 2 DEBUG oslo_concurrency.lockutils [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.769 2 DEBUG oslo_concurrency.lockutils [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.769 2 DEBUG nova.compute.manager [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] No waiting events found dispatching network-vif-unplugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:35:12 np0005476733 nova_compute[192580]: 2025-10-08 16:35:12.769 2 DEBUG nova.compute.manager [req-891d31ff-5fd3-407b-a31c-98f1dd238243 req-b4af1188-f078-4fe2-9b6f-ea8077fb092d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-unplugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:35:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:13.632 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:35:13 np0005476733 nova_compute[192580]: 2025-10-08 16:35:13.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:13 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:13.634 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.065 2 DEBUG nova.network.neutron [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.086 2 INFO nova.compute.manager [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Took 2.09 seconds to deallocate network for instance.#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.140 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.140 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.220 2 DEBUG nova.compute.provider_tree [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.240 2 DEBUG nova.scheduler.client.report [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.267 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.291 2 INFO nova.scheduler.client.report [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Deleted allocations for instance 955b5604-c52b-4d34-8a59-8e13bc6d2992#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.356 2 DEBUG oslo_concurrency.lockutils [None req-8b18c4dc-831e-46e6-9a41-008c3b532cd6 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.883 2 DEBUG nova.compute.manager [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.884 2 DEBUG oslo_concurrency.lockutils [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.884 2 DEBUG oslo_concurrency.lockutils [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.884 2 DEBUG oslo_concurrency.lockutils [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "955b5604-c52b-4d34-8a59-8e13bc6d2992-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.885 2 DEBUG nova.compute.manager [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] No waiting events found dispatching network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.885 2 WARNING nova.compute.manager [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received unexpected event network-vif-plugged-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:35:14 np0005476733 nova_compute[192580]: 2025-10-08 16:35:14.885 2 DEBUG nova.compute.manager [req-5206ac03-eb61-42a2-b1b9-ec8bd2246b86 req-ede43774-b52d-4923-9f5d-c0f2b08386b4 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Received event network-vif-deleted-fc3bca7e-b6c6-4d2a-92ad-f97b5b9cf04b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:16 np0005476733 nova_compute[192580]: 2025-10-08 16:35:16.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.196 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.196 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.196 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.196 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.197 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.198 2 INFO nova.compute.manager [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Terminating instance#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.198 2 DEBUG nova.compute.manager [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:35:18 np0005476733 kernel: tapae07e4bd-18 (unregistering): left promiscuous mode
Oct  8 12:35:18 np0005476733 NetworkManager[51699]: <info>  [1759941318.2256] device (tapae07e4bd-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:35:18 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:18Z|00062|binding|INFO|Releasing lport ae07e4bd-18d5-41c3-ade8-9cad83e6d345 from this chassis (sb_readonly=0)
Oct  8 12:35:18 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:18Z|00063|binding|INFO|Setting lport ae07e4bd-18d5-41c3-ade8-9cad83e6d345 down in Southbound
Oct  8 12:35:18 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:18Z|00064|binding|INFO|Removing iface tapae07e4bd-18 ovn-installed in OVS
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.244 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:e6:70 192.168.122.229'], port_security=['fa:16:3e:c8:e6:70 192.168.122.229'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.229/24', 'neutron:device_id': '3dbbed4f-f648-4fb9-ad0e-9c718fbbd525', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e84a9b599804a5f95722444484bbcee', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3968eba-c424-466b-8079-cc800da29d95', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=ae07e4bd-18d5-41c3-ade8-9cad83e6d345) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.245 103739 INFO neutron.agent.ovn.metadata.agent [-] Port ae07e4bd-18d5-41c3-ade8-9cad83e6d345 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.246 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.247 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[10a045a8-da65-40b1-af38-98dca67091ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.247 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Oct  8 12:35:18 np0005476733 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000005e.scope: Consumed 46.600s CPU time.
Oct  8 12:35:18 np0005476733 systemd-machined[152624]: Machine qemu-58-instance-0000005e terminated.
Oct  8 12:35:18 np0005476733 podman[266484]: 2025-10-08 16:35:18.337918276 +0000 UTC m=+0.071247967 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:35:18 np0005476733 podman[266481]: 2025-10-08 16:35:18.356838371 +0000 UTC m=+0.090659818 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [NOTICE]   (264866) : haproxy version is 2.8.14-c23fe91
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [NOTICE]   (264866) : path to executable is /usr/sbin/haproxy
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [WARNING]  (264866) : Exiting Master process...
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [WARNING]  (264866) : Exiting Master process...
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [ALERT]    (264866) : Current worker (264868) exited with code 143 (Terminated)
Oct  8 12:35:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[264862]: [WARNING]  (264866) : All workers exited. Exiting... (0)
Oct  8 12:35:18 np0005476733 systemd[1]: libpod-94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa.scope: Deactivated successfully.
Oct  8 12:35:18 np0005476733 podman[266545]: 2025-10-08 16:35:18.400507575 +0000 UTC m=+0.047169108 container died 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa-userdata-shm.mount: Deactivated successfully.
Oct  8 12:35:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-7caf3f365bceec9287192091917d73e03933bd5842193d7c280843330dd321a8-merged.mount: Deactivated successfully.
Oct  8 12:35:18 np0005476733 podman[266545]: 2025-10-08 16:35:18.434345976 +0000 UTC m=+0.081007499 container cleanup 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:35:18 np0005476733 systemd[1]: libpod-conmon-94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa.scope: Deactivated successfully.
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.467 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Instance destroyed successfully.#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.468 2 DEBUG nova.objects.instance [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lazy-loading 'resources' on Instance uuid 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.480 2 DEBUG nova.virt.libvirt.vif [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-2100618665',display_name='tempest-server-test-2100618665',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-2100618665',id=94,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKjthntCRZsyp63+354x4Tysej4gPAk5Si8K+FPukJ5+b5PYBTJpcqDxUbi+6C7pe5FLeDw6Z8lshC7Q+5m9uZ+mbSRetxFueZc0ihdj/RdazyUIVVefOeKslAmQPSNYjg==',key_name='tempest-keypair-test-219561444',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:31:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e84a9b599804a5f95722444484bbcee',ramdisk_id='',reservation_id='r-1cxxa52z',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',imag
e_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestIcmp-9904054',owner_user_name='tempest-GatewayMtuTestIcmp-9904054-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:31:35Z,user_data=None,user_id='2a670570cd6d44aeb1a34bf3fac55635',uuid=3dbbed4f-f648-4fb9-ad0e-9c718fbbd525,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.481 2 DEBUG nova.network.os_vif_util [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converting VIF {"id": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "address": "fa:16:3e:c8:e6:70", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.229", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae07e4bd-18", "ovs_interfaceid": "ae07e4bd-18d5-41c3-ade8-9cad83e6d345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.481 2 DEBUG nova.network.os_vif_util [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.482 2 DEBUG os_vif [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae07e4bd-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.488 2 INFO os_vif [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:e6:70,bridge_name='br-int',has_traffic_filtering=True,id=ae07e4bd-18d5-41c3-ade8-9cad83e6d345,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae07e4bd-18')#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.488 2 INFO nova.virt.libvirt.driver [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Deleting instance files /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525_del#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.489 2 INFO nova.virt.libvirt.driver [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Deletion of /var/lib/nova/instances/3dbbed4f-f648-4fb9-ad0e-9c718fbbd525_del complete#033[00m
Oct  8 12:35:18 np0005476733 podman[266586]: 2025-10-08 16:35:18.496466701 +0000 UTC m=+0.040933038 container remove 94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.502 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4672b1d4-c40f-4aea-91c5-aa6662b80857]: (4, ('Wed Oct  8 04:35:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa)\n94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa\nWed Oct  8 04:35:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa)\n94bb72e312440653a2bd14965bba5d7195779863a6781d3eb4dfc568b60774fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.503 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33ce6337-8175-417d-a50b-06197e060e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.504 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.518 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32150a08-42e0-49e6-8ab5-0038a5c8eb9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.539 2 INFO nova.compute.manager [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.540 2 DEBUG oslo.service.loopingcall [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.540 2 DEBUG nova.compute.manager [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:35:18 np0005476733 nova_compute[192580]: 2025-10-08 16:35:18.541 2 DEBUG nova.network.neutron [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.553 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c15b725c-eecf-4249-82b4-88568522a1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.555 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91d1d52c-1d9b-4f70-9f85-cf6c13dfdd83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.570 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c30ba69-415d-4b0e-a900-9f087d6f44a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 799837, 'reachable_time': 44052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266605, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.573 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:35:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:18.573 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fbf399-1092-4a8d-b2aa-790dbbbe3163]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:35:18 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.077 2 DEBUG nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-unplugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.077 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.078 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.079 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.079 2 DEBUG nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] No waiting events found dispatching network-vif-unplugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.079 2 DEBUG nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-unplugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.080 2 DEBUG nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.080 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.081 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.081 2 DEBUG oslo_concurrency.lockutils [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.082 2 DEBUG nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] No waiting events found dispatching network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.082 2 WARNING nova.compute.manager [req-4726818f-7e08-4430-8866-dabffa3973ba req-32ffd834-e741-4953-be77-2e978b9c2b5d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received unexpected event network-vif-plugged-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 for instance with vm_state active and task_state deleting.#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.172 2 DEBUG nova.network.neutron [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.195 2 INFO nova.compute.manager [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Took 0.65 seconds to deallocate network for instance.#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.236 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.237 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.287 2 DEBUG nova.compute.provider_tree [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.302 2 DEBUG nova.scheduler.client.report [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.326 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.350 2 INFO nova.scheduler.client.report [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Deleted allocations for instance 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.403 2 DEBUG oslo_concurrency.lockutils [None req-b3f1f583-f65e-4e2d-af11-ea3fe966c926 2a670570cd6d44aeb1a34bf3fac55635 3e84a9b599804a5f95722444484bbcee - - default default] Lock "3dbbed4f-f648-4fb9-ad0e-9c718fbbd525" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:19 np0005476733 nova_compute[192580]: 2025-10-08 16:35:19.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:21 np0005476733 nova_compute[192580]: 2025-10-08 16:35:21.196 2 DEBUG nova.compute.manager [req-e0b30334-e56e-48a3-b7a1-696d4637835b req-20dc0342-253b-4b0e-a0cc-f7ba47233697 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Received event network-vif-deleted-ae07e4bd-18d5-41c3-ade8-9cad83e6d345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:35:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:22.637 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:35:23 np0005476733 nova_compute[192580]: 2025-10-08 16:35:23.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:24 np0005476733 nova_compute[192580]: 2025-10-08 16:35:24.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:26.394 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:26.395 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:35:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:35:26.395 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:35:26 np0005476733 nova_compute[192580]: 2025-10-08 16:35:26.899 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941311.8976958, 955b5604-c52b-4d34-8a59-8e13bc6d2992 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:35:26 np0005476733 nova_compute[192580]: 2025-10-08 16:35:26.899 2 INFO nova.compute.manager [-] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:35:26 np0005476733 nova_compute[192580]: 2025-10-08 16:35:26.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:26 np0005476733 nova_compute[192580]: 2025-10-08 16:35:26.923 2 DEBUG nova.compute.manager [None req-9d11f83c-a773-4d88-bf54-fab2316f0d08 - - - - - -] [instance: 955b5604-c52b-4d34-8a59-8e13bc6d2992] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:35:26 np0005476733 nova_compute[192580]: 2025-10-08 16:35:26.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:27 np0005476733 podman[266611]: 2025-10-08 16:35:27.237248415 +0000 UTC m=+0.059249404 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  8 12:35:28 np0005476733 nova_compute[192580]: 2025-10-08 16:35:28.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:29 np0005476733 nova_compute[192580]: 2025-10-08 16:35:29.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:30 np0005476733 podman[266633]: 2025-10-08 16:35:30.307256591 +0000 UTC m=+0.127286667 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:35:33 np0005476733 podman[266659]: 2025-10-08 16:35:33.237380349 +0000 UTC m=+0.057778248 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 12:35:33 np0005476733 nova_compute[192580]: 2025-10-08 16:35:33.466 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941318.4652665, 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:35:33 np0005476733 nova_compute[192580]: 2025-10-08 16:35:33.467 2 INFO nova.compute.manager [-] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:35:33 np0005476733 nova_compute[192580]: 2025-10-08 16:35:33.494 2 DEBUG nova.compute.manager [None req-abe5a493-3b76-4042-9ac5-b51a0b83430d - - - - - -] [instance: 3dbbed4f-f648-4fb9-ad0e-9c718fbbd525] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:35:33 np0005476733 nova_compute[192580]: 2025-10-08 16:35:33.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:34 np0005476733 nova_compute[192580]: 2025-10-08 16:35:34.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:35 np0005476733 nova_compute[192580]: 2025-10-08 16:35:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:38 np0005476733 nova_compute[192580]: 2025-10-08 16:35:38.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:39 np0005476733 nova_compute[192580]: 2025-10-08 16:35:39.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:41 np0005476733 podman[266684]: 2025-10-08 16:35:41.241787546 +0000 UTC m=+0.063103347 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:35:41 np0005476733 podman[266691]: 2025-10-08 16:35:41.25285958 +0000 UTC m=+0.060392071 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64)
Oct  8 12:35:41 np0005476733 podman[266685]: 2025-10-08 16:35:41.268174959 +0000 UTC m=+0.082846058 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:35:41 np0005476733 nova_compute[192580]: 2025-10-08 16:35:41.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:43 np0005476733 nova_compute[192580]: 2025-10-08 16:35:43.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:44 np0005476733 nova_compute[192580]: 2025-10-08 16:35:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:44 np0005476733 nova_compute[192580]: 2025-10-08 16:35:44.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:35:44 np0005476733 nova_compute[192580]: 2025-10-08 16:35:44.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:48 np0005476733 systemd-logind[827]: New session 161 of user zuul.
Oct  8 12:35:48 np0005476733 systemd[1]: Started Session 161 of User zuul.
Oct  8 12:35:48 np0005476733 systemd[1]: session-161.scope: Deactivated successfully.
Oct  8 12:35:48 np0005476733 systemd-logind[827]: Session 161 logged out. Waiting for processes to exit.
Oct  8 12:35:48 np0005476733 systemd-logind[827]: Removed session 161.
Oct  8 12:35:48 np0005476733 nova_compute[192580]: 2025-10-08 16:35:48.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:49 np0005476733 podman[266778]: 2025-10-08 16:35:49.226926287 +0000 UTC m=+0.059431839 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:35:49 np0005476733 podman[266777]: 2025-10-08 16:35:49.22854611 +0000 UTC m=+0.061289109 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid)
Oct  8 12:35:49 np0005476733 nova_compute[192580]: 2025-10-08 16:35:49.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:50 np0005476733 nova_compute[192580]: 2025-10-08 16:35:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:50 np0005476733 nova_compute[192580]: 2025-10-08 16:35:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:50Z|00065|pinctrl|WARN|Dropped 421 log messages in last 52 seconds (most recently, 17 seconds ago) due to excessive rate
Oct  8 12:35:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:50Z|00066|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:35:52 np0005476733 nova_compute[192580]: 2025-10-08 16:35:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:52 np0005476733 nova_compute[192580]: 2025-10-08 16:35:52.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:35:52 np0005476733 nova_compute[192580]: 2025-10-08 16:35:52.630 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:35:52 np0005476733 nova_compute[192580]: 2025-10-08 16:35:52.630 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:35:53 np0005476733 nova_compute[192580]: 2025-10-08 16:35:53.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:54 np0005476733 nova_compute[192580]: 2025-10-08 16:35:54.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:35:57Z|00067|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  8 12:35:58 np0005476733 podman[266823]: 2025-10-08 16:35:58.260649382 +0000 UTC m=+0.073008134 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:35:58 np0005476733 nova_compute[192580]: 2025-10-08 16:35:58.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:35:59 np0005476733 nova_compute[192580]: 2025-10-08 16:35:59.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:01 np0005476733 podman[266842]: 2025-10-08 16:36:01.270275618 +0000 UTC m=+0.100635126 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 12:36:03 np0005476733 nova_compute[192580]: 2025-10-08 16:36:03.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:04 np0005476733 podman[266868]: 2025-10-08 16:36:04.256952031 +0000 UTC m=+0.086262362 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.626 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.785 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.785 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13644MB free_disk=111.31270217895508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.786 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.786 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.904 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.905 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.927 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:36:04 np0005476733 nova_compute[192580]: 2025-10-08 16:36:04.955 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:36:05 np0005476733 nova_compute[192580]: 2025-10-08 16:36:05.004 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:36:05 np0005476733 nova_compute[192580]: 2025-10-08 16:36:05.004 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:07 np0005476733 nova_compute[192580]: 2025-10-08 16:36:07.004 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:08 np0005476733 nova_compute[192580]: 2025-10-08 16:36:08.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:09 np0005476733 nova_compute[192580]: 2025-10-08 16:36:09.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:12 np0005476733 podman[266891]: 2025-10-08 16:36:12.217850869 +0000 UTC m=+0.047436593 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:36:12 np0005476733 podman[266890]: 2025-10-08 16:36:12.224939866 +0000 UTC m=+0.058029782 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 12:36:12 np0005476733 podman[266892]: 2025-10-08 16:36:12.240713449 +0000 UTC m=+0.062549686 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Oct  8 12:36:13 np0005476733 nova_compute[192580]: 2025-10-08 16:36:13.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:14 np0005476733 nova_compute[192580]: 2025-10-08 16:36:14.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:15 np0005476733 nova_compute[192580]: 2025-10-08 16:36:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:15 np0005476733 nova_compute[192580]: 2025-10-08 16:36:15.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.039 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.040 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.101 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.200 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.200 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.208 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.208 2 INFO nova.compute.claims [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.359 2 DEBUG nova.compute.provider_tree [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.378 2 DEBUG nova.scheduler.client.report [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.409 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.410 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.462 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.462 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.485 2 INFO nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.507 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.628 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.630 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.630 2 INFO nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Creating image(s)#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.631 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.632 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.633 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.660 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.732 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.734 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.736 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.763 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.828 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.830 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.875 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk 10737418240" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.877 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.878 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.936 2 DEBUG nova.policy [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.941 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.942 2 DEBUG nova.objects.instance [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'migration_context' on Instance uuid d2243acb-4779-458d-9e3f-4a5434d27bfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.961 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.962 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Ensure instance console log exists: /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.963 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.963 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:18 np0005476733 nova_compute[192580]: 2025-10-08 16:36:18.963 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:19 np0005476733 nova_compute[192580]: 2025-10-08 16:36:19.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:20 np0005476733 podman[266964]: 2025-10-08 16:36:20.23501347 +0000 UTC m=+0.049746827 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:36:20 np0005476733 podman[266963]: 2025-10-08 16:36:20.242274283 +0000 UTC m=+0.057586448 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 12:36:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:20.646 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:36:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:20.646 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:36:20 np0005476733 nova_compute[192580]: 2025-10-08 16:36:20.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:23 np0005476733 nova_compute[192580]: 2025-10-08 16:36:23.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:23.649 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:24 np0005476733 nova_compute[192580]: 2025-10-08 16:36:24.603 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Successfully created port: e3be517f-e639-485a-ab91-7dc0a7be7433 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:36:24 np0005476733 nova_compute[192580]: 2025-10-08 16:36:24.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:26.395 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:26.395 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:26.395 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.654 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Successfully updated port: e3be517f-e639-485a-ab91-7dc0a7be7433 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.671 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.672 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquired lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.672 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.796 2 DEBUG nova.compute.manager [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-changed-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.797 2 DEBUG nova.compute.manager [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Refreshing instance network info cache due to event network-changed-e3be517f-e639-485a-ab91-7dc0a7be7433. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:36:27 np0005476733 nova_compute[192580]: 2025-10-08 16:36:27.797 2 DEBUG oslo_concurrency.lockutils [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:36:28 np0005476733 nova_compute[192580]: 2025-10-08 16:36:28.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:28 np0005476733 nova_compute[192580]: 2025-10-08 16:36:28.641 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:36:29 np0005476733 podman[267007]: 2025-10-08 16:36:29.267619683 +0000 UTC m=+0.088564974 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent)
Oct  8 12:36:29 np0005476733 nova_compute[192580]: 2025-10-08 16:36:29.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.901 2 DEBUG nova.network.neutron [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updating instance_info_cache with network_info: [{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.926 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Releasing lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.927 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Instance network_info: |[{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.927 2 DEBUG oslo_concurrency.lockutils [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.928 2 DEBUG nova.network.neutron [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Refreshing network info cache for port e3be517f-e639-485a-ab91-7dc0a7be7433 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.930 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Start _get_guest_xml network_info=[{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.934 2 WARNING nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.945 2 DEBUG nova.virt.libvirt.host [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.945 2 DEBUG nova.virt.libvirt.host [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.950 2 DEBUG nova.virt.libvirt.host [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.951 2 DEBUG nova.virt.libvirt.host [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.951 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.951 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.952 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.952 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.952 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.952 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.952 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.953 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.953 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.953 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.953 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.953 2 DEBUG nova.virt.hardware [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.956 2 DEBUG nova.virt.libvirt.vif [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1966949606',display_name='tempest-server-test-1966949606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1966949606',id=97,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-fbooillz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:36:18Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=d2243acb-4779-458d-9e3f-4a5434d27bfb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.957 2 DEBUG nova.network.os_vif_util [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.957 2 DEBUG nova.network.os_vif_util [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.958 2 DEBUG nova.objects.instance [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'pci_devices' on Instance uuid d2243acb-4779-458d-9e3f-4a5434d27bfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.976 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <uuid>d2243acb-4779-458d-9e3f-4a5434d27bfb</uuid>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <name>instance-00000061</name>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1966949606</nova:name>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:36:30</nova:creationTime>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:user uuid="de0012a12c1645bfb620caa34110c3f4">tempest-GatewayMtuTestUdp-187807839-project-member</nova:user>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:project uuid="4726c8b7a2a3405b9b2d689862918f5d">tempest-GatewayMtuTestUdp-187807839</nova:project>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        <nova:port uuid="e3be517f-e639-485a-ab91-7dc0a7be7433">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.208" ipVersion="4"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="serial">d2243acb-4779-458d-9e3f-4a5434d27bfb</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="uuid">d2243acb-4779-458d-9e3f-4a5434d27bfb</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.config"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:93:e1:e1"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <mtu size="1312"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <target dev="tape3be517f-e6"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/console.log" append="off"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:36:30 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:36:30 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:36:30 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:36:30 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.977 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Preparing to wait for external event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.977 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.977 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.977 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.978 2 DEBUG nova.virt.libvirt.vif [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1966949606',display_name='tempest-server-test-1966949606',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1966949606',id=97,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-fbooillz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:36:18Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=d2243acb-4779-458d-9e3f-4a5434d27bfb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.978 2 DEBUG nova.network.os_vif_util [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.979 2 DEBUG nova.network.os_vif_util [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.979 2 DEBUG os_vif [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3be517f-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3be517f-e6, col_values=(('external_ids', {'iface-id': 'e3be517f-e639-485a-ab91-7dc0a7be7433', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:e1:e1', 'vm-uuid': 'd2243acb-4779-458d-9e3f-4a5434d27bfb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:30 np0005476733 nova_compute[192580]: 2025-10-08 16:36:30.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:36:31 np0005476733 NetworkManager[51699]: <info>  [1759941391.0016] manager: (tape3be517f-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.006 2 INFO os_vif [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6')#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.054 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.054 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.054 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No VIF found with MAC fa:16:3e:93:e1:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.055 2 INFO nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Using config drive#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.827 2 INFO nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Creating config drive at /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.config#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.832 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpowuvqa1w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:36:31 np0005476733 nova_compute[192580]: 2025-10-08 16:36:31.958 2 DEBUG oslo_concurrency.processutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpowuvqa1w" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:36:32 np0005476733 kernel: tape3be517f-e6: entered promiscuous mode
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.0381] manager: (tape3be517f-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:32Z|00068|binding|INFO|Claiming lport e3be517f-e639-485a-ab91-7dc0a7be7433 for this chassis.
Oct  8 12:36:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:32Z|00069|binding|INFO|e3be517f-e639-485a-ab91-7dc0a7be7433: Claiming fa:16:3e:93:e1:e1 192.168.122.208
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.0988] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.0994] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.100 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:e1:e1 192.168.122.208'], port_security=['fa:16:3e:93:e1:e1 192.168.122.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.208/24', 'neutron:device_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e3be517f-e639-485a-ab91-7dc0a7be7433) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.101 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e3be517f-e639-485a-ab91-7dc0a7be7433 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.103 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.114 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e373ae-d7c4-4596-b178-e9a82cf6f7dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.115 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:36:32 np0005476733 systemd-udevd[267059]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.117 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.117 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1af41319-b7bf-49e1-b45d-aa15dbc27040]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.117 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7218d475-dba2-492b-8d9e-0598613cb2a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 systemd-machined[152624]: New machine qemu-60-instance-00000061.
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.128 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[02a61ada-eaac-4342-a109-c53a4f2a4d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.1341] device (tape3be517f-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.1347] device (tape3be517f-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.160 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[087c0bc4-4ff2-4817-9a7e-544e21e6c6da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 systemd[1]: Started Virtual Machine qemu-60-instance-00000061.
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.191 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f3205d-3047-4f25-81c6-aa0f105320a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:32Z|00070|binding|INFO|Setting lport e3be517f-e639-485a-ab91-7dc0a7be7433 ovn-installed in OVS
Oct  8 12:36:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:32Z|00071|binding|INFO|Setting lport e3be517f-e639-485a-ab91-7dc0a7be7433 up in Southbound
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.2044] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.203 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[84bd7567-8452-4d83-b4c5-86bcd4d193e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 systemd-udevd[267068]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:36:32 np0005476733 podman[267039]: 2025-10-08 16:36:32.21466764 +0000 UTC m=+0.181286482 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.237 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb316dc-cc47-4d34-98f0-5b432d74ffa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.241 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5ef9bf-44cb-416e-bd9d-6b053b341f94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.2634] device (tap81c575b5-a0): carrier: link connected
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.269 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[35f492f5-a910-49ec-974b-482248f7ab86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.287 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8df2a4b9-3c35-4edb-a0c7-cb1fe8017284]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829592, 'reachable_time': 40483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267108, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.302 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[058ff10c-3164-4ce1-9fb9-31bdfce0a766]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 829592, 'tstamp': 829592}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267109, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.319 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c198a217-143a-4d09-8caf-3c5dcbe5c7f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829592, 'reachable_time': 40483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267110, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.348 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fef44187-cfd3-4ee2-b7c0-804092517bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.404 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b15de7f4-6ed9-4439-b226-517d19b4a697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.406 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.407 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.408 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 12:36:32 np0005476733 NetworkManager[51699]: <info>  [1759941392.4120] manager: (tap81c575b5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.414 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:32Z|00072|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.416 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.417 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[33853c1e-6c7e-41db-a4e0-a2a9e9617a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.417 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:36:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:36:32.418 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:32 np0005476733 podman[267148]: 2025-10-08 16:36:32.794869822 +0000 UTC m=+0.051048250 container create af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:36:32 np0005476733 systemd[1]: Started libpod-conmon-af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad.scope.
Oct  8 12:36:32 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:36:32 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f415434c086b7662bf565b07aba6692bae4b35f8fc8e3467046183891911f8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:36:32 np0005476733 podman[267148]: 2025-10-08 16:36:32.768399827 +0000 UTC m=+0.024578275 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:36:32 np0005476733 podman[267148]: 2025-10-08 16:36:32.876863896 +0000 UTC m=+0.133042344 container init af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 12:36:32 np0005476733 podman[267148]: 2025-10-08 16:36:32.882756134 +0000 UTC m=+0.138934562 container start af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.890 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941392.8895857, d2243acb-4779-458d-9e3f-4a5434d27bfb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.890 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] VM Started (Lifecycle Event)#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.896 2 DEBUG nova.compute.manager [req-f17612c3-5161-4910-83c6-62d94120d750 req-d3c1ed29-154a-4fd0-a34d-b16e247fa0e7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.897 2 DEBUG oslo_concurrency.lockutils [req-f17612c3-5161-4910-83c6-62d94120d750 req-d3c1ed29-154a-4fd0-a34d-b16e247fa0e7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.897 2 DEBUG oslo_concurrency.lockutils [req-f17612c3-5161-4910-83c6-62d94120d750 req-d3c1ed29-154a-4fd0-a34d-b16e247fa0e7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.898 2 DEBUG oslo_concurrency.lockutils [req-f17612c3-5161-4910-83c6-62d94120d750 req-d3c1ed29-154a-4fd0-a34d-b16e247fa0e7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.898 2 DEBUG nova.compute.manager [req-f17612c3-5161-4910-83c6-62d94120d750 req-d3c1ed29-154a-4fd0-a34d-b16e247fa0e7 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Processing event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.899 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:36:32 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [NOTICE]   (267168) : New worker (267170) forked
Oct  8 12:36:32 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [NOTICE]   (267168) : Loading success.
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.909 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.913 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.923 2 INFO nova.virt.libvirt.driver [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Instance spawned successfully.#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.924 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.926 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.956 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.956 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.957 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.957 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.958 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.958 2 DEBUG nova.virt.libvirt.driver [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.962 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.963 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941392.8897824, d2243acb-4779-458d-9e3f-4a5434d27bfb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:36:32 np0005476733 nova_compute[192580]: 2025-10-08 16:36:32.963 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.012 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.017 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941392.9035287, d2243acb-4779-458d-9e3f-4a5434d27bfb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.017 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.048 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.054 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.064 2 INFO nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Took 14.44 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.064 2 DEBUG nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.096 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.145 2 INFO nova.compute.manager [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Took 14.98 seconds to build instance.#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.168 2 DEBUG oslo_concurrency.lockutils [None req-f270a11d-27b0-4246-a890-848090993102 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.689 2 DEBUG nova.network.neutron [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updated VIF entry in instance network info cache for port e3be517f-e639-485a-ab91-7dc0a7be7433. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.689 2 DEBUG nova.network.neutron [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updating instance_info_cache with network_info: [{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:36:33 np0005476733 nova_compute[192580]: 2025-10-08 16:36:33.717 2 DEBUG oslo_concurrency.lockutils [req-02f5238f-693a-4419-9b1d-a3a39ac1534b req-b8f3d023-86df-4baf-a6f5-64fe27fc91d0 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:36:34 np0005476733 nova_compute[192580]: 2025-10-08 16:36:34.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.011 2 DEBUG nova.compute.manager [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.011 2 DEBUG oslo_concurrency.lockutils [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.011 2 DEBUG oslo_concurrency.lockutils [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.012 2 DEBUG oslo_concurrency.lockutils [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.012 2 DEBUG nova.compute.manager [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] No waiting events found dispatching network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.012 2 WARNING nova.compute.manager [req-722c7081-3811-40fb-84db-2ce51151f00f req-2767383f-e363-4c57-b8d4-14198a96cd72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received unexpected event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:36:35 np0005476733 podman[267179]: 2025-10-08 16:36:35.238048969 +0000 UTC m=+0.060053385 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  8 12:36:35 np0005476733 nova_compute[192580]: 2025-10-08 16:36:35.605 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:36 np0005476733 nova_compute[192580]: 2025-10-08 16:36:36.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.073 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'name': 'tempest-server-test-1966949606', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000061', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.105 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.latency volume: 7318933 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.106 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4b19003-49ac-4197-952c-3c433f58cae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7318933, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.074572', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4e8bf94-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '40a160fb778683aeab3d7134c8e629896b6fca0e9d8a0f15c98c175ff6267bd7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.074572', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4e8e58c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': 'eedbef51980f74a96da4b2d810cfb0c4ae183ff16891d40ffee6e1018f7eb56c'}]}, 'timestamp': '2025-10-08 16:36:36.107656', '_unique_id': 'bdf4f381f34e4cdfb126d9971270043d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.116 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d2243acb-4779-458d-9e3f-4a5434d27bfb / tape3be517f-e6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.116 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c92d2eca-37c8-4667-981b-196a6af40445', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.112234', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4ea550c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '58a9317ecd3562a56649f86bcee746c0446fd8523d6fe43753a4c3e237f84e6c'}]}, 'timestamp': '2025-10-08 16:36:36.117004', '_unique_id': '2c647ee240954a809fdccc4ddf53089c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.119 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bff2366f-9e3e-4f9a-845d-e3843a600afe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.119947', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4eadba8-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '02e3475fba5c5ba9a85022a81779e6c310a3826676e0e3caf62cb8a1821444e6'}]}, 'timestamp': '2025-10-08 16:36:36.120437', '_unique_id': '076592c7a90342fc84935bb1ed84771c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.122 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.123 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1966949606>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1966949606>]
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.123 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.bytes volume: 28578304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.124 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '235dbe83-b16a-47ce-b9fb-8a91a5999e87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28578304, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.123714', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4eb6c26-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': 'f40b0bf7d4001079bb89f6b88f7b3be8f256d6a6965222d7b6d57eefd6097c2f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.123714', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4eb7e14-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': 'cd7101bfc5df337c7f11039d2542aa48bb4336b12c1be72ccee4e683f07a5d80'}]}, 'timestamp': '2025-10-08 16:36:36.124578', '_unique_id': '7c58bbf983974f4ca6ae5305762120bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.125 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.127 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0952533-2707-4fc1-ad53-65d5d7224afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.127006', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4ebef7a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': 'cf8505799cfe1959879d963a020652bb0e56b8d6c476416cf73b4fb172fa7c83'}]}, 'timestamp': '2025-10-08 16:36:36.127483', '_unique_id': 'fa1092f5fc104faeb61135e7350c7903'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.141 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.142 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d651046-1b22-4c96-b4f7-beb82c7207fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.129823', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4ee2c04-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': 'ff8677d9f66cc08363e416f70fe2ccb17f9375e57041f62f5bfe30816e34e18c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.129823', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4ee3b22-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': 'a131dca705a979851bb1b04017978ed9af138d4e3993bb2d2af1bd07fda4456e'}]}, 'timestamp': '2025-10-08 16:36:36.142509', '_unique_id': 'c2a24cccd76340acab03bd36ef064814'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.148 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.148 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1966949606>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1966949606>]
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.149 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.149 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fa0360f-e94d-4c0a-8b53-6a02f3391ae0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.149064', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4ef4882-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '5d31eb069fc014eb25d09147962e8c026bf3fbcc93512cc49caeb135bf498953'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.149064', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4ef521e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': 'c9392480f4acb6648615dee662a99cc1c338fc1574c4645ff1b7076cde68f602'}]}, 'timestamp': '2025-10-08 16:36:36.149609', '_unique_id': 'b38e283952aa459b9389348b0e922a9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.151 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06d1a5b5-0211-49bf-b530-b05654c1d011', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.151327', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4ef9e5e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': 'fb59580bf6feed6a4c89a55eed56a253d01787e2c1a6a14eef6ca0429094baf5'}]}, 'timestamp': '2025-10-08 16:36:36.151588', '_unique_id': 'b9b08886fcd6414ca473d6e965735e2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.152 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1f508cf-d44b-40d7-b5e8-a8ac7f10c11b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.152722', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4efd4aa-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '6f02bcc89a3aff3509c845fcf4902f4462bb377077d439833a11c885c0668849'}]}, 'timestamp': '2025-10-08 16:36:36.152965', '_unique_id': '4fb5103213fc453e9acac078a126f279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.allocation volume: 1253376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd25e11c9-dd86-4492-9909-b66e37547283', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1253376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.154060', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4f009ca-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': '8ee26277fd926b44abbe4b620f079008edd070168dcbb01ea348e6121c49117d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.154060', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4f01168-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': 'f8bf2d44f7475b16716f94583e8634c77d4d95d21ab27eda38bb8dfbf0feed6f'}]}, 'timestamp': '2025-10-08 16:36:36.154495', '_unique_id': '384a9e68cba4459b841cf76a4b44fb09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.155 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.requests volume: 1747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.155 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae7c7286-eda9-4f51-9225-1ce828393ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1747, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.155684', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4f0487c-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': 'c884b40a6370f2fbe1510ad3be2a916220de4921efec70753de8cc62f0f5595d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.155684', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4f050ba-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '1c6f64fc6a9b4712632cd9fe786a7382c9528185e38252066c758ca9bfa5fefc'}]}, 'timestamp': '2025-10-08 16:36:36.156134', '_unique_id': '991ec2afbf2646e28b8ce6d9a0dcdbd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73eef18c-ddf7-4802-8647-f2597d61a3bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.157234', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4f084c2-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '82a70eb32567860e0fccd541712793a5a996eaf376995776926d1514eae85c69'}]}, 'timestamp': '2025-10-08 16:36:36.157463', '_unique_id': 'c6e77e7514744c089e504a893332b9bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.158 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d995029-04af-4201-99c5-b3b1c5503b1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.158622', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4f0bb9a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': 'aa4fa7f5eab1c2f69a30e86ed3273b5b7e0c2dfde720a173b08e98415c92284d'}]}, 'timestamp': '2025-10-08 16:36:36.158866', '_unique_id': 'e7514d6e801046c2a96c64b52f2bb1d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcc6b404-6ff2-49d5-9155-86563e7daef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.160147', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4f0f6c8-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '53161f809222908a60d3db7dfa30d9a5dbbc2999fdf9ecb3ce4497a0dd873245'}]}, 'timestamp': '2025-10-08 16:36:36.160381', '_unique_id': 'fa91b3af04e04675bba59672eaf1f69f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.161 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.161 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2ec8a19-3d7f-4858-8c6a-678abf194652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.161500', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4f12b20-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': '8da4ed2e40ff5f3626120987256c2a06ecbd6db24c805234f0088bda854c87ff'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.161500', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4f13548-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.85283452, 'message_signature': '7cb006dfe4efcec92ace1eda949fb00f458fc3d53a6c8b01db764f5ced2dfe7b'}]}, 'timestamp': '2025-10-08 16:36:36.161993', '_unique_id': '3bb40cc0717e490494cdc8f15629063d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49966ea4-613f-4d5f-ac04-c046aed6b419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.163254', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4f17152-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': '72a9b747d70575c082eb8df54fc72acba9007dd5ceebc4412556dc06fc693ee2'}]}, 'timestamp': '2025-10-08 16:36:36.163520', '_unique_id': '2e15077984ab42b4af12d7a502097ed4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.180 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.180 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance d2243acb-4779-458d-9e3f-4a5434d27bfb: ceilometer.compute.pollsters.NoVolumeException
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1966949606>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1966949606>]
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1966949606>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1966949606>]
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.latency volume: 802188726 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.181 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.latency volume: 1134915 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a9e9b22-100a-49c4-82e6-4dc5dd44be06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 802188726, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.181633', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4f4418e-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '01c9c0f90923fe9d1e6bdd15f2097b727b9b877cf5c58394af3e27dd0aa6cd48'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1134915, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.181633', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4f44bde-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '5e49d2284e5715abe5947006c634996f8c227f9f1795b63117f834b342f975ee'}]}, 'timestamp': '2025-10-08 16:36:36.182219', '_unique_id': 'd03572a16caf467cb8d908928059432e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.183 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1631764-7063-4541-84a6-f9f8925e9e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:36:36.183720', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': 'f4f48f40-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.83526814, 'message_signature': 'c619ed85964163d7a5b45c3b7b241319d43ec6965a82272b1a840c09697ccb60'}]}, 'timestamp': '2025-10-08 16:36:36.183958', '_unique_id': 'fcaab27cc2e24c91840469d1fc094a2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '520042eb-067b-46e4-b09a-5e8fce11075a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:36:36.185016', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'f4f4c258-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '0f6818beac6b2d3683b28d787d1c388549b0c5fc6210b6dae84d071bebf1f7b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:36:36.185016', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'f4f4ca0a-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.797544566, 'message_signature': '5f1e975a9734cf37fcbe7b1aa5310ba5333a1408e7a4a0bfb77c9a98fcfd74e3'}]}, 'timestamp': '2025-10-08 16:36:36.185436', '_unique_id': 'ff39efa35adf4fc8bf2daf58d7215728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.186 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.186 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/cpu volume: 3140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42e856af-06c9-42f5-9482-70321b8a14ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3140000000, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'timestamp': '2025-10-08T16:36:36.186528', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'f4f4fcc8-a464-11f0-9274-fa163ef67048', 'monotonic_time': 8299.903447604, 'message_signature': 'e3de777eac8a048e352f0cac7487f9a7447a5122085b44d5eecc59032de9d2e6'}]}, 'timestamp': '2025-10-08 16:36:36.186743', '_unique_id': '1af1b64838de47bc878b1edbdc0312d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:36:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:36:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:36:37 np0005476733 nova_compute[192580]: 2025-10-08 16:36:37.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:39 np0005476733 nova_compute[192580]: 2025-10-08 16:36:39.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:41 np0005476733 nova_compute[192580]: 2025-10-08 16:36:41.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:42 np0005476733 nova_compute[192580]: 2025-10-08 16:36:42.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:43 np0005476733 podman[267203]: 2025-10-08 16:36:43.260227052 +0000 UTC m=+0.071877683 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:36:43 np0005476733 podman[267202]: 2025-10-08 16:36:43.271601595 +0000 UTC m=+0.080410756 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2)
Oct  8 12:36:43 np0005476733 podman[267204]: 2025-10-08 16:36:43.286421907 +0000 UTC m=+0.081992315 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, release=1755695350, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 12:36:44 np0005476733 nova_compute[192580]: 2025-10-08 16:36:44.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:45 np0005476733 nova_compute[192580]: 2025-10-08 16:36:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:45 np0005476733 nova_compute[192580]: 2025-10-08 16:36:45.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:36:46 np0005476733 nova_compute[192580]: 2025-10-08 16:36:46.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:49 np0005476733 nova_compute[192580]: 2025-10-08 16:36:49.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:50 np0005476733 nova_compute[192580]: 2025-10-08 16:36:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:50Z|00073|pinctrl|WARN|Dropped 327 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:36:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:50Z|00074|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:36:51 np0005476733 nova_compute[192580]: 2025-10-08 16:36:51.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:51 np0005476733 podman[267269]: 2025-10-08 16:36:51.233843206 +0000 UTC m=+0.059845559 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:36:51 np0005476733 podman[267268]: 2025-10-08 16:36:51.259758322 +0000 UTC m=+0.090946390 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:36:52 np0005476733 nova_compute[192580]: 2025-10-08 16:36:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.883 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.884 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.884 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:36:53 np0005476733 nova_compute[192580]: 2025-10-08 16:36:53.884 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2243acb-4779-458d-9e3f-4a5434d27bfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:36:54 np0005476733 nova_compute[192580]: 2025-10-08 16:36:54.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:55 np0005476733 nova_compute[192580]: 2025-10-08 16:36:55.661 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updating instance_info_cache with network_info: [{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:36:55 np0005476733 nova_compute[192580]: 2025-10-08 16:36:55.690 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:36:55 np0005476733 nova_compute[192580]: 2025-10-08 16:36:55.691 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:36:55 np0005476733 nova_compute[192580]: 2025-10-08 16:36:55.691 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:56 np0005476733 nova_compute[192580]: 2025-10-08 16:36:56.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:36:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:56Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:93:e1:e1 192.168.122.208
Oct  8 12:36:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:36:56Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:93:e1:e1 192.168.122.208
Oct  8 12:36:58 np0005476733 nova_compute[192580]: 2025-10-08 16:36:58.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:36:58 np0005476733 nova_compute[192580]: 2025-10-08 16:36:58.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:36:58 np0005476733 nova_compute[192580]: 2025-10-08 16:36:58.607 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:36:59 np0005476733 nova_compute[192580]: 2025-10-08 16:36:59.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:00 np0005476733 podman[267311]: 2025-10-08 16:37:00.242256788 +0000 UTC m=+0.058591879 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:37:01 np0005476733 nova_compute[192580]: 2025-10-08 16:37:01.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.042 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.043 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.063 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.136 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.137 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.144 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.145 2 INFO nova.compute.claims [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.215 2 DEBUG nova.scheduler.client.report [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.234 2 DEBUG nova.scheduler.client.report [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.235 2 DEBUG nova.compute.provider_tree [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.251 2 DEBUG nova.scheduler.client.report [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.271 2 DEBUG nova.scheduler.client.report [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:37:03 np0005476733 podman[267330]: 2025-10-08 16:37:03.281323818 +0000 UTC m=+0.113040956 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.341 2 DEBUG nova.compute.provider_tree [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.359 2 DEBUG nova.scheduler.client.report [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.377 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.378 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.430 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.431 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.453 2 INFO nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.472 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.570 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.571 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.572 2 INFO nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Creating image(s)#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.573 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.573 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.574 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.585 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.643 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.645 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.645 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.659 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.679 2 DEBUG nova.policy [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.717 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.718 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.749 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk 10737418240" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.750 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.751 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.820 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.822 2 DEBUG nova.objects.instance [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'migration_context' on Instance uuid 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.839 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.840 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Ensure instance console log exists: /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.840 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.841 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:03 np0005476733 nova_compute[192580]: 2025-10-08 16:37:03.841 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:04 np0005476733 nova_compute[192580]: 2025-10-08 16:37:04.421 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Successfully created port: 8f2597d2-a2aa-4839-ac8f-aff700990b1d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:37:04 np0005476733 nova_compute[192580]: 2025-10-08 16:37:04.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.253 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Successfully updated port: 8f2597d2-a2aa-4839-ac8f-aff700990b1d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.288 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.289 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquired lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.289 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.374 2 DEBUG nova.compute.manager [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-changed-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.374 2 DEBUG nova.compute.manager [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Refreshing instance network info cache due to event network-changed-8f2597d2-a2aa-4839-ac8f-aff700990b1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.375 2 DEBUG oslo_concurrency.lockutils [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.451 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.631 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.702 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:05 np0005476733 podman[267369]: 2025-10-08 16:37:05.769330496 +0000 UTC m=+0.085051394 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.774 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.775 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.831 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.963 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.964 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13052MB free_disk=111.29425048828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.964 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:05 np0005476733 nova_compute[192580]: 2025-10-08 16:37:05.964 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.027 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d2243acb-4779-458d-9e3f-4a5434d27bfb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.028 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.028 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.028 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
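The final view is arithmetic over the two per-instance placement allocations logged just above, plus the 512 MB host memory reservation (nova's `reserved_host_memory_mb`, which also shows up as `reserved` in the MEMORY_MB inventory). A simplified sketch of that bookkeeping, not the tracker's actual code:

```python
# Per-instance allocations as logged for d2243acb-... and 3ec1fb0e-...
allocations = [
    {"DISK_GB": 10, "MEMORY_MB": 1024, "VCPU": 1},
    {"DISK_GB": 10, "MEMORY_MB": 1024, "VCPU": 1},
]
reserved_ram_mb = 512  # reserved_host_memory_mb, counted into used_ram

used_ram = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)
used_disk = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)

# Matches the logged view: used_ram=2560MB used_disk=20GB used_vcpus=2
print(used_ram, used_disk, used_vcpus)
```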
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.083 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.101 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
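The inventory dict above is what placement uses to size this provider: for each resource class, schedulable capacity is (total − reserved) × allocation_ratio. A sketch of that rule applied to the logged values (the formula is placement's standard capacity calculation; the code itself is illustrative):

```python
# Inventory as reported for provider 94652b61-be28-442d-a9f4-cded63837444.
inventory = {
    "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 15731, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 119,   "reserved": 2,   "allocation_ratio": 0.9},
}

# Placement's capacity rule: usable = (total - reserved) * allocation_ratio.
capacity = {
    rc: (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    for rc, inv in inventory.items()
}
print(capacity)  # 8 physical vCPUs become 32 schedulable ones at ratio 4.0
```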
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.122 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.123 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.128 2 DEBUG nova.network.neutron [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updating instance_info_cache with network_info: [{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.145 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Releasing lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.145 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Instance network_info: |[{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.145 2 DEBUG oslo_concurrency.lockutils [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.145 2 DEBUG nova.network.neutron [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Refreshing network info cache for port 8f2597d2-a2aa-4839-ac8f-aff700990b1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.147 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Start _get_guest_xml network_info=[{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.151 2 WARNING nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.154 2 DEBUG nova.virt.libvirt.host [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.155 2 DEBUG nova.virt.libvirt.host [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.158 2 DEBUG nova.virt.libvirt.host [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.158 2 DEBUG nova.virt.libvirt.host [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.159 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.159 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.159 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.159 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.160 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.161 2 DEBUG nova.virt.hardware [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
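With no flavor or image constraints (all the 0:0:0 limits and preferences above), the candidate topologies are just the factorizations of the vCPU count into sockets × cores × threads within the 65536 limits, so 1 vCPU yields exactly one topology, 1:1:1. A condensed sketch of that enumeration, simplified relative to `nova.virt.hardware._get_possible_cpu_topologies`:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product equals vcpus."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield (s, c, t)

print(list(possible_topologies(1)))  # [(1, 1, 1)]
print(len(list(possible_topologies(4))))  # six ordered factorizations of 4
```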
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.164 2 DEBUG nova.virt.libvirt.vif [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:37:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-294986936',display_name='tempest-server-test-294986936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-294986936',id=99,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-bzq9fy7c',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',ima
ge_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:37:03Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.164 2 DEBUG nova.network.os_vif_util [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.165 2 DEBUG nova.network.os_vif_util [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
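The conversion logged above is a field-by-field mapping from nova's network-info dict to an os-vif object. The essentials can be sketched as follows, with a hypothetical dataclass standing in for os_vif's `VIFOpenVSwitch` and only the fields visible in the log:

```python
from dataclasses import dataclass

@dataclass
class VIFOpenVSwitch:  # stand-in for os_vif's VIFOpenVSwitch object
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool
    active: bool

def nova_to_osvif_vif(vif):
    """Map a nova network_info dict to the os-vif object (essentials only)."""
    return VIFOpenVSwitch(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["details"]["bridge_name"],
        vif_name=vif["devname"],
        has_traffic_filtering=vif["details"]["port_filter"],
        active=vif["active"],
    )

# Trimmed-down version of the VIF dict from the log entry above.
vif = {"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d",
       "address": "fa:16:3e:3b:7e:24", "devname": "tap8f2597d2-a2",
       "active": False,
       "details": {"bridge_name": "br-int", "port_filter": True}}
print(nova_to_osvif_vif(vif))
```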
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.165 2 DEBUG nova.objects.instance [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.181 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <uuid>3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0</uuid>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <name>instance-00000063</name>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-294986936</nova:name>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:37:06</nova:creationTime>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:user uuid="de0012a12c1645bfb620caa34110c3f4">tempest-GatewayMtuTestUdp-187807839-project-member</nova:user>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:project uuid="4726c8b7a2a3405b9b2d689862918f5d">tempest-GatewayMtuTestUdp-187807839</nova:project>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        <nova:port uuid="8f2597d2-a2aa-4839-ac8f-aff700990b1d">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="serial">3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="uuid">3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.config"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:3b:7e:24"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <target dev="tap8f2597d2-a2"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/console.log" append="off"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:37:06 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:37:06 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:37:06 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:37:06 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.181 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Preparing to wait for external event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.182 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.182 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.182 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.183 2 DEBUG nova.virt.libvirt.vif [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:37:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-294986936',display_name='tempest-server-test-294986936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-294986936',id=99,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-bzq9fy7c',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:37:03Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.183 2 DEBUG nova.network.os_vif_util [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.184 2 DEBUG nova.network.os_vif_util [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.184 2 DEBUG os_vif [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f2597d2-a2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8f2597d2-a2, col_values=(('external_ids', {'iface-id': '8f2597d2-a2aa-4839-ac8f-aff700990b1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:7e:24', 'vm-uuid': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.1917] manager: (tap8f2597d2-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.199 2 INFO os_vif [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2')#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.252 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.253 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.253 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No VIF found with MAC fa:16:3e:3b:7e:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.253 2 INFO nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Using config drive#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.565 2 INFO nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Creating config drive at /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.config#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.569 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnknut41a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.708 2 DEBUG oslo_concurrency.processutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnknut41a" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.7770] manager: (tap8f2597d2-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/306)
Oct  8 12:37:06 np0005476733 kernel: tap8f2597d2-a2: entered promiscuous mode
Oct  8 12:37:06 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:06Z|00075|binding|INFO|Claiming lport 8f2597d2-a2aa-4839-ac8f-aff700990b1d for this chassis.
Oct  8 12:37:06 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:06Z|00076|binding|INFO|8f2597d2-a2aa-4839-ac8f-aff700990b1d: Claiming fa:16:3e:3b:7e:24 10.100.0.11
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.814 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:7e:24 10.100.0.11'], port_security=['fa:16:3e:3b:7e:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f54a9571-f531-4620-a544-cc4241429076, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8f2597d2-a2aa-4839-ac8f-aff700990b1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.816 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8f2597d2-a2aa-4839-ac8f-aff700990b1d in datapath 1496f4fb-5756-48b3-9df1-e6965ccbed85 bound to our chassis#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.818 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1496f4fb-5756-48b3-9df1-e6965ccbed85#033[00m
Oct  8 12:37:06 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:06Z|00077|binding|INFO|Setting lport 8f2597d2-a2aa-4839-ac8f-aff700990b1d ovn-installed in OVS
Oct  8 12:37:06 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:06Z|00078|binding|INFO|Setting lport 8f2597d2-a2aa-4839-ac8f-aff700990b1d up in Southbound
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 nova_compute[192580]: 2025-10-08 16:37:06.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.832 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[703a2c91-9818-4e9b-9eee-5716e7c386ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.833 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1496f4fb-51 in ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.835 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1496f4fb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.835 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[637f453f-02c6-4111-a68e-a1d98c0adbaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.836 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1cfc8370-b3f6-49cb-b13c-94b16f7982a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 systemd-udevd[267432]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:37:06 np0005476733 systemd-machined[152624]: New machine qemu-61-instance-00000063.
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.849 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[36ba2975-d5ea-4d68-9125-0f449cb938f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.8588] device (tap8f2597d2-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.8595] device (tap8f2597d2-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:37:06 np0005476733 systemd[1]: Started Virtual Machine qemu-61-instance-00000063.
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.875 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[94bb8ee0-2a3e-4696-8863-befbdbb386d1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.905 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a81c0f55-c5fe-4da0-a790-4ce43729a12b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.909 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba8e891-5b11-40f4-8868-abdc1404f849]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.9119] manager: (tap1496f4fb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/307)
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.942 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9f80a3c5-98bc-469f-b5e6-b338ff07a045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.945 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6fe53d-1186-4c59-b059-a8f204f22319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 NetworkManager[51699]: <info>  [1759941426.9698] device (tap1496f4fb-50): carrier: link connected
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.974 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5592e4-3267-491e-9775-15a96c0fce7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:06 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:06.993 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7c92cf6e-ac83-4a44-91cb-e8fdbf84834e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1496f4fb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:85:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 833063, 'reachable_time': 43977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267463, 'error': None, 'target': 'ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.012 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2627df-26c2-4133-aab3-f3f9bdb8a74f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:8578'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 833063, 'tstamp': 833063}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267464, 'error': None, 'target': 'ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.028 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0394d3-0dfd-4f43-b0aa-2706839c938b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1496f4fb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:85:78'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 833063, 'reachable_time': 43977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267466, 'error': None, 'target': 'ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.058 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a5510215-c54b-4b1f-ae93-13560490b187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.103 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.114 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5db7e6ad-c2cf-4169-b999-786129d833c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.115 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1496f4fb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.116 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.116 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1496f4fb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:07 np0005476733 NetworkManager[51699]: <info>  [1759941427.1198] manager: (tap1496f4fb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Oct  8 12:37:07 np0005476733 kernel: tap1496f4fb-50: entered promiscuous mode
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.125 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1496f4fb-50, col_values=(('external_ids', {'iface-id': 'ea88a4c7-613a-4d3b-aaa7-638fbf87efa0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:37:07 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:07Z|00079|binding|INFO|Releasing lport ea88a4c7-613a-4d3b-aaa7-638fbf87efa0 from this chassis (sb_readonly=0)
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.128 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1496f4fb-5756-48b3-9df1-e6965ccbed85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1496f4fb-5756-48b3-9df1-e6965ccbed85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.129 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[53bf7389-d2a3-4525-a22a-71d406dc0cb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.130 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-1496f4fb-5756-48b3-9df1-e6965ccbed85
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/1496f4fb-5756-48b3-9df1-e6965ccbed85.pid.haproxy
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 1496f4fb-5756-48b3-9df1-e6965ccbed85
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:37:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:07.131 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'env', 'PROCESS_TAG=haproxy-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1496f4fb-5756-48b3-9df1-e6965ccbed85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.269 2 DEBUG nova.network.neutron [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updated VIF entry in instance network info cache for port 8f2597d2-a2aa-4839-ac8f-aff700990b1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.270 2 DEBUG nova.network.neutron [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updating instance_info_cache with network_info: [{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.287 2 DEBUG oslo_concurrency.lockutils [req-49bfa1db-943e-4712-a6cd-5e572d82b579 req-8020a857-8270-49ab-962b-b119f011c73c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.461 2 DEBUG nova.compute.manager [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.463 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.463 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.463 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.464 2 DEBUG nova.compute.manager [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Processing event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.464 2 DEBUG nova.compute.manager [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.465 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.465 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.465 2 DEBUG oslo_concurrency.lockutils [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.465 2 DEBUG nova.compute.manager [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] No waiting events found dispatching network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.466 2 WARNING nova.compute.manager [req-2e75043b-ee3f-4abe-a534-eab2c8ad8410 req-bb912501-56dc-4134-a94a-81359a321f1b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received unexpected event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d for instance with vm_state building and task_state spawning.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.511 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.511 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941427.5105703, 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.512 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] VM Started (Lifecycle Event)#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.515 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:37:07 np0005476733 podman[267504]: 2025-10-08 16:37:07.520382983 +0000 UTC m=+0.046027528 container create 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.521 2 INFO nova.virt.libvirt.driver [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Instance spawned successfully.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.522 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.544 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.550 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.554 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.554 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.554 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.555 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.555 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.556 2 DEBUG nova.virt.libvirt.driver [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:37:07 np0005476733 systemd[1]: Started libpod-conmon-96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed.scope.
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.580 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.581 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941427.5115626, 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.581 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:37:07 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:37:07 np0005476733 podman[267504]: 2025-10-08 16:37:07.495267262 +0000 UTC m=+0.020911807 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:37:07 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e8919eda12b8425f01ca9b2ff863ad63793c200364cced3354d740d3613bbea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.605 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.609 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941427.514727, 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:37:07 np0005476733 podman[267504]: 2025-10-08 16:37:07.609766623 +0000 UTC m=+0.135411178 container init 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.609 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.615 2 INFO nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Took 4.04 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.615 2 DEBUG nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:37:07 np0005476733 podman[267504]: 2025-10-08 16:37:07.616299862 +0000 UTC m=+0.141944407 container start 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.627 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.630 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:37:07 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [NOTICE]   (267523) : New worker (267525) forked
Oct  8 12:37:07 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [NOTICE]   (267523) : Loading success.
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.836 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.867 2 INFO nova.compute.manager [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Took 4.76 seconds to build instance.#033[00m
Oct  8 12:37:07 np0005476733 nova_compute[192580]: 2025-10-08 16:37:07.887 2 DEBUG oslo_concurrency.lockutils [None req-8f9f22ac-d2ee-4ee2-8014-c9a11e9d4442 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:09 np0005476733 nova_compute[192580]: 2025-10-08 16:37:09.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:10 np0005476733 nova_compute[192580]: 2025-10-08 16:37:10.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:11 np0005476733 nova_compute[192580]: 2025-10-08 16:37:11.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:14 np0005476733 podman[267535]: 2025-10-08 16:37:14.233862583 +0000 UTC m=+0.054300923 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:37:14 np0005476733 podman[267536]: 2025-10-08 16:37:14.250228465 +0000 UTC m=+0.067373289 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:37:14 np0005476733 podman[267534]: 2025-10-08 16:37:14.254706749 +0000 UTC m=+0.077943468 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:37:14 np0005476733 nova_compute[192580]: 2025-10-08 16:37:14.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:16 np0005476733 nova_compute[192580]: 2025-10-08 16:37:16.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:19 np0005476733 nova_compute[192580]: 2025-10-08 16:37:19.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:21 np0005476733 nova_compute[192580]: 2025-10-08 16:37:21.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:22 np0005476733 podman[267596]: 2025-10-08 16:37:22.235856132 +0000 UTC m=+0.061112549 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 12:37:22 np0005476733 podman[267597]: 2025-10-08 16:37:22.23986396 +0000 UTC m=+0.058510437 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:37:24 np0005476733 nova_compute[192580]: 2025-10-08 16:37:24.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:26 np0005476733 nova_compute[192580]: 2025-10-08 16:37:26.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:26.403 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:26.409 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:37:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:37:26.412 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:37:29 np0005476733 nova_compute[192580]: 2025-10-08 16:37:29.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:30Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:7e:24 10.100.0.11
Oct  8 12:37:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:30Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:7e:24 10.100.0.11
Oct  8 12:37:31 np0005476733 nova_compute[192580]: 2025-10-08 16:37:31.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:31 np0005476733 podman[267651]: 2025-10-08 16:37:31.22188609 +0000 UTC m=+0.052299278 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:37:34 np0005476733 podman[267671]: 2025-10-08 16:37:34.277308262 +0000 UTC m=+0.105284739 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:37:34 np0005476733 nova_compute[192580]: 2025-10-08 16:37:34.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:36 np0005476733 nova_compute[192580]: 2025-10-08 16:37:36.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:36 np0005476733 podman[267698]: 2025-10-08 16:37:36.23942092 +0000 UTC m=+0.071757920 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:37:36 np0005476733 nova_compute[192580]: 2025-10-08 16:37:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:37Z|00080|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 12:37:39 np0005476733 nova_compute[192580]: 2025-10-08 16:37:39.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:41 np0005476733 nova_compute[192580]: 2025-10-08 16:37:41.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:43 np0005476733 nova_compute[192580]: 2025-10-08 16:37:43.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:44 np0005476733 nova_compute[192580]: 2025-10-08 16:37:44.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:45 np0005476733 podman[267741]: 2025-10-08 16:37:45.236937983 +0000 UTC m=+0.061971717 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 12:37:45 np0005476733 podman[267742]: 2025-10-08 16:37:45.239725742 +0000 UTC m=+0.058442034 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:37:45 np0005476733 podman[267743]: 2025-10-08 16:37:45.25284381 +0000 UTC m=+0.060177419 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Oct  8 12:37:45 np0005476733 nova_compute[192580]: 2025-10-08 16:37:45.591 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:45 np0005476733 nova_compute[192580]: 2025-10-08 16:37:45.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:37:46 np0005476733 nova_compute[192580]: 2025-10-08 16:37:46.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:49 np0005476733 nova_compute[192580]: 2025-10-08 16:37:49.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:50 np0005476733 nova_compute[192580]: 2025-10-08 16:37:50.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:50Z|00081|pinctrl|WARN|Dropped 233 log messages in last 60 seconds (most recently, 14 seconds ago) due to excessive rate
Oct  8 12:37:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:37:50Z|00082|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:37:51 np0005476733 nova_compute[192580]: 2025-10-08 16:37:51.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:53 np0005476733 podman[267806]: 2025-10-08 16:37:53.216878358 +0000 UTC m=+0.051159571 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:37:53 np0005476733 podman[267807]: 2025-10-08 16:37:53.229920865 +0000 UTC m=+0.058097024 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:37:53 np0005476733 nova_compute[192580]: 2025-10-08 16:37:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:53 np0005476733 nova_compute[192580]: 2025-10-08 16:37:53.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:37:53 np0005476733 nova_compute[192580]: 2025-10-08 16:37:53.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:37:54 np0005476733 nova_compute[192580]: 2025-10-08 16:37:54.675 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:37:54 np0005476733 nova_compute[192580]: 2025-10-08 16:37:54.675 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:37:54 np0005476733 nova_compute[192580]: 2025-10-08 16:37:54.675 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:37:54 np0005476733 nova_compute[192580]: 2025-10-08 16:37:54.675 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2243acb-4779-458d-9e3f-4a5434d27bfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:37:54 np0005476733 nova_compute[192580]: 2025-10-08 16:37:54.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:56 np0005476733 nova_compute[192580]: 2025-10-08 16:37:56.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:37:57 np0005476733 nova_compute[192580]: 2025-10-08 16:37:57.819 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updating instance_info_cache with network_info: [{"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:37:57 np0005476733 nova_compute[192580]: 2025-10-08 16:37:57.889 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-d2243acb-4779-458d-9e3f-4a5434d27bfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:37:57 np0005476733 nova_compute[192580]: 2025-10-08 16:37:57.889 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:37:57 np0005476733 nova_compute[192580]: 2025-10-08 16:37:57.890 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:57 np0005476733 nova_compute[192580]: 2025-10-08 16:37:57.890 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:37:59 np0005476733 nova_compute[192580]: 2025-10-08 16:37:59.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:01 np0005476733 nova_compute[192580]: 2025-10-08 16:38:01.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:02 np0005476733 podman[267887]: 2025-10-08 16:38:02.213019909 +0000 UTC m=+0.047060321 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  8 12:38:04 np0005476733 nova_compute[192580]: 2025-10-08 16:38:04.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:05 np0005476733 podman[267907]: 2025-10-08 16:38:05.284560395 +0000 UTC m=+0.110058331 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 12:38:06 np0005476733 nova_compute[192580]: 2025-10-08 16:38:06.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:07 np0005476733 podman[267935]: 2025-10-08 16:38:07.240264219 +0000 UTC m=+0.060986366 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.622 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.721 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.783 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.784 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.844 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.853 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.946 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:38:07 np0005476733 nova_compute[192580]: 2025-10-08 16:38:07.947 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.003 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.169 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.171 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12017MB free_disk=111.02641677856445GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.171 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.171 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.508 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d2243acb-4779-458d-9e3f-4a5434d27bfb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.508 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.509 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.509 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.755 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.775 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.804 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:38:08 np0005476733 nova_compute[192580]: 2025-10-08 16:38:08.804 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:38:09 np0005476733 nova_compute[192580]: 2025-10-08 16:38:09.805 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:09 np0005476733 nova_compute[192580]: 2025-10-08 16:38:09.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:11 np0005476733 nova_compute[192580]: 2025-10-08 16:38:11.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:14 np0005476733 nova_compute[192580]: 2025-10-08 16:38:14.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:15.080 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:38:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:15.081 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.965 2 DEBUG nova.compute.manager [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-changed-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.966 2 DEBUG nova.compute.manager [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Refreshing instance network info cache due to event network-changed-8f2597d2-a2aa-4839-ac8f-aff700990b1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.966 2 DEBUG oslo_concurrency.lockutils [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.966 2 DEBUG oslo_concurrency.lockutils [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:38:15 np0005476733 nova_compute[192580]: 2025-10-08 16:38:15.966 2 DEBUG nova.network.neutron [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Refreshing network info cache for port 8f2597d2-a2aa-4839-ac8f-aff700990b1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:38:16 np0005476733 nova_compute[192580]: 2025-10-08 16:38:16.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:16 np0005476733 podman[267969]: 2025-10-08 16:38:16.241654306 +0000 UTC m=+0.062549346 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Oct  8 12:38:16 np0005476733 podman[267968]: 2025-10-08 16:38:16.253350929 +0000 UTC m=+0.077609066 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:38:16 np0005476733 podman[267967]: 2025-10-08 16:38:16.2659306 +0000 UTC m=+0.091759927 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:38:18 np0005476733 nova_compute[192580]: 2025-10-08 16:38:18.022 2 DEBUG nova.network.neutron [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updated VIF entry in instance network info cache for port 8f2597d2-a2aa-4839-ac8f-aff700990b1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:38:18 np0005476733 nova_compute[192580]: 2025-10-08 16:38:18.023 2 DEBUG nova.network.neutron [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updating instance_info_cache with network_info: [{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:38:18 np0005476733 nova_compute[192580]: 2025-10-08 16:38:18.050 2 DEBUG oslo_concurrency.lockutils [req-028fd974-c0be-49c8-a6cb-a6ed33ff1742 req-bb569a8a-487c-403f-b9c5-dc0955207742 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:38:19 np0005476733 nova_compute[192580]: 2025-10-08 16:38:19.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:21 np0005476733 nova_compute[192580]: 2025-10-08 16:38:21.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:23 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:23.083 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:38:24 np0005476733 podman[268035]: 2025-10-08 16:38:24.234224244 +0000 UTC m=+0.060518781 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:38:24 np0005476733 podman[268034]: 2025-10-08 16:38:24.240235485 +0000 UTC m=+0.067038968 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:38:24 np0005476733 nova_compute[192580]: 2025-10-08 16:38:24.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:26 np0005476733 nova_compute[192580]: 2025-10-08 16:38:26.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:26.401 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:26.402 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:38:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:38:26.403 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:38:29 np0005476733 nova_compute[192580]: 2025-10-08 16:38:29.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:31 np0005476733 nova_compute[192580]: 2025-10-08 16:38:31.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:33 np0005476733 podman[268077]: 2025-10-08 16:38:33.212938468 +0000 UTC m=+0.046675819 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:38:34 np0005476733 nova_compute[192580]: 2025-10-08 16:38:34.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.072 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'name': 'tempest-server-test-1966949606', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000061', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.076 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'name': 'tempest-server-test-294986936', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000063', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.096 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.latency volume: 6708605739 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.096 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.latency volume: 39303933 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.115 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.latency volume: 6592448147 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.116 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.latency volume: 102278223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba7d9dd4-89f5-45fe-bc12-b752f5edd905', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6708605739, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.076849', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c6dd0e8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '3f851db455a376df5061eced4f3bcfcc5ff3b302da1e6348fe4325c9c4688697'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39303933, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.076849', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c6dddd6-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '6f7b0e630dd9ea69644424fb75bb59f0430ced81ab3779306008fd2cf86f011f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6592448147, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.076849', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c70d018-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': 'c0a2d1f82a7d4fe1c630bb9c735a48b912c87747a28a2057c367d64b53ad2e98'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 102278223, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.076849', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c70e10c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': 'e79b20370a18ab30f127a27e1b611a53a0bf8419b0a7a66bda8703a3b8dd4b72'}]}, 'timestamp': '2025-10-08 16:38:36.116997', '_unique_id': 'accbebb321a44431b6a6c0c9df87eb7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.121 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.123 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 / tap8f2597d2-a2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.123 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736dd0a1-88fd-4544-8d4d-01452a3b8583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.119240', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c71abe6-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': 'd71c3428f07eb20d212f68b512aa951233debcb7196fdacb4dab2413e179b5f1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.119240', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c71fee8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': 'ef3ce7efb0ff6a413a918f960dc5558dd304a96595898ff5357d24af1e0226fe'}]}, 'timestamp': '2025-10-08 16:38:36.124252', '_unique_id': '1692b7cd939944d99ddb85c5f57ecfd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.125 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.bytes volume: 13239 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.126 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.incoming.bytes volume: 78271 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85874d8a-3d78-47d7-882c-c76f7497608d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13239, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.125702', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c72424a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': '64a7ed20f6fb3bd5506095aa641b0ab06e4730fd9f845b9c3406b5ab14e3cc16'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 78271, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.125702', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c724e02-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '5592090ec0bc7729680ec68bebe836148d1ad1bcabfebf8b5793236045acf2b2'}]}, 'timestamp': '2025-10-08 16:38:36.126331', '_unique_id': '7ed12bd9ac8b4def95f4110d41c5c01d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.127 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.128 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a678535-06d4-46ae-a112-30515a86cf80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.127863', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c729574-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': 'f67f69554b1ac577c61b765d54b8dfc5b639bab7a0cf8456ebab3dc836d0b0ed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.127863', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c72a71c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '4812171a013516bdafdcfa2f20935f1ffff966c7eaffb452519a532fb8a869ac'}]}, 'timestamp': '2025-10-08 16:38:36.128566', '_unique_id': '650c8774181a44a5acc73975454410f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.129 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eae37684-2eac-436a-88b9-542262b82e0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.129716', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c72ddae-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': 'b6e313d021d782814751e358f94fef0ac730c8b84662e81b48e3daf4078d4983'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.129716', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c72e5d8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '6f0946ec92dff607bf9c26da0c5b80a372dd67ca3f5370a8915b92652904514f'}]}, 'timestamp': '2025-10-08 16:38:36.130190', '_unique_id': '6725f4fb58404efdb466ecdf8b462216'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.131 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.latency volume: 6490027757 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.131 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.131 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.latency volume: 11036511087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82648757-e726-4bc5-b1bf-1a5ef1f8618c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6490027757, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.131345', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c731d5a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '68cd511b19576b04ae63abb39cd7ca0642c1d5f53b8d8c8f20bf5dd6a88866c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 
'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.131345', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c732728-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '434fe6fc25db0bc68a21a626c9b0593f926cf5e185c85a74e6ba92e222c5b70b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11036511087, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.131345', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c732f52-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': 'aecf051cd9529a62b6f9689a489b22f14cd51683b91467be78c2c6bcba1b827b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.131345', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c73374a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '525d412b637bc665920e2e2ca58cfef64682049a39eb646a23000cbaa0db53d1'}]}, 'timestamp': '2025-10-08 16:38:36.132221', '_unique_id': '9179c0a603e44069b2fcc7d6f05e6e21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.134 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.bytes volume: 16005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.134 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.outgoing.bytes volume: 103785 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5eea06e-b7ec-42c1-af70-4830e626cfca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16005, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.134559', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c739b0e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': '59cd75b85bf005add12f6377d1b4e313d735e87517aa0fc5aa44ef4c4e4fe04a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 
103785, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.134559', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c73a4aa-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '4ea4c6699233a5b616569053a03da61771eac1646e06fffd9eeb3701a219e1b7'}]}, 'timestamp': '2025-10-08 16:38:36.135070', '_unique_id': 'f2861b5225b34ec4b0adc882e33dba67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.136 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.bytes.delta volume: 13129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.136 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ead46dd-95c9-4477-a180-71582cf12e07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 13129, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.136450', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c73e550-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': '5a337d51bd662e71db7557e5c01c9c5ae3d0c52e9d7016a7f22e47a3e017b2f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.136450', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c73edde-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': 'fbca2bab4ea8494ab67904ef6107895792e529ea6bb58d6dd431df86a338bd98'}]}, 'timestamp': '2025-10-08 16:38:36.136905', '_unique_id': 'e6e090c1f044480fb4154d102871d176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.137 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.138 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.bytes volume: 330081792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.138 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.138 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.bytes volume: 331687424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '837fb90e-f749-4350-8ea2-644031887996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330081792, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.138131', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c74298e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '417186786e1df84ad66071f75a9ec16f67df17b96152678b604f48e62bca30fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.138131', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7437f8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '3c67a70a0081f2b5e459c55e00d60bbb93902430a864e007d6e4ffeff8fc74b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331687424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.138131', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c744086-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': 'ed2580e4be111b51e5c51d501e3d07508570505c6339d600fb9fb08dcbf73bbe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.138131', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c744996-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '4e2ec180fd3fbe72441d2e7c8805ad4eb26129b2556a6d3fa07040b0c9143ef6'}]}, 'timestamp': '2025-10-08 16:38:36.139331', '_unique_id': '6b15a7601b6047cfa4085000e4926351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.151 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.usage volume: 152502272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.152 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.163 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.usage volume: 152436736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.163 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4064b09e-b09f-487b-b675-7e602a7a4b5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152502272, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.140619', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c764b2e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': 'c24cee5a741ed2f57d9e28f53b68626e5b55675d91989665fd494cbf9dc01378'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.140619', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c765682-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': '24b7131d60a950b2b78a259cf3ff0e6e182e2b8620c1049dc85e291d3844da4c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152436736, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.140619', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c780694-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': '0f584b6f679ae56122b561a749ceafc727e854e00faa11ca005f656d16437eee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.140619', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c78112a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': '4f307ec2e7aa18679db9841c2615c2af191283d39225842e99404acb20d0a66c'}]}, 'timestamp': '2025-10-08 16:38:36.164029', '_unique_id': 'e6e5710d648648efa545d2770b767bac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-294986936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-294986936>]
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.166 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.166 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.166 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.166 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa857feb-97e1-4b11-8bf9-d91a1a104b4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.166133', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c786e90-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': 'b8f8b7207bcee82fe1cea5c74f751e62fc401544d1f59c4a27a1eb2a65e6f5bd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': 
None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.166133', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7876ce-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': '8497f8b470222ee10edfa92bf3545b28ac050f01ab8dcfd1460f44516b3a30ed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.166133', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c787e62-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': '8462a595b48160f8585545f21a030a72db46ed9ea3de326bd8077c35b49a3cf0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.166133', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c78859c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': 'c9a0151c949cdadb9f31c8e04ce463c772263e471fa2ca0d3907e4edd166738a'}]}, 'timestamp': '2025-10-08 16:38:36.166996', '_unique_id': 'ff9503df388a4d01b0016a826cb70598'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.168 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.168 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa3115e-4c6b-4094-a30b-ff01055598e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.168603', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c78cd86-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': 'b27d18a4fb9eae952b7da2369ba05e8a817f4c69376ca7f8f7bad2bbca5930b0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.168603', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c78d65a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '7eb53f1fad0e6a0ed18e1df50c01f11da12a453ad24d97be3a9811cff72bfc7d'}]}, 'timestamp': '2025-10-08 16:38:36.169110', '_unique_id': 'd5316658e4be4f6683609cb129c04bb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.170 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.170 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-294986936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-294986936>]
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.170 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.bytes volume: 136828416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.171 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.171 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.bytes volume: 136439296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.171 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '751c6ce1-22eb-4b85-9514-a79c43fcf500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136828416, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.170816', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c792358-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '2ec2c44f8e24d624fadf6379a3207d05adf258e40bba9936f74421088c54e92f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.170816', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c792c36-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '84244a383a8fc991cc75a89422b31f5bc48d354c941fad18658afa08e2f135a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136439296, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.170816', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7933f2-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '48ddd197193090f98d8ec2d1c63631815edeedb984bf3ef4e0cab38a74bbfd4f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.170816', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c793d2a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '97a1782e0c48947277015b2d3f8b0236f70635c840f78982a533aa003e2d8bbd'}]}, 'timestamp': '2025-10-08 16:38:36.171723', '_unique_id': 'ea8f17fdea8342d9b6c2a545900ba6b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.187 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/memory.usage volume: 238.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.200 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/memory.usage volume: 260.15234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a2f255e-e763-434e-8e78-00ad99cf7a25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 238.80859375, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'timestamp': '2025-10-08T16:38:36.173236', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '3c7bb398-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.910133492, 'message_signature': '9df9965d60ac494577d124dad64f06ab42be1e01a6ff50e0d3846803dc5aaf33'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 260.15234375, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'timestamp': '2025-10-08T16:38:36.173236', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '3c7db40e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.923495528, 'message_signature': '9ae8d96af1602ea673fece4dd415f42ea5da707d65f52268460752e5a8373d65'}]}, 'timestamp': '2025-10-08 16:38:36.200999', '_unique_id': '32340d1fd3cb421496965a459f9c1d03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.201 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.202 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.202 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/cpu volume: 40560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/cpu volume: 42630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3058356d-9d90-45c3-94d3-447b614779b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40560000000, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'timestamp': '2025-10-08T16:38:36.202896', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '3c7e088c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.910133492, 'message_signature': 'cd041cb9c68769cb0b0b16a8c97c180db26922d66459c9d2b94d252cf151076f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42630000000, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'timestamp': '2025-10-08T16:38:36.202896', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '3c7e137c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.923495528, 'message_signature': 'd639cb54baff7594379c87219e4d6cdf7d596cb5d824278d75a3813275f0a9c4'}]}, 'timestamp': '2025-10-08 16:38:36.203432', '_unique_id': '746766c0fc6e4c45ae94de1be88721c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.203 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.204 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.204 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-294986936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-294986936>]
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.204 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.incoming.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.incoming.packets volume: 457 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cdbd02e-a3c6-40bd-bb03-26b56f3e4787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.204961', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c7e59ae-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': 'e821ef3c668dda8344734f92f57357232b2874ba74fb09acc90e4ba287cbae9a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 457, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.204961', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c7e6246-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '8c2e553314826583f0a7151fdb60968b8dbf1b22c8935960f8638be9309ca168'}]}, 'timestamp': '2025-10-08 16:38:36.205439', '_unique_id': '37b5e71953584673ae21c436b357f4fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.205 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.206 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.requests volume: 783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.206 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.207 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.requests volume: 784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.207 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0dab5cb-14dd-46e5-a128-2ad003a3a180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 783, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.206715', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7e9d92-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '08910328d6b9fcaacffb2af1772733dec9e00df11bf2b24d635cd7d61e5ff4f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 
'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.206715', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7ea562-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '00afba44d27a4af4f59b0726f7698fb7a6acd20090967380fe8a093c81bfbd1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 784, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.206715', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7eadc8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '598322e7f6710d7a5821faa56ab1d00c0b61ae35e872d10aabc558eb4f215fc6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.206715', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7eb584-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '6be0d94fe2e42d6e9fbc973beba5040b57847a870b1c07d38a920604d68d75d8'}]}, 'timestamp': '2025-10-08 16:38:36.207543', '_unique_id': '3551314bc1ce4ebba9234e33c2a1be65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.208 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60a2db3d-6562-4c53-9991-9cf30256d893', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.208660', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7ee964-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': '752a6399bea3629b05bc5cbdc5ec7c3dd4d4f1d4ea3a2400f33a73580fa72d6b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.208660', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7ef120-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.863607618, 'message_signature': '2c8b912ddcc6c4de05734b77f2062885594b54dbe4a06b204b6006b4b937a68d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.208660', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7efa26-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': '67e4a3f37c43249eba8b17ebeb5c8a49a403a59a9e575aeec4ca0287bad9cf8b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.208660', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7f02b4-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.875647082, 'message_signature': '020ac600cd578e8f9b30f91a56fd3d123387e5c9d690accd50ece61878761b5f'}]}, 'timestamp': '2025-10-08 16:38:36.209527', '_unique_id': '83fefa495403447cb7468bceeb6170b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.209 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.210 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.requests volume: 11675 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.210 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.211 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.requests volume: 11702 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.211 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea4cc945-2b50-4020-879b-92af0a5fccc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11675, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-vda', 'timestamp': '2025-10-08T16:38:36.210705', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7f3932-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': '9cead816e21f6ef7e6b70ab8cb64da17addb3461d9fcfcdc026ca7ec1fe5863f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb-sda', 'timestamp': '2025-10-08T16:38:36.210705', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'instance-00000061', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7f40da-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.799815194, 'message_signature': 'b4a9034a3cdc0e49e7e5517a7137e7525e672df556c26dfb9c47a725daf4a157'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11702, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-vda', 'timestamp': '2025-10-08T16:38:36.210705', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '3c7f4936-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': 'b4a9c1aeb5372b8fa0147bee9fd0cf61ac0149ff3e7f9f4d8fb93175b77ba9c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-sda', 'timestamp': '2025-10-08T16:38:36.210705', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'instance-00000063', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '3c7f50d4-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.820170073, 'message_signature': '507585e5b326b8367707718d69ce776fa438058b9ed6822ce5f7583691d62e82'}]}, 'timestamp': '2025-10-08 16:38:36.211536', '_unique_id': 'b28ba0048f1c47ecb4cabe82fb835b25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.212 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.outgoing.packets volume: 485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '143bee2f-d883-44da-958e-c170802d4047', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 97, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.212688', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c7f8752-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': '7dd0e9722cdee52f43a351d039de0dcf72f5e17f9a9f4e0ded15031ed0c2c196'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 485, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.212688', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c7f8fa4-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '01a81351c962c827c7a7f720b4071bd5c79d1f70d4f3ed1afdd46ef2d81d2c49'}]}, 'timestamp': '2025-10-08 16:38:36.213157', '_unique_id': 'de03b53c33ad4bb39886b9d0d1306e9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.213 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.214 12 DEBUG ceilometer.compute.pollsters [-] d2243acb-4779-458d-9e3f-4a5434d27bfb/network.outgoing.bytes.delta volume: 16005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.214 12 DEBUG ceilometer.compute.pollsters [-] 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3be73a9-003a-42cf-a118-6cd0e0eb95c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 16005, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000061-d2243acb-4779-458d-9e3f-4a5434d27bfb-tape3be517f-e6', 'timestamp': '2025-10-08T16:38:36.214412', 'resource_metadata': {'display_name': 'tempest-server-test-1966949606', 'name': 'tape3be517f-e6', 'instance_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:93:e1:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape3be517f-e6'}, 'message_id': '3c7fca96-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.842222686, 'message_signature': '6d96935aa8c7621c92459621a123b27959c209a165b693834fa7292f84c5d0ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000063-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-tap8f2597d2-a2', 'timestamp': '2025-10-08T16:38:36.214412', 'resource_metadata': {'display_name': 'tempest-server-test-294986936', 'name': 'tap8f2597d2-a2', 'instance_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:3b:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8f2597d2-a2'}, 'message_id': '3c7fd2d4-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8419.845083628, 'message_signature': '3ad4b27b936873b820ef4cb4ad987549fdede070759f7bbb022236335e61732f'}]}, 'timestamp': '2025-10-08 16:38:36.214851', '_unique_id': '9485580b287049e6a2a2cf01b41f92ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.216 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:38:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:38:36.216 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-294986936>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-294986936>]
Oct  8 12:38:36 np0005476733 nova_compute[192580]: 2025-10-08 16:38:36.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:36 np0005476733 podman[268097]: 2025-10-08 16:38:36.247819994 +0000 UTC m=+0.076059116 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:38:37 np0005476733 nova_compute[192580]: 2025-10-08 16:38:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:38 np0005476733 podman[268121]: 2025-10-08 16:38:38.227049268 +0000 UTC m=+0.061761721 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct  8 12:38:39 np0005476733 nova_compute[192580]: 2025-10-08 16:38:39.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:41 np0005476733 nova_compute[192580]: 2025-10-08 16:38:41.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:44 np0005476733 nova_compute[192580]: 2025-10-08 16:38:44.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:44 np0005476733 nova_compute[192580]: 2025-10-08 16:38:44.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:45 np0005476733 ovn_controller[263831]: 2025-10-08T16:38:45Z|00083|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Oct  8 12:38:46 np0005476733 nova_compute[192580]: 2025-10-08 16:38:46.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:46 np0005476733 nova_compute[192580]: 2025-10-08 16:38:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:46 np0005476733 nova_compute[192580]: 2025-10-08 16:38:46.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:38:47 np0005476733 podman[268143]: 2025-10-08 16:38:47.226692648 +0000 UTC m=+0.046762972 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9)
Oct  8 12:38:47 np0005476733 podman[268141]: 2025-10-08 16:38:47.252966556 +0000 UTC m=+0.080680864 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:38:47 np0005476733 podman[268142]: 2025-10-08 16:38:47.252986517 +0000 UTC m=+0.078645049 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:38:49 np0005476733 nova_compute[192580]: 2025-10-08 16:38:49.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:38:50Z|00084|pinctrl|WARN|Dropped 101 log messages in last 60 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:38:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:38:50Z|00085|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:38:51 np0005476733 nova_compute[192580]: 2025-10-08 16:38:51.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:52 np0005476733 nova_compute[192580]: 2025-10-08 16:38:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:53 np0005476733 nova_compute[192580]: 2025-10-08 16:38:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:53 np0005476733 nova_compute[192580]: 2025-10-08 16:38:53.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:38:54 np0005476733 nova_compute[192580]: 2025-10-08 16:38:54.676 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:38:54 np0005476733 nova_compute[192580]: 2025-10-08 16:38:54.677 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:38:54 np0005476733 nova_compute[192580]: 2025-10-08 16:38:54.677 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:38:54 np0005476733 nova_compute[192580]: 2025-10-08 16:38:54.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:55 np0005476733 podman[268209]: 2025-10-08 16:38:55.24943173 +0000 UTC m=+0.073637350 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:38:55 np0005476733 podman[268208]: 2025-10-08 16:38:55.271148642 +0000 UTC m=+0.095624600 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:38:56 np0005476733 nova_compute[192580]: 2025-10-08 16:38:56.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:38:56 np0005476733 nova_compute[192580]: 2025-10-08 16:38:56.711 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updating instance_info_cache with network_info: [{"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:38:56 np0005476733 nova_compute[192580]: 2025-10-08 16:38:56.735 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:38:56 np0005476733 nova_compute[192580]: 2025-10-08 16:38:56.735 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:38:56 np0005476733 nova_compute[192580]: 2025-10-08 16:38:56.736 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:57 np0005476733 nova_compute[192580]: 2025-10-08 16:38:57.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:38:59 np0005476733 nova_compute[192580]: 2025-10-08 16:38:59.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:01 np0005476733 nova_compute[192580]: 2025-10-08 16:39:01.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:04 np0005476733 podman[268250]: 2025-10-08 16:39:04.215153339 +0000 UTC m=+0.048430925 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:39:04 np0005476733 nova_compute[192580]: 2025-10-08 16:39:04.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:06 np0005476733 nova_compute[192580]: 2025-10-08 16:39:06.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:07 np0005476733 podman[268271]: 2025-10-08 16:39:07.243026592 +0000 UTC m=+0.076812890 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.759 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.759 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.759 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.760 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.836 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.892 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.894 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.949 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:39:07 np0005476733 nova_compute[192580]: 2025-10-08 16:39:07.955 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.033 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.034 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.090 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.239 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.241 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12022MB free_disk=111.02641677856445GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.241 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.241 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.353 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d2243acb-4779-458d-9e3f-4a5434d27bfb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.354 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.354 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.354 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.742 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.761 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.763 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:39:08 np0005476733 nova_compute[192580]: 2025-10-08 16:39:08.764 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:09 np0005476733 podman[268310]: 2025-10-08 16:39:09.240806628 +0000 UTC m=+0.067198964 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:39:09 np0005476733 nova_compute[192580]: 2025-10-08 16:39:09.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:11 np0005476733 nova_compute[192580]: 2025-10-08 16:39:11.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:11 np0005476733 nova_compute[192580]: 2025-10-08 16:39:11.765 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:14 np0005476733 nova_compute[192580]: 2025-10-08 16:39:14.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:15 np0005476733 nova_compute[192580]: 2025-10-08 16:39:15.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:16 np0005476733 nova_compute[192580]: 2025-10-08 16:39:16.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:18 np0005476733 podman[268332]: 2025-10-08 16:39:18.228048915 +0000 UTC m=+0.051698600 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:39:18 np0005476733 podman[268331]: 2025-10-08 16:39:18.233255261 +0000 UTC m=+0.059455247 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 12:39:18 np0005476733 podman[268333]: 2025-10-08 16:39:18.233157808 +0000 UTC m=+0.054539551 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:39:19 np0005476733 nova_compute[192580]: 2025-10-08 16:39:19.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:21 np0005476733 nova_compute[192580]: 2025-10-08 16:39:21.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:24 np0005476733 nova_compute[192580]: 2025-10-08 16:39:24.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:26 np0005476733 podman[268393]: 2025-10-08 16:39:26.26000776 +0000 UTC m=+0.081414397 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:39:26 np0005476733 podman[268392]: 2025-10-08 16:39:26.264258335 +0000 UTC m=+0.098050128 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 12:39:26 np0005476733 nova_compute[192580]: 2025-10-08 16:39:26.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:26.402 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:26.403 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:26.404 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:29 np0005476733 nova_compute[192580]: 2025-10-08 16:39:29.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:31 np0005476733 nova_compute[192580]: 2025-10-08 16:39:31.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:34 np0005476733 nova_compute[192580]: 2025-10-08 16:39:34.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:35 np0005476733 podman[268442]: 2025-10-08 16:39:35.216059928 +0000 UTC m=+0.045670077 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:39:36 np0005476733 nova_compute[192580]: 2025-10-08 16:39:36.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:38 np0005476733 podman[268461]: 2025-10-08 16:39:38.268135793 +0000 UTC m=+0.096300631 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Oct  8 12:39:39 np0005476733 nova_compute[192580]: 2025-10-08 16:39:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:39 np0005476733 nova_compute[192580]: 2025-10-08 16:39:39.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:40 np0005476733 podman[268488]: 2025-10-08 16:39:40.215568963 +0000 UTC m=+0.048869879 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.732 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.733 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.733 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.733 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.733 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.734 2 INFO nova.compute.manager [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Terminating instance#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.735 2 DEBUG nova.compute.manager [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:39:41 np0005476733 kernel: tap8f2597d2-a2 (unregistering): left promiscuous mode
Oct  8 12:39:41 np0005476733 NetworkManager[51699]: <info>  [1759941581.7648] device (tap8f2597d2-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:41Z|00086|binding|INFO|Releasing lport 8f2597d2-a2aa-4839-ac8f-aff700990b1d from this chassis (sb_readonly=0)
Oct  8 12:39:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:41Z|00087|binding|INFO|Setting lport 8f2597d2-a2aa-4839-ac8f-aff700990b1d down in Southbound
Oct  8 12:39:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:41Z|00088|binding|INFO|Removing iface tap8f2597d2-a2 ovn-installed in OVS
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:41.783 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:7e:24 10.100.0.11'], port_security=['fa:16:3e:3b:7e:24 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f54a9571-f531-4620-a544-cc4241429076, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=8f2597d2-a2aa-4839-ac8f-aff700990b1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:39:41 np0005476733 nova_compute[192580]: 2025-10-08 16:39:41.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:41.785 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 8f2597d2-a2aa-4839-ac8f-aff700990b1d in datapath 1496f4fb-5756-48b3-9df1-e6965ccbed85 unbound from our chassis#033[00m
Oct  8 12:39:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:41.787 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1496f4fb-5756-48b3-9df1-e6965ccbed85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:39:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:41.788 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c52bbf27-aec4-447e-8ec0-f0eb23dbab01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:41.789 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85 namespace which is not needed anymore#033[00m
Oct  8 12:39:41 np0005476733 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000063.scope: Deactivated successfully.
Oct  8 12:39:41 np0005476733 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000063.scope: Consumed 47.023s CPU time.
Oct  8 12:39:41 np0005476733 systemd-machined[152624]: Machine qemu-61-instance-00000063 terminated.
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [NOTICE]   (267523) : haproxy version is 2.8.14-c23fe91
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [NOTICE]   (267523) : path to executable is /usr/sbin/haproxy
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [WARNING]  (267523) : Exiting Master process...
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [WARNING]  (267523) : Exiting Master process...
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [ALERT]    (267523) : Current worker (267525) exited with code 143 (Terminated)
Oct  8 12:39:41 np0005476733 neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85[267519]: [WARNING]  (267523) : All workers exited. Exiting... (0)
Oct  8 12:39:41 np0005476733 systemd[1]: libpod-96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed.scope: Deactivated successfully.
Oct  8 12:39:41 np0005476733 podman[268534]: 2025-10-08 16:39:41.915991096 +0000 UTC m=+0.043190878 container died 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:39:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay-1e8919eda12b8425f01ca9b2ff863ad63793c200364cced3354d740d3613bbea-merged.mount: Deactivated successfully.
Oct  8 12:39:41 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed-userdata-shm.mount: Deactivated successfully.
Oct  8 12:39:41 np0005476733 podman[268534]: 2025-10-08 16:39:41.945617291 +0000 UTC m=+0.072817073 container cleanup 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:39:41 np0005476733 systemd[1]: libpod-conmon-96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed.scope: Deactivated successfully.
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.000 2 INFO nova.virt.libvirt.driver [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Instance destroyed successfully.#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.000 2 DEBUG nova.objects.instance [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'resources' on Instance uuid 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:39:42 np0005476733 podman[268567]: 2025-10-08 16:39:42.005056276 +0000 UTC m=+0.038749217 container remove 96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.010 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9e109dda-33ce-455d-9b95-5fa84be7205c]: (4, ('Wed Oct  8 04:39:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85 (96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed)\n96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed\nWed Oct  8 04:39:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85 (96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed)\n96396fc92f45a03b87d0c3ff68a43abcbb5ff0abbbbb54cb1826953becc913ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.012 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bb716909-eda1-43c7-a189-dbd6e8780c43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.013 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1496f4fb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:39:42 np0005476733 kernel: tap1496f4fb-50: left promiscuous mode
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.029 2 DEBUG nova.virt.libvirt.vif [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:37:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-294986936',display_name='tempest-server-test-294986936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-294986936',id=99,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:37:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-bzq9fy7c',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:37:07Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.029 2 DEBUG nova.network.os_vif_util [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "address": "fa:16:3e:3b:7e:24", "network": {"id": "1496f4fb-5756-48b3-9df1-e6965ccbed85", "bridge": "br-int", "label": "tempest-test-network--1104825919", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8f2597d2-a2", "ovs_interfaceid": "8f2597d2-a2aa-4839-ac8f-aff700990b1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.030 2 DEBUG nova.network.os_vif_util [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.031 2 DEBUG os_vif [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.031 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5fae4914-fb00-4e76-8c26-29cb5696ae2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f2597d2-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.044 2 INFO os_vif [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=8f2597d2-a2aa-4839-ac8f-aff700990b1d,network=Network(1496f4fb-5756-48b3-9df1-e6965ccbed85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8f2597d2-a2')#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.045 2 INFO nova.virt.libvirt.driver [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Deleting instance files /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0_del#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.045 2 INFO nova.virt.libvirt.driver [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Deletion of /var/lib/nova/instances/3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0_del complete#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.059 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec0b0fe-add7-41e9-90e8-ea0ced9a822e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.060 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cb274cce-f144-4bd8-ab44-bc98f69e7a2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.076 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b89d1e7c-7717-433f-a64b-ff4b13c26496]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 833056, 'reachable_time': 38210, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268599, 'error': None, 'target': 'ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 systemd[1]: run-netns-ovnmeta\x2d1496f4fb\x2d5756\x2d48b3\x2d9df1\x2de6965ccbed85.mount: Deactivated successfully.
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.080 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1496f4fb-5756-48b3-9df1-e6965ccbed85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:39:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:42.080 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[28c90020-b6b7-4b4a-93e7-db81874ca1b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.100 2 INFO nova.compute.manager [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.101 2 DEBUG oslo.service.loopingcall [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.101 2 DEBUG nova.compute.manager [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:39:42 np0005476733 nova_compute[192580]: 2025-10-08 16:39:42.101 2 DEBUG nova.network.neutron [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:39:44 np0005476733 nova_compute[192580]: 2025-10-08 16:39:44.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:44 np0005476733 nova_compute[192580]: 2025-10-08 16:39:44.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.221 2 DEBUG nova.compute.manager [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-unplugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.222 2 DEBUG oslo_concurrency.lockutils [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.222 2 DEBUG oslo_concurrency.lockutils [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.222 2 DEBUG oslo_concurrency.lockutils [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.222 2 DEBUG nova.compute.manager [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] No waiting events found dispatching network-vif-unplugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:39:45 np0005476733 nova_compute[192580]: 2025-10-08 16:39:45.223 2 DEBUG nova.compute.manager [req-91b60e8b-5db1-4d1b-ad01-36e2f42d6308 req-8666549c-7733-4bf5-945a-349ac5ec847b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-unplugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:39:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:46.262 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:39:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:46.263 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.343 2 DEBUG nova.network.neutron [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.362 2 INFO nova.compute.manager [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Took 4.26 seconds to deallocate network for instance.#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.407 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.407 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.500 2 DEBUG nova.compute.provider_tree [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.527 2 DEBUG nova.scheduler.client.report [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.552 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.581 2 INFO nova.scheduler.client.report [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Deleted allocations for instance 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.587 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:39:46 np0005476733 nova_compute[192580]: 2025-10-08 16:39:46.655 2 DEBUG oslo_concurrency.lockutils [None req-99477109-b97a-4d1f-8385-d01a826347cc de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.380 2 DEBUG nova.compute.manager [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.381 2 DEBUG oslo_concurrency.lockutils [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.381 2 DEBUG oslo_concurrency.lockutils [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.381 2 DEBUG oslo_concurrency.lockutils [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.381 2 DEBUG nova.compute.manager [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] No waiting events found dispatching network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.382 2 WARNING nova.compute.manager [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received unexpected event network-vif-plugged-8f2597d2-a2aa-4839-ac8f-aff700990b1d for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:39:47 np0005476733 nova_compute[192580]: 2025-10-08 16:39:47.382 2 DEBUG nova.compute.manager [req-5f8a6f3a-f577-4c53-ae80-4915b2be8ff9 req-48ac594a-a057-42a5-893c-38ce7ff62966 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Received event network-vif-deleted-8f2597d2-a2aa-4839-ac8f-aff700990b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:48.265 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:39:49 np0005476733 podman[268601]: 2025-10-08 16:39:49.228758886 +0000 UTC m=+0.056213173 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:39:49 np0005476733 podman[268600]: 2025-10-08 16:39:49.23575954 +0000 UTC m=+0.063488056 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:39:49 np0005476733 podman[268602]: 2025-10-08 16:39:49.246946387 +0000 UTC m=+0.067186523 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:39:49 np0005476733 nova_compute[192580]: 2025-10-08 16:39:49.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:50Z|00089|pinctrl|WARN|Dropped 203 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:39:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:50Z|00090|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:39:52 np0005476733 nova_compute[192580]: 2025-10-08 16:39:52.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:52 np0005476733 nova_compute[192580]: 2025-10-08 16:39:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.110 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.110 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.110 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.111 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.111 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.112 2 INFO nova.compute.manager [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Terminating instance#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.113 2 DEBUG nova.compute.manager [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:39:54 np0005476733 kernel: tape3be517f-e6 (unregistering): left promiscuous mode
Oct  8 12:39:54 np0005476733 NetworkManager[51699]: <info>  [1759941594.1488] device (tape3be517f-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:54Z|00091|binding|INFO|Releasing lport e3be517f-e639-485a-ab91-7dc0a7be7433 from this chassis (sb_readonly=0)
Oct  8 12:39:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:54Z|00092|binding|INFO|Setting lport e3be517f-e639-485a-ab91-7dc0a7be7433 down in Southbound
Oct  8 12:39:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:39:54Z|00093|binding|INFO|Removing iface tape3be517f-e6 ovn-installed in OVS
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.160 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:e1:e1 192.168.122.208'], port_security=['fa:16:3e:93:e1:e1 192.168.122.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.208/24', 'neutron:device_id': 'd2243acb-4779-458d-9e3f-4a5434d27bfb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=e3be517f-e639-485a-ab91-7dc0a7be7433) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.161 103739 INFO neutron.agent.ovn.metadata.agent [-] Port e3be517f-e639-485a-ab91-7dc0a7be7433 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.162 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.163 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2e698b93-79cb-4406-8b2d-64786e1312f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.164 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000061.scope: Deactivated successfully.
Oct  8 12:39:54 np0005476733 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000061.scope: Consumed 45.160s CPU time.
Oct  8 12:39:54 np0005476733 systemd-machined[152624]: Machine qemu-60-instance-00000061 terminated.
Oct  8 12:39:54 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [NOTICE]   (267168) : haproxy version is 2.8.14-c23fe91
Oct  8 12:39:54 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [NOTICE]   (267168) : path to executable is /usr/sbin/haproxy
Oct  8 12:39:54 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [ALERT]    (267168) : Current worker (267170) exited with code 143 (Terminated)
Oct  8 12:39:54 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[267163]: [WARNING]  (267168) : All workers exited. Exiting... (0)
Oct  8 12:39:54 np0005476733 systemd[1]: libpod-af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad.scope: Deactivated successfully.
Oct  8 12:39:54 np0005476733 podman[268695]: 2025-10-08 16:39:54.316204496 +0000 UTC m=+0.050690958 container died af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:39:54 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad-userdata-shm.mount: Deactivated successfully.
Oct  8 12:39:54 np0005476733 systemd[1]: var-lib-containers-storage-overlay-3f415434c086b7662bf565b07aba6692bae4b35f8fc8e3467046183891911f8c-merged.mount: Deactivated successfully.
Oct  8 12:39:54 np0005476733 podman[268695]: 2025-10-08 16:39:54.356009465 +0000 UTC m=+0.090495927 container cleanup af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:39:54 np0005476733 systemd[1]: libpod-conmon-af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad.scope: Deactivated successfully.
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.407 2 INFO nova.virt.libvirt.driver [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Instance destroyed successfully.#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.407 2 DEBUG nova.objects.instance [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'resources' on Instance uuid d2243acb-4779-458d-9e3f-4a5434d27bfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:39:54 np0005476733 podman[268736]: 2025-10-08 16:39:54.429581202 +0000 UTC m=+0.049701537 container remove af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[efdbf02b-6c4a-446e-9bb6-855444de1cfc]: (4, ('Wed Oct  8 04:39:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad)\naf1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad\nWed Oct  8 04:39:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (af1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad)\naf1dbaebf46327434827efb1609ff138d3a5f6744a8eaebaacf18f4e661bddad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.436 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[559191b6-40fd-41c8-aab0-b358abe704dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.437 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.472 2 DEBUG nova.compute.manager [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-unplugged-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.472 2 DEBUG oslo_concurrency.lockutils [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.473 2 DEBUG oslo_concurrency.lockutils [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.473 2 DEBUG oslo_concurrency.lockutils [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.473 2 DEBUG nova.compute.manager [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] No waiting events found dispatching network-vif-unplugged-e3be517f-e639-485a-ab91-7dc0a7be7433 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.473 2 DEBUG nova.compute.manager [req-7a561221-2b72-44dc-a26e-0fe5ccedc740 req-ebb2fc09-6335-455c-b684-dbad7bfe0c68 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-unplugged-e3be517f-e639-485a-ab91-7dc0a7be7433 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:39:54 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.474 2 DEBUG nova.virt.libvirt.vif [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:36:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1966949606',display_name='tempest-server-test-1966949606',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1966949606',id=97,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:36:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-fbooillz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:36:33Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=d2243acb-4779-458d-9e3f-4a5434d27bfb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.474 2 DEBUG nova.network.os_vif_util [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "e3be517f-e639-485a-ab91-7dc0a7be7433", "address": "fa:16:3e:93:e1:e1", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1312, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3be517f-e6", "ovs_interfaceid": "e3be517f-e639-485a-ab91-7dc0a7be7433", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.475 2 DEBUG nova.network.os_vif_util [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.475 2 DEBUG os_vif [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.478 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3be517f-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.495 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e356ea96-984e-4537-8cb8-7893f6a3dd7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.497 2 INFO os_vif [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:e1:e1,bridge_name='br-int',has_traffic_filtering=True,id=e3be517f-e639-485a-ab91-7dc0a7be7433,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3be517f-e6')#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.498 2 INFO nova.virt.libvirt.driver [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Deleting instance files /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb_del#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.498 2 INFO nova.virt.libvirt.driver [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Deletion of /var/lib/nova/instances/d2243acb-4779-458d-9e3f-4a5434d27bfb_del complete#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.523 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[047f919a-e295-44f5-8168-e87174d2614d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.524 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e1ec4a-bba5-4c1f-890a-58a639109957]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.542 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6440e1-3d1c-4582-996f-fa4c5e616f87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 829585, 'reachable_time': 39571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268758, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.545 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:39:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:39:54.545 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[dc956b4c-f8e0-44a5-9d0e-26e766578f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.578 2 INFO nova.compute.manager [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.579 2 DEBUG oslo.service.loopingcall [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.579 2 DEBUG nova.compute.manager [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.579 2 DEBUG nova.network.neutron [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.618 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.619 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:39:54 np0005476733 nova_compute[192580]: 2025-10-08 16:39:54.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:55 np0005476733 nova_compute[192580]: 2025-10-08 16:39:55.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.173 2 DEBUG nova.network.neutron [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.193 2 INFO nova.compute.manager [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Took 1.61 seconds to deallocate network for instance.#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.236 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.236 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.298 2 DEBUG nova.compute.provider_tree [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.312 2 DEBUG nova.scheduler.client.report [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.335 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.360 2 INFO nova.scheduler.client.report [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Deleted allocations for instance d2243acb-4779-458d-9e3f-4a5434d27bfb#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.459 2 DEBUG oslo_concurrency.lockutils [None req-c3cfb2eb-8bf9-40a5-a177-5a817295c6e4 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.571 2 DEBUG nova.compute.manager [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.572 2 DEBUG oslo_concurrency.lockutils [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.572 2 DEBUG oslo_concurrency.lockutils [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.572 2 DEBUG oslo_concurrency.lockutils [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2243acb-4779-458d-9e3f-4a5434d27bfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.573 2 DEBUG nova.compute.manager [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] No waiting events found dispatching network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.573 2 WARNING nova.compute.manager [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received unexpected event network-vif-plugged-e3be517f-e639-485a-ab91-7dc0a7be7433 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:39:56 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.573 2 DEBUG nova.compute.manager [req-7b5e4608-11f3-4515-957b-059280749c01 req-ffd3f293-00f5-4270-b1df-8338c3ec3a4c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Received event network-vif-deleted-e3be517f-e639-485a-ab91-7dc0a7be7433 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:39:57 np0005476733 nova_compute[192580]: 2025-10-08 16:39:56.999 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941581.9973824, 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:39:57 np0005476733 nova_compute[192580]: 2025-10-08 16:39:57.000 2 INFO nova.compute.manager [-] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:39:57 np0005476733 nova_compute[192580]: 2025-10-08 16:39:57.032 2 DEBUG nova.compute.manager [None req-901b03e7-19a0-4d0c-9c9a-4c07816170ab - - - - - -] [instance: 3ec1fb0e-0c8b-4305-af1a-8cadddf93dd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:39:57 np0005476733 podman[268759]: 2025-10-08 16:39:57.244661099 +0000 UTC m=+0.070942993 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  8 12:39:57 np0005476733 podman[268760]: 2025-10-08 16:39:57.259299036 +0000 UTC m=+0.074925930 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:39:58 np0005476733 nova_compute[192580]: 2025-10-08 16:39:58.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:39:59 np0005476733 nova_compute[192580]: 2025-10-08 16:39:59.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:39:59 np0005476733 nova_compute[192580]: 2025-10-08 16:39:59.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:04 np0005476733 nova_compute[192580]: 2025-10-08 16:40:04.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:04 np0005476733 nova_compute[192580]: 2025-10-08 16:40:04.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:05 np0005476733 nova_compute[192580]: 2025-10-08 16:40:05.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:06 np0005476733 podman[268803]: 2025-10-08 16:40:06.255926441 +0000 UTC m=+0.081709437 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.622 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:40:08 np0005476733 podman[268822]: 2025-10-08 16:40:08.756147659 +0000 UTC m=+0.095315321 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.781 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.782 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13656MB free_disk=111.31285095214844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.782 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.782 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.882 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.883 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.904 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.922 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.950 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:40:08 np0005476733 nova_compute[192580]: 2025-10-08 16:40:08.951 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:09 np0005476733 nova_compute[192580]: 2025-10-08 16:40:09.407 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941594.4048493, d2243acb-4779-458d-9e3f-4a5434d27bfb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:40:09 np0005476733 nova_compute[192580]: 2025-10-08 16:40:09.407 2 INFO nova.compute.manager [-] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:40:09 np0005476733 nova_compute[192580]: 2025-10-08 16:40:09.431 2 DEBUG nova.compute.manager [None req-2e6a8799-ed88-4191-b8cd-c49372589dfc - - - - - -] [instance: d2243acb-4779-458d-9e3f-4a5434d27bfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:40:09 np0005476733 nova_compute[192580]: 2025-10-08 16:40:09.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:09 np0005476733 nova_compute[192580]: 2025-10-08 16:40:09.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:11 np0005476733 podman[268849]: 2025-10-08 16:40:11.26150572 +0000 UTC m=+0.091831279 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:40:11 np0005476733 nova_compute[192580]: 2025-10-08 16:40:11.951 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:14 np0005476733 nova_compute[192580]: 2025-10-08 16:40:14.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:14 np0005476733 nova_compute[192580]: 2025-10-08 16:40:14.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:19 np0005476733 nova_compute[192580]: 2025-10-08 16:40:19.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:19 np0005476733 nova_compute[192580]: 2025-10-08 16:40:19.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:20 np0005476733 podman[268871]: 2025-10-08 16:40:20.232464327 +0000 UTC m=+0.058203927 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:40:20 np0005476733 podman[268872]: 2025-10-08 16:40:20.244042867 +0000 UTC m=+0.069062634 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:40:20 np0005476733 podman[268870]: 2025-10-08 16:40:20.251510454 +0000 UTC m=+0.083483823 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:40:24 np0005476733 nova_compute[192580]: 2025-10-08 16:40:24.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:24 np0005476733 nova_compute[192580]: 2025-10-08 16:40:24.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:26.407 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:26.408 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:26.408 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:28 np0005476733 podman[268934]: 2025-10-08 16:40:28.230879192 +0000 UTC m=+0.058515897 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 12:40:28 np0005476733 podman[268935]: 2025-10-08 16:40:28.24715381 +0000 UTC m=+0.064287251 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:40:29 np0005476733 nova_compute[192580]: 2025-10-08 16:40:29.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:30 np0005476733 nova_compute[192580]: 2025-10-08 16:40:30.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.142 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.142 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.160 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.244 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.245 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.253 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.253 2 INFO nova.compute.claims [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.459 2 DEBUG nova.compute.provider_tree [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.480 2 DEBUG nova.scheduler.client.report [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.500 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.501 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.561 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.561 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.581 2 INFO nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.601 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.693 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.694 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.695 2 INFO nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Creating image(s)#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.696 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.697 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.697 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.709 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.767 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.768 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.768 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.780 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.838 2 DEBUG nova.policy [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.840 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.842 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.875 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk 10737418240" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.876 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.877 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.932 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.934 2 DEBUG nova.objects.instance [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'migration_context' on Instance uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.953 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.953 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Ensure instance console log exists: /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.954 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.954 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:31 np0005476733 nova_compute[192580]: 2025-10-08 16:40:31.954 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:32 np0005476733 nova_compute[192580]: 2025-10-08 16:40:32.768 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Successfully created port: f66cbaf5-fd00-4780-a065-fdcf7175d68c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.454 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Successfully updated port: f66cbaf5-fd00-4780-a065-fdcf7175d68c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.470 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.471 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.471 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.563 2 DEBUG nova.compute.manager [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.563 2 DEBUG nova.compute.manager [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing instance network info cache due to event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.563 2 DEBUG oslo_concurrency.lockutils [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:34 np0005476733 nova_compute[192580]: 2025-10-08 16:40:34.628 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.811 2 DEBUG nova.network.neutron [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.832 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.832 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Instance network_info: |[{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.832 2 DEBUG oslo_concurrency.lockutils [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.833 2 DEBUG nova.network.neutron [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.835 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Start _get_guest_xml network_info=[{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.839 2 WARNING nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.845 2 DEBUG nova.virt.libvirt.host [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.845 2 DEBUG nova.virt.libvirt.host [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.852 2 DEBUG nova.virt.libvirt.host [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.852 2 DEBUG nova.virt.libvirt.host [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.853 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.853 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.853 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.854 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.855 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.855 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.855 2 DEBUG nova.virt.hardware [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.859 2 DEBUG nova.virt.libvirt.vif [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:40:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1528954241',display_name='tempest-server-test-1528954241',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1528954241',id=101,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-86pbza36',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:40:31Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=f5367a5f-b3ff-45c9-b61a-a87626cfcb03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.859 2 DEBUG nova.network.os_vif_util [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.859 2 DEBUG nova.network.os_vif_util [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.860 2 DEBUG nova.objects.instance [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'pci_devices' on Instance uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.875 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <uuid>f5367a5f-b3ff-45c9-b61a-a87626cfcb03</uuid>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <name>instance-00000065</name>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1528954241</nova:name>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:40:35</nova:creationTime>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:user uuid="de0012a12c1645bfb620caa34110c3f4">tempest-GatewayMtuTestUdp-187807839-project-member</nova:user>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:project uuid="4726c8b7a2a3405b9b2d689862918f5d">tempest-GatewayMtuTestUdp-187807839</nova:project>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        <nova:port uuid="f66cbaf5-fd00-4780-a065-fdcf7175d68c">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="serial">f5367a5f-b3ff-45c9-b61a-a87626cfcb03</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="uuid">f5367a5f-b3ff-45c9-b61a-a87626cfcb03</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.config"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:e7:3a:92"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <target dev="tapf66cbaf5-fd"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/console.log" append="off"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:40:35 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:40:35 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:40:35 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:40:35 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.875 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Preparing to wait for external event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.875 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.876 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.876 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.877 2 DEBUG nova.virt.libvirt.vif [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:40:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1528954241',display_name='tempest-server-test-1528954241',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1528954241',id=101,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-86pbza36',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:40:31Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=f5367a5f-b3ff-45c9-b61a-a87626cfcb03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.877 2 DEBUG nova.network.os_vif_util [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.877 2 DEBUG nova.network.os_vif_util [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.878 2 DEBUG os_vif [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.878 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.879 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf66cbaf5-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf66cbaf5-fd, col_values=(('external_ids', {'iface-id': 'f66cbaf5-fd00-4780-a065-fdcf7175d68c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:3a:92', 'vm-uuid': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:35 np0005476733 NetworkManager[51699]: <info>  [1759941635.8834] manager: (tapf66cbaf5-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.888 2 INFO os_vif [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd')#033[00m
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.938 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.939 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.939 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No VIF found with MAC fa:16:3e:e7:3a:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 12:40:35 np0005476733 nova_compute[192580]: 2025-10-08 16:40:35.939 2 INFO nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Using config drive
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.073 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'name': 'tempest-server-test-1528954241', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000065', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.074 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.074 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.075 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.075 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.075 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1528954241>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1528954241>]
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.076 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.077 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.077 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.078 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.078 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.079 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.080 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1528954241>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1528954241>]
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.081 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1528954241>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1528954241>]
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.082 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.083 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.084 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.084 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.085 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.085 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.085 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.086 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.086 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.086 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1528954241>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1528954241>]
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.087 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:40:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:40:36.087 12 DEBUG ceilometer.compute.pollsters [-] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000065, id=f5367a5f-b3ff-45c9-b61a-a87626cfcb03>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.033 2 INFO nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Creating config drive at /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.config
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.040 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpisamuvis execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.174 2 DEBUG oslo_concurrency.processutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpisamuvis" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:40:37 np0005476733 podman[268997]: 2025-10-08 16:40:37.211252838 +0000 UTC m=+0.045063648 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.2456] manager: (tapf66cbaf5-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Oct  8 12:40:37 np0005476733 kernel: tapf66cbaf5-fd: entered promiscuous mode
Oct  8 12:40:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:37Z|00094|binding|INFO|Claiming lport f66cbaf5-fd00-4780-a065-fdcf7175d68c for this chassis.
Oct  8 12:40:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:37Z|00095|binding|INFO|f66cbaf5-fd00-4780-a065-fdcf7175d68c: Claiming fa:16:3e:e7:3a:92 10.100.0.21
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.255 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:3a:92 10.100.0.21'], port_security=['fa:16:3e:e7:3a:92 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11840178-7e45-489c-af98-e0bcd5dab024, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f66cbaf5-fd00-4780-a065-fdcf7175d68c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.256 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f66cbaf5-fd00-4780-a065-fdcf7175d68c in datapath aef581c1-ee97-4e74-a1f4-beb582e7a3d5 bound to our chassis
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.257 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aef581c1-ee97-4e74-a1f4-beb582e7a3d5
Oct  8 12:40:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:37Z|00096|binding|INFO|Setting lport f66cbaf5-fd00-4780-a065-fdcf7175d68c ovn-installed in OVS
Oct  8 12:40:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:37Z|00097|binding|INFO|Setting lport f66cbaf5-fd00-4780-a065-fdcf7175d68c up in Southbound
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.271 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9509b865-4505-4903-b7ba-777da481e5b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.272 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaef581c1-e1 in ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  8 12:40:37 np0005476733 systemd-udevd[269028]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.274 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaef581c1-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.274 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[590f7194-f995-4c73-a70c-d54ae741f10b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.276 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1c553b3d-c0a7-4835-9d6e-f58b88a34c08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.2879] device (tapf66cbaf5-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.2886] device (tapf66cbaf5-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.287 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5fcc9c-87c9-4603-a2b8-19b111cb9786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 systemd-machined[152624]: New machine qemu-62-instance-00000065.
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.302 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eed43074-91b4-43cf-bd2e-5469629611d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 systemd[1]: Started Virtual Machine qemu-62-instance-00000065.
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.331 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b8956c-778b-44c7-b7ff-1be87a60270f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.3374] manager: (tapaef581c1-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/311)
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.336 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4458af02-96d7-484c-a1d9-5bbd9d927526]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.371 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6cb992-ecc7-4b05-9758-d2cf5907ebc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.374 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a844dd23-dfb5-4449-9579-ff58912fd050]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.4002] device (tapaef581c1-e0): carrier: link connected
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.403 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd2fe26-d8db-4a0b-ab4d-9826bffd33d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.419 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe3248c-dd5a-443c-8c92-b075913396e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef581c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:43:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 854106, 'reachable_time': 16216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269063, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.434 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ea06a633-ae88-4f32-bb35-c5a6c60ccdbc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:4355'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 854106, 'tstamp': 854106}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269064, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.451 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e03379b7-454d-4351-917a-28af3554d1b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef581c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:43:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 854106, 'reachable_time': 16216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269065, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.485 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[276dab78-5077-4636-b5ce-c25a7b2a7d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.543 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2babdd96-885e-4709-8a19-998672890bb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.545 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef581c1-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.545 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.545 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaef581c1-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:37 np0005476733 NetworkManager[51699]: <info>  [1759941637.5484] manager: (tapaef581c1-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:37 np0005476733 kernel: tapaef581c1-e0: entered promiscuous mode
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.554 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaef581c1-e0, col_values=(('external_ids', {'iface-id': '87673df0-8166-4107-b1eb-9c37811630b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.557 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aef581c1-ee97-4e74-a1f4-beb582e7a3d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aef581c1-ee97-4e74-a1f4-beb582e7a3d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:40:37 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:37Z|00098|binding|INFO|Releasing lport 87673df0-8166-4107-b1eb-9c37811630b4 from this chassis (sb_readonly=0)
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.559 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f417657-17cd-453e-b722-de9968167ef2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.560 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-aef581c1-ee97-4e74-a1f4-beb582e7a3d5
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/aef581c1-ee97-4e74-a1f4-beb582e7a3d5.pid.haproxy
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID aef581c1-ee97-4e74-a1f4-beb582e7a3d5
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:40:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:37.560 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'env', 'PROCESS_TAG=haproxy-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aef581c1-ee97-4e74-a1f4-beb582e7a3d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:37 np0005476733 podman[269101]: 2025-10-08 16:40:37.942166245 +0000 UTC m=+0.050326495 container create e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.978 2 DEBUG nova.compute.manager [req-0a95586a-1035-40b6-a266-34baf4b0a0ab req-e95337b7-0c36-4a1d-9420-2de1e1ce1245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.981 2 DEBUG oslo_concurrency.lockutils [req-0a95586a-1035-40b6-a266-34baf4b0a0ab req-e95337b7-0c36-4a1d-9420-2de1e1ce1245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.982 2 DEBUG oslo_concurrency.lockutils [req-0a95586a-1035-40b6-a266-34baf4b0a0ab req-e95337b7-0c36-4a1d-9420-2de1e1ce1245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.982 2 DEBUG oslo_concurrency.lockutils [req-0a95586a-1035-40b6-a266-34baf4b0a0ab req-e95337b7-0c36-4a1d-9420-2de1e1ce1245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:37 np0005476733 nova_compute[192580]: 2025-10-08 16:40:37.982 2 DEBUG nova.compute.manager [req-0a95586a-1035-40b6-a266-34baf4b0a0ab req-e95337b7-0c36-4a1d-9420-2de1e1ce1245 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Processing event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:40:37 np0005476733 systemd[1]: Started libpod-conmon-e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3.scope.
Oct  8 12:40:38 np0005476733 podman[269101]: 2025-10-08 16:40:37.918114259 +0000 UTC m=+0.026274539 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:40:38 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:40:38 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaf71acde083c0b8cff0903864aa7dc194679dbc44a20b8775712248efa6be5e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:40:38 np0005476733 podman[269101]: 2025-10-08 16:40:38.045687507 +0000 UTC m=+0.153847777 container init e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 12:40:38 np0005476733 podman[269101]: 2025-10-08 16:40:38.053082582 +0000 UTC m=+0.161242832 container start e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:40:38 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [NOTICE]   (269121) : New worker (269123) forked
Oct  8 12:40:38 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [NOTICE]   (269121) : Loading success.
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.164 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.164 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941638.1634636, f5367a5f-b3ff-45c9-b61a-a87626cfcb03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.165 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] VM Started (Lifecycle Event)#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.169 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.173 2 INFO nova.virt.libvirt.driver [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Instance spawned successfully.#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.174 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.189 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.197 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.202 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.203 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.204 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.204 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.205 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.205 2 DEBUG nova.virt.libvirt.driver [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.219 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.219 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941638.165166, f5367a5f-b3ff-45c9-b61a-a87626cfcb03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.219 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.243 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.247 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941638.167373, f5367a5f-b3ff-45c9-b61a-a87626cfcb03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.247 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.281 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.285 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.295 2 INFO nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Took 6.60 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.295 2 DEBUG nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.309 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.358 2 INFO nova.compute.manager [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Took 7.15 seconds to build instance.#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.378 2 DEBUG oslo_concurrency.lockutils [None req-157ad658-a96e-4ea4-9b29-dc8bc15b3d87 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.831 2 DEBUG nova.network.neutron [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updated VIF entry in instance network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.832 2 DEBUG nova.network.neutron [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:40:38 np0005476733 nova_compute[192580]: 2025-10-08 16:40:38.848 2 DEBUG oslo_concurrency.lockutils [req-91c63d6e-2e48-4337-83eb-8702bd2e2beb req-213ecc69-1905-4025-a40f-0b702ec89989 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:40:39 np0005476733 podman[269132]: 2025-10-08 16:40:39.259542814 +0000 UTC m=+0.088021698 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:40:39 np0005476733 nova_compute[192580]: 2025-10-08 16:40:39.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.062 2 DEBUG nova.compute.manager [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.063 2 DEBUG oslo_concurrency.lockutils [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.063 2 DEBUG oslo_concurrency.lockutils [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.063 2 DEBUG oslo_concurrency.lockutils [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.063 2 DEBUG nova.compute.manager [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] No waiting events found dispatching network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.063 2 WARNING nova.compute.manager [req-32d7ab2e-3a06-43ee-8ac6-63f22b0ef501 req-6323e7c7-36a1-4877-b22d-d484096d49f1 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received unexpected event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c for instance with vm_state active and task_state None.#033[00m
Oct  8 12:40:40 np0005476733 nova_compute[192580]: 2025-10-08 16:40:40.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:42 np0005476733 nova_compute[192580]: 2025-10-08 16:40:42.054 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:42 np0005476733 podman[269159]: 2025-10-08 16:40:42.245676597 +0000 UTC m=+0.069701484 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 12:40:44 np0005476733 nova_compute[192580]: 2025-10-08 16:40:44.979 2 DEBUG nova.compute.manager [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:40:44 np0005476733 nova_compute[192580]: 2025-10-08 16:40:44.979 2 DEBUG nova.compute.manager [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing instance network info cache due to event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:40:44 np0005476733 nova_compute[192580]: 2025-10-08 16:40:44.979 2 DEBUG oslo_concurrency.lockutils [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:44 np0005476733 nova_compute[192580]: 2025-10-08 16:40:44.980 2 DEBUG oslo_concurrency.lockutils [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:44 np0005476733 nova_compute[192580]: 2025-10-08 16:40:44.980 2 DEBUG nova.network.neutron [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:40:45 np0005476733 nova_compute[192580]: 2025-10-08 16:40:45.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:45 np0005476733 nova_compute[192580]: 2025-10-08 16:40:45.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:46 np0005476733 nova_compute[192580]: 2025-10-08 16:40:46.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:47 np0005476733 nova_compute[192580]: 2025-10-08 16:40:47.986 2 DEBUG nova.network.neutron [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updated VIF entry in instance network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:40:47 np0005476733 nova_compute[192580]: 2025-10-08 16:40:47.987 2 DEBUG nova.network.neutron [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:40:48 np0005476733 nova_compute[192580]: 2025-10-08 16:40:48.010 2 DEBUG oslo_concurrency.lockutils [req-8f723d4f-57ea-41f1-9341-b9412a8f705e req-70ecd6fa-33e0-4ce8-9d28-0490b1019f86 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:40:48 np0005476733 nova_compute[192580]: 2025-10-08 16:40:48.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:48 np0005476733 nova_compute[192580]: 2025-10-08 16:40:48.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.157 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.158 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.180 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.262 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.263 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.272 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.273 2 INFO nova.compute.claims [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.415 2 DEBUG nova.compute.provider_tree [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.433 2 DEBUG nova.scheduler.client.report [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.470 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.471 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.518 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.518 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.541 2 INFO nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.561 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.663 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.665 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.665 2 INFO nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Creating image(s)#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.666 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.666 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.667 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.678 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.739 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.740 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.740 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.752 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.809 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.810 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.842 2 DEBUG nova.policy [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.846 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk 10737418240" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.847 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.847 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.913 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:49 np0005476733 nova_compute[192580]: 2025-10-08 16:40:49.914 2 DEBUG nova.objects.instance [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'migration_context' on Instance uuid 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.049 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.050 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Ensure instance console log exists: /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.050 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.051 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.051 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:50.487 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:40:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:50.488 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.624 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Successfully created port: 39736fa3-776f-4ab7-8d41-b774edd28a9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:40:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:50Z|00099|pinctrl|WARN|Dropped 693 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 12:40:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:50Z|00100|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:40:50 np0005476733 nova_compute[192580]: 2025-10-08 16:40:50.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:51 np0005476733 podman[269193]: 2025-10-08 16:40:51.24943219 +0000 UTC m=+0.067625508 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 12:40:51 np0005476733 podman[269195]: 2025-10-08 16:40:51.254977656 +0000 UTC m=+0.065405107 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct  8 12:40:51 np0005476733 podman[269194]: 2025-10-08 16:40:51.26574411 +0000 UTC m=+0.083539925 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:40:53 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:53.496 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:53 np0005476733 nova_compute[192580]: 2025-10-08 16:40:53.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.657 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Successfully updated port: 39736fa3-776f-4ab7-8d41-b774edd28a9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.682 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.683 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquired lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.684 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.784 2 DEBUG nova.compute.manager [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-changed-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.784 2 DEBUG nova.compute.manager [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Refreshing instance network info cache due to event network-changed-39736fa3-776f-4ab7-8d41-b774edd28a9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.785 2 DEBUG oslo_concurrency.lockutils [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.878 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.879 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.879 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.879 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.888 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:40:55 np0005476733 nova_compute[192580]: 2025-10-08 16:40:55.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.892 2 DEBUG nova.network.neutron [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updating instance_info_cache with network_info: [{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.931 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Releasing lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.931 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Instance network_info: |[{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.932 2 DEBUG oslo_concurrency.lockutils [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.932 2 DEBUG nova.network.neutron [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Refreshing network info cache for port 39736fa3-776f-4ab7-8d41-b774edd28a9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.934 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Start _get_guest_xml network_info=[{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.938 2 WARNING nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.944 2 DEBUG nova.virt.libvirt.host [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.944 2 DEBUG nova.virt.libvirt.host [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.948 2 DEBUG nova.virt.libvirt.host [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.948 2 DEBUG nova.virt.libvirt.host [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.948 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.949 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.949 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.949 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.950 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.950 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.950 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.950 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.950 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.951 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.951 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.951 2 DEBUG nova.virt.hardware [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.954 2 DEBUG nova.virt.libvirt.vif [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:40:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1820634934',display_name='tempest-server-test-1820634934',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1820634934',id=102,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-l22g7rkv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10'
,image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:40:49Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=8bb9fbe7-3f99-4a38-b5b6-2943a5dead35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.955 2 DEBUG nova.network.os_vif_util [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.956 2 DEBUG nova.network.os_vif_util [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.956 2 DEBUG nova.objects.instance [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'pci_devices' on Instance uuid 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.977 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <uuid>8bb9fbe7-3f99-4a38-b5b6-2943a5dead35</uuid>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <name>instance-00000066</name>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1820634934</nova:name>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:40:57</nova:creationTime>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:user uuid="de0012a12c1645bfb620caa34110c3f4">tempest-GatewayMtuTestUdp-187807839-project-member</nova:user>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:project uuid="4726c8b7a2a3405b9b2d689862918f5d">tempest-GatewayMtuTestUdp-187807839</nova:project>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        <nova:port uuid="39736fa3-776f-4ab7-8d41-b774edd28a9f">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="serial">8bb9fbe7-3f99-4a38-b5b6-2943a5dead35</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="uuid">8bb9fbe7-3f99-4a38-b5b6-2943a5dead35</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.config"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:2b:51:5d"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <target dev="tap39736fa3-77"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/console.log" append="off"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:40:57 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:40:57 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:40:57 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:40:57 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.978 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Preparing to wait for external event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.979 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.979 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.979 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.980 2 DEBUG nova.virt.libvirt.vif [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:40:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1820634934',display_name='tempest-server-test-1820634934',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1820634934',id=102,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-l22g7rkv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min
_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:40:49Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=8bb9fbe7-3f99-4a38-b5b6-2943a5dead35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.980 2 DEBUG nova.network.os_vif_util [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.981 2 DEBUG nova.network.os_vif_util [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.981 2 DEBUG os_vif [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39736fa3-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39736fa3-77, col_values=(('external_ids', {'iface-id': '39736fa3-776f-4ab7-8d41-b774edd28a9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:51:5d', 'vm-uuid': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:40:57 np0005476733 NetworkManager[51699]: <info>  [1759941657.9912] manager: (tap39736fa3-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:57 np0005476733 nova_compute[192580]: 2025-10-08 16:40:57.997 2 INFO os_vif [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77')#033[00m
Oct  8 12:40:58 np0005476733 nova_compute[192580]: 2025-10-08 16:40:58.086 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:40:58 np0005476733 nova_compute[192580]: 2025-10-08 16:40:58.087 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:40:58 np0005476733 nova_compute[192580]: 2025-10-08 16:40:58.087 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] No VIF found with MAC fa:16:3e:2b:51:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:40:58 np0005476733 nova_compute[192580]: 2025-10-08 16:40:58.088 2 INFO nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Using config drive#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.179 2 INFO nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Creating config drive at /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.config#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.184 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl3si8_om execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:40:59 np0005476733 podman[269262]: 2025-10-08 16:40:59.229626661 +0000 UTC m=+0.049593432 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:40:59 np0005476733 podman[269261]: 2025-10-08 16:40:59.233134353 +0000 UTC m=+0.055086497 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.241 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.278 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.279 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.279 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.309 2 DEBUG oslo_concurrency.processutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl3si8_om" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:40:59 np0005476733 NetworkManager[51699]: <info>  [1759941659.3958] manager: (tap39736fa3-77): new Tun device (/org/freedesktop/NetworkManager/Devices/314)
Oct  8 12:40:59 np0005476733 kernel: tap39736fa3-77: entered promiscuous mode
Oct  8 12:40:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:59Z|00101|binding|INFO|Claiming lport 39736fa3-776f-4ab7-8d41-b774edd28a9f for this chassis.
Oct  8 12:40:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:59Z|00102|binding|INFO|39736fa3-776f-4ab7-8d41-b774edd28a9f: Claiming fa:16:3e:2b:51:5d 10.100.0.29
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.410 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:51:5d 10.100.0.29'], port_security=['fa:16:3e:2b:51:5d 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11840178-7e45-489c-af98-e0bcd5dab024, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=39736fa3-776f-4ab7-8d41-b774edd28a9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.412 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 39736fa3-776f-4ab7-8d41-b774edd28a9f in datapath aef581c1-ee97-4e74-a1f4-beb582e7a3d5 bound to our chassis#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.413 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aef581c1-ee97-4e74-a1f4-beb582e7a3d5#033[00m
Oct  8 12:40:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:59Z|00103|binding|INFO|Setting lport 39736fa3-776f-4ab7-8d41-b774edd28a9f ovn-installed in OVS
Oct  8 12:40:59 np0005476733 ovn_controller[263831]: 2025-10-08T16:40:59Z|00104|binding|INFO|Setting lport 39736fa3-776f-4ab7-8d41-b774edd28a9f up in Southbound
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:59 np0005476733 systemd-udevd[269321]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.435 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2d666b-d2c0-46f8-a8c8-f0bf93cedc0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 NetworkManager[51699]: <info>  [1759941659.4502] device (tap39736fa3-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:40:59 np0005476733 NetworkManager[51699]: <info>  [1759941659.4517] device (tap39736fa3-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:40:59 np0005476733 systemd-machined[152624]: New machine qemu-63-instance-00000066.
Oct  8 12:40:59 np0005476733 systemd[1]: Started Virtual Machine qemu-63-instance-00000066.
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.468 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9c2d2a-8c7f-4f1e-9df1-55c4aac8b857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.473 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[229cec1d-caa1-40bc-a94b-3c0798786cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.503 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7af4f80a-a466-4fc0-8332-29355f0a5d57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.526 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[92926a97-fc57-41d2-97bf-5074d37e9184]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef581c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:43:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 854106, 'reachable_time': 16216, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269329, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.545 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbdfc1b-acf9-406c-957a-2580d2c62407]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaef581c1-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 854117, 'tstamp': 854117}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269334, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapaef581c1-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 854120, 'tstamp': 854120}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269334, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.547 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef581c1-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:59 np0005476733 nova_compute[192580]: 2025-10-08 16:40:59.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.549 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaef581c1-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.550 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.550 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaef581c1-e0, col_values=(('external_ids', {'iface-id': '87673df0-8166-4107-b1eb-9c37811630b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:40:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:40:59.550 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.068 2 DEBUG nova.compute.manager [req-75fa84c7-11c5-4c7b-a22e-b5bd2d1f5789 req-c766aeb8-f090-4c01-997b-6bd76ef3ff0f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.069 2 DEBUG oslo_concurrency.lockutils [req-75fa84c7-11c5-4c7b-a22e-b5bd2d1f5789 req-c766aeb8-f090-4c01-997b-6bd76ef3ff0f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.069 2 DEBUG oslo_concurrency.lockutils [req-75fa84c7-11c5-4c7b-a22e-b5bd2d1f5789 req-c766aeb8-f090-4c01-997b-6bd76ef3ff0f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.069 2 DEBUG oslo_concurrency.lockutils [req-75fa84c7-11c5-4c7b-a22e-b5bd2d1f5789 req-c766aeb8-f090-4c01-997b-6bd76ef3ff0f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.069 2 DEBUG nova.compute.manager [req-75fa84c7-11c5-4c7b-a22e-b5bd2d1f5789 req-c766aeb8-f090-4c01-997b-6bd76ef3ff0f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Processing event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.323 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941660.3231418, 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.324 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] VM Started (Lifecycle Event)#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.331 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.341 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.353 2 DEBUG nova.network.neutron [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updated VIF entry in instance network info cache for port 39736fa3-776f-4ab7-8d41-b774edd28a9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.353 2 DEBUG nova.network.neutron [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updating instance_info_cache with network_info: [{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.355 2 INFO nova.virt.libvirt.driver [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Instance spawned successfully.#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.356 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.356 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.360 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.381 2 DEBUG oslo_concurrency.lockutils [req-fc586a6d-9046-4a6e-8227-34a80c262224 req-162edfd8-3630-4a2e-bdf6-c183187fe9fa 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.383 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.384 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941660.3232956, 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.384 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.388 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.388 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.388 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.389 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.389 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.390 2 DEBUG nova.virt.libvirt.driver [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.419 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.423 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941660.3412201, 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.424 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] VM Resumed (Lifecycle Event)
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.453 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.457 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.463 2 INFO nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Took 10.80 seconds to spawn the instance on the hypervisor.
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.464 2 DEBUG nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.479 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.527 2 INFO nova.compute.manager [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Took 11.29 seconds to build instance.
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.542 2 DEBUG oslo_concurrency.lockutils [None req-dc818a46-0c84-4bfe-b126-b06345c9968d de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:41:00 np0005476733 nova_compute[192580]: 2025-10-08 16:41:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:41:01 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:01Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:3a:92 10.100.0.21
Oct  8 12:41:01 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:01Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:3a:92 10.100.0.21
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.169 2 DEBUG nova.compute.manager [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.170 2 DEBUG oslo_concurrency.lockutils [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.170 2 DEBUG oslo_concurrency.lockutils [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.170 2 DEBUG oslo_concurrency.lockutils [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.170 2 DEBUG nova.compute.manager [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] No waiting events found dispatching network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.171 2 WARNING nova.compute.manager [req-8faa7b34-48fd-4197-9321-5383f00e5a66 req-d7fbd76f-e033-46a3-ae9a-639e54acfbf6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received unexpected event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f for instance with vm_state active and task_state None.
Oct  8 12:41:02 np0005476733 nova_compute[192580]: 2025-10-08 16:41:02.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:05 np0005476733 nova_compute[192580]: 2025-10-08 16:41:05.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:07 np0005476733 nova_compute[192580]: 2025-10-08 16:41:07.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:08 np0005476733 podman[269344]: 2025-10-08 16:41:08.238001481 +0000 UTC m=+0.058317130 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:10 np0005476733 podman[269363]: 2025-10-08 16:41:10.257971855 +0000 UTC m=+0.087693518 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.694 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.750 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.752 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.813 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.819 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.880 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.881 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:41:10 np0005476733 nova_compute[192580]: 2025-10-08 16:41:10.939 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.077 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.078 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12779MB free_disk=111.29270553588867GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.079 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.079 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.169 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.169 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.170 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.170 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.261 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.281 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.303 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 12:41:11 np0005476733 nova_compute[192580]: 2025-10-08 16:41:11.303 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:41:12 np0005476733 nova_compute[192580]: 2025-10-08 16:41:12.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:13 np0005476733 podman[269409]: 2025-10-08 16:41:13.244198 +0000 UTC m=+0.070215750 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS)
Oct  8 12:41:14 np0005476733 nova_compute[192580]: 2025-10-08 16:41:14.304 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:41:15 np0005476733 nova_compute[192580]: 2025-10-08 16:41:15.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:17 np0005476733 nova_compute[192580]: 2025-10-08 16:41:17.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:18 np0005476733 nova_compute[192580]: 2025-10-08 16:41:18.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:41:18 np0005476733 nova_compute[192580]: 2025-10-08 16:41:18.860 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:41:18 np0005476733 nova_compute[192580]: 2025-10-08 16:41:18.860 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  8 12:41:20 np0005476733 nova_compute[192580]: 2025-10-08 16:41:20.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:41:22 np0005476733 podman[269436]: 2025-10-08 16:41:22.235644211 +0000 UTC m=+0.057954480 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:41:22 np0005476733 podman[269435]: 2025-10-08 16:41:22.250569436 +0000 UTC m=+0.074743114 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:41:22 np0005476733 podman[269437]: 2025-10-08 16:41:22.258956954 +0000 UTC m=+0.074145785 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
managed_by=edpm_ansible, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:41:23 np0005476733 nova_compute[192580]: 2025-10-08 16:41:23.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:25 np0005476733 nova_compute[192580]: 2025-10-08 16:41:25.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:51:5d 10.100.0.29
Oct  8 12:41:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:25Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:51:5d 10.100.0.29
Oct  8 12:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:41:26.409 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:41:26.410 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:41:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:41:26.411 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:41:28 np0005476733 nova_compute[192580]: 2025-10-08 16:41:28.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:30 np0005476733 nova_compute[192580]: 2025-10-08 16:41:30.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:30 np0005476733 podman[269515]: 2025-10-08 16:41:30.231830795 +0000 UTC m=+0.051784712 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:41:30 np0005476733 podman[269514]: 2025-10-08 16:41:30.265983254 +0000 UTC m=+0.089279048 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:41:33 np0005476733 nova_compute[192580]: 2025-10-08 16:41:33.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:35 np0005476733 nova_compute[192580]: 2025-10-08 16:41:35.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:35 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:35Z|00105|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 12:41:37 np0005476733 nova_compute[192580]: 2025-10-08 16:41:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:38 np0005476733 nova_compute[192580]: 2025-10-08 16:41:38.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:39 np0005476733 podman[269568]: 2025-10-08 16:41:39.227446888 +0000 UTC m=+0.056188253 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 12:41:40 np0005476733 nova_compute[192580]: 2025-10-08 16:41:40.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:41 np0005476733 podman[269587]: 2025-10-08 16:41:41.239021573 +0000 UTC m=+0.072902995 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 12:41:41 np0005476733 nova_compute[192580]: 2025-10-08 16:41:41.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:43 np0005476733 nova_compute[192580]: 2025-10-08 16:41:43.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:44 np0005476733 podman[269613]: 2025-10-08 16:41:44.25290814 +0000 UTC m=+0.070342874 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 12:41:45 np0005476733 nova_compute[192580]: 2025-10-08 16:41:45.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:47 np0005476733 nova_compute[192580]: 2025-10-08 16:41:47.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:48 np0005476733 nova_compute[192580]: 2025-10-08 16:41:48.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:48 np0005476733 nova_compute[192580]: 2025-10-08 16:41:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:48 np0005476733 nova_compute[192580]: 2025-10-08 16:41:48.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:41:50 np0005476733 nova_compute[192580]: 2025-10-08 16:41:50.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:50Z|00106|pinctrl|WARN|Dropped 223 log messages in last 60 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 12:41:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:41:50Z|00107|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:41:53 np0005476733 nova_compute[192580]: 2025-10-08 16:41:53.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:53 np0005476733 podman[269682]: 2025-10-08 16:41:53.237945698 +0000 UTC m=+0.063850068 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:41:53 np0005476733 podman[269683]: 2025-10-08 16:41:53.246974925 +0000 UTC m=+0.064363753 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:41:53 np0005476733 podman[269684]: 2025-10-08 16:41:53.246907633 +0000 UTC m=+0.062136262 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:41:54 np0005476733 nova_compute[192580]: 2025-10-08 16:41:54.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:55 np0005476733 nova_compute[192580]: 2025-10-08 16:41:55.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:55 np0005476733 nova_compute[192580]: 2025-10-08 16:41:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.921 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.922 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.922 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:41:57 np0005476733 nova_compute[192580]: 2025-10-08 16:41:57.922 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:41:58 np0005476733 nova_compute[192580]: 2025-10-08 16:41:58.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:41:59 np0005476733 nova_compute[192580]: 2025-10-08 16:41:59.311 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:41:59 np0005476733 nova_compute[192580]: 2025-10-08 16:41:59.329 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:41:59 np0005476733 nova_compute[192580]: 2025-10-08 16:41:59.329 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:42:00 np0005476733 nova_compute[192580]: 2025-10-08 16:42:00.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:01 np0005476733 podman[269747]: 2025-10-08 16:42:01.221864001 +0000 UTC m=+0.054409166 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:42:01 np0005476733 podman[269748]: 2025-10-08 16:42:01.221920213 +0000 UTC m=+0.050462390 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:42:01 np0005476733 nova_compute[192580]: 2025-10-08 16:42:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:03 np0005476733 nova_compute[192580]: 2025-10-08 16:42:03.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:05 np0005476733 nova_compute[192580]: 2025-10-08 16:42:05.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:08 np0005476733 nova_compute[192580]: 2025-10-08 16:42:08.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:09.170 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:42:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:09.170 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:42:09 np0005476733 nova_compute[192580]: 2025-10-08 16:42:09.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:10 np0005476733 podman[269791]: 2025-10-08 16:42:10.227173213 +0000 UTC m=+0.057485803 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.712 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.773 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.774 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.828 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.835 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.890 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.891 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:42:10 np0005476733 nova_compute[192580]: 2025-10-08 16:42:10.949 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.127 2 DEBUG nova.compute.manager [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-changed-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.128 2 DEBUG nova.compute.manager [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Refreshing instance network info cache due to event network-changed-39736fa3-776f-4ab7-8d41-b774edd28a9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.128 2 DEBUG oslo_concurrency.lockutils [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.128 2 DEBUG oslo_concurrency.lockutils [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.129 2 DEBUG nova.network.neutron [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Refreshing network info cache for port 39736fa3-776f-4ab7-8d41-b774edd28a9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.137 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.138 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=11996MB free_disk=111.0108871459961GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.139 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.139 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.240 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.240 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.259 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.279 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.280 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.301 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.329 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.389 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.413 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.414 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.415 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:42:11 np0005476733 nova_compute[192580]: 2025-10-08 16:42:11.744 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:42:12 np0005476733 podman[269823]: 2025-10-08 16:42:12.262377762 +0000 UTC m=+0.091808839 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:42:12 np0005476733 nova_compute[192580]: 2025-10-08 16:42:12.535 2 DEBUG nova.network.neutron [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updated VIF entry in instance network info cache for port 39736fa3-776f-4ab7-8d41-b774edd28a9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:42:12 np0005476733 nova_compute[192580]: 2025-10-08 16:42:12.535 2 DEBUG nova.network.neutron [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updating instance_info_cache with network_info: [{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:42:12 np0005476733 nova_compute[192580]: 2025-10-08 16:42:12.560 2 DEBUG oslo_concurrency.lockutils [req-b9455c4a-2011-49dd-8c48-c22b965d10cf req-0f39ad0f-10bc-4cdb-ae50-64ea7e68fb84 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:42:13 np0005476733 nova_compute[192580]: 2025-10-08 16:42:13.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:15 np0005476733 nova_compute[192580]: 2025-10-08 16:42:15.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:15 np0005476733 podman[269849]: 2025-10-08 16:42:15.24554042 +0000 UTC m=+0.067985569 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 12:42:15 np0005476733 nova_compute[192580]: 2025-10-08 16:42:15.745 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:18 np0005476733 nova_compute[192580]: 2025-10-08 16:42:18.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:19.172 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:42:20 np0005476733 nova_compute[192580]: 2025-10-08 16:42:20.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:23 np0005476733 nova_compute[192580]: 2025-10-08 16:42:23.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:24 np0005476733 podman[269870]: 2025-10-08 16:42:24.218263704 +0000 UTC m=+0.047619860 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:42:24 np0005476733 podman[269871]: 2025-10-08 16:42:24.226219458 +0000 UTC m=+0.053907880 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Oct  8 12:42:24 np0005476733 podman[269869]: 2025-10-08 16:42:24.249841041 +0000 UTC m=+0.082512542 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:42:25 np0005476733 nova_compute[192580]: 2025-10-08 16:42:25.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:26.411 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:26.411 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:42:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:42:26.411 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.023 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.056 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.057 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.057 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.057 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.058 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.058 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.102 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:42:28 np0005476733 nova_compute[192580]: 2025-10-08 16:42:28.102 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:42:30 np0005476733 nova_compute[192580]: 2025-10-08 16:42:30.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:32 np0005476733 podman[269938]: 2025-10-08 16:42:32.254423142 +0000 UTC m=+0.065612073 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:42:32 np0005476733 podman[269937]: 2025-10-08 16:42:32.268114389 +0000 UTC m=+0.081809880 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:42:33 np0005476733 nova_compute[192580]: 2025-10-08 16:42:33.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:35 np0005476733 nova_compute[192580]: 2025-10-08 16:42:35.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.078 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'name': 'tempest-server-test-1528954241', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000065', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.084 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'name': 'tempest-server-test-1820634934', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000066', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4726c8b7a2a3405b9b2d689862918f5d', 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'hostId': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.113 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.requests volume: 800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.114 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.137 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.requests volume: 783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.138 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17f94038-0748-4429-a2bf-b603a677f50b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 800, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.085004', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb7d8120-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'fb642dc59c2e1711036a36baf3d5b60c31247d8c188cc978773c98a2ac7ab998'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.085004', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb7d989a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '2fd7ff02524abc91bd5ca69a27940ff1856def48a14de9c6d3d6b9d1942c907d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 783, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.085004', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb812b40-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': 'a8d4aef46a279ef76c2b19d21d64b14b2aca10363574d706c353cd1196c70b91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.085004', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8143aa-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '2809696fed827355330e95b4d210237f4cafc05590b50e787ae6900bc7526ae1'}]}, 'timestamp': '2025-10-08 16:42:36.138683', '_unique_id': 'db8351d941254d5a938a527d6a1e58e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.143 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.143 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-test-1820634934>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1820634934>]
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.148 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f5367a5f-b3ff-45c9-b61a-a87626cfcb03 / tapf66cbaf5-fd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.148 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.150 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 / tap39736fa3-77 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.151 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bd9cc83-1554-48b5-a685-cd13012c75f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.144247', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb82d814-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '42096963dbc3e9b09d141c2881d1dc59e29aa065ba0c7f0fe4c04bb6113b1491'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.144247', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb83389a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': 'ee6a81a9fb69fba1fda466dd1b21475a90fe1ccd3daa98d42ba663e3528b8560'}]}, 'timestamp': '2025-10-08 16:42:36.151345', '_unique_id': '4f130f7c94a542d2ae65d85810e0534c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.153 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.153 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ede7416-9b18-452b-be60-9af895f9dcb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.153542', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb839916-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': 'c2cc38816a97f6695c7a55aa7dc635f7373fa24f5742d20c5a4bc2aabb3639a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.153542', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb83a2b2-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': 'db4665147c526fdbb8c64b405f554954d5710113e8309fbe5e570755da0d706a'}]}, 'timestamp': '2025-10-08 16:42:36.154050', '_unique_id': '78385813a0be46f5a556ecf80428835a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.155 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.155 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1820634934>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1820634934>]
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.155 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d2d4266-592f-4f57-a4a5-a1c30cfcc5c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.155801', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb83f190-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '3c663c616d85396c7907c5da69c9ddc1d7e32683d61bbb494a40e631975a0078'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.155801', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb83fb04-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': 'a688bcf8b1a16afddf3a89c9865d89d86565431f3d7afefa308c89f482216814'}]}, 'timestamp': '2025-10-08 16:42:36.156367', '_unique_id': 'e4b95945a7694a84b6cef3058d9aa10c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.157 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.157 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2ad40ac-5591-4773-a2c6-ec1d7d7ae691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.157641', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb8438bc-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': 'ce99285495bd129b52f360ea6599ffbca89bd87a08bd8c219abf18de1c441698'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 
'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.157641', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb8440dc-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': 'f67d4b42c7818e613dd1dd319cfb9f46ad67b183857780f6c5885d664979973e'}]}, 'timestamp': '2025-10-08 16:42:36.158072', '_unique_id': '0e127863313e4cd99f769dde48803524'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.159 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.incoming.bytes volume: 263530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.159 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.incoming.bytes volume: 82140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd758a8d-2c27-4ac0-a04c-f0452629c378', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 263530, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.159197', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb847552-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': 'ec3e038d1426989aa02e0df1d4ba022d897a6ea6e0ae2644b86d695ed5b4c192'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 
82140, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.159197', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb847d86-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': '33f67181bea6b237a610ab8083986d0b331a1369c6bae139bf344624563d59e0'}]}, 'timestamp': '2025-10-08 16:42:36.159627', '_unique_id': '71ef601e772e4e85a70308a5525f40b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.174 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.usage volume: 169476096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.175 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.186 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.usage volume: 152502272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.186 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4c17aff-5202-4c07-84c1-ac82daa8f51c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169476096, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.160728', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb86d5c2-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': '51d0acc12e78c42e700846385d186d1702739e777de633999af8de434862d379'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 
'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.160728', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb86e33c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': 'ffcde9ed1e1f9ea871caf5cea923bcdd8a26aa4052be6ace7888b0182bcafae0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152502272, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.160728', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb889fd8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': '6db698f8a401127965700b21e2d6287bafe69f055e6288dd3672086e2c086c33'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.160728', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb88a852-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': '4672152d7b9c49fcf6c3e62a251f0d1861de0386e1e04f5a963741beb274c5a0'}]}, 'timestamp': '2025-10-08 16:42:36.186950', '_unique_id': 'e0d3a8e0c6684146bc92b0542ae03533'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.188 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.202 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/memory.usage volume: 303.23828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.218 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/memory.usage volume: 277.75390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c2b4490-8322-41d6-8ebf-41a67e748ec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 303.23828125, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'timestamp': '2025-10-08T16:42:36.188891', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'cb8b1e0c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.925502242, 'message_signature': 'e655e437dc29139400424687c8171241bf49f50fed26704c259047ae7b9fbc60'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 277.75390625, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'timestamp': '2025-10-08T16:42:36.188891', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'cb8d8b7e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.941401849, 'message_signature': '823ee09031245c2e8f3a88498c2e7014f0f69d77f50de9568977f2e64bb15469'}]}, 'timestamp': '2025-10-08 16:42:36.219046', '_unique_id': 'b79664a5cdde42bbbdf008991a3df602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.221 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.outgoing.bytes volume: 289434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.221 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.outgoing.bytes volume: 108594 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d169e07-92af-494f-a27d-2b4cb331deba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 289434, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.221212', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb8deca4-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '3d354db548a502459cb2c8473975695382d7f32696f6e3bc2b41204b2cf62c82'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 108594, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.221212', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb8df7d0-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': '66b098bf23ffb929bc8f2c4aa246ace02e7cb56debb2c40687fbaedafb5c57d4'}]}, 'timestamp': '2025-10-08 16:42:36.221773', '_unique_id': '4532fe73fd9444ad9a4e80b5b92fe4aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.222 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.223 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.223 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/cpu volume: 43900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.223 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/cpu volume: 44240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd03c3db7-a4e8-45e9-9247-a7f41a8565ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43900000000, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'timestamp': '2025-10-08T16:42:36.223133', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'cb8e3718-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.925502242, 'message_signature': 'adde13c72bf306f56493260bd899af710c7cfb57e2e2068875b395ee20462827'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 44240000000, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'timestamp': '2025-10-08T16:42:36.223133', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'cb8e3ede-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.941401849, 'message_signature': '9d5def030675eab82e602cd16bf076dddbb5597a2140d9992e8ab0aa91d94978'}]}, 'timestamp': '2025-10-08 16:42:36.223550', '_unique_id': '030c54262a204211b6195e039fd48cfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.224 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.225 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-test-1820634934>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1820634934>]
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.225 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.incoming.packets volume: 1198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.225 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.incoming.packets volume: 484 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94676db5-c269-4d43-8a68-1fdd32d098f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1198, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.225261', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb8e89fc-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '3e383e57cbc5db0994ffea4285b8d5c76e9a3b7b070510619ba35a55cbe5fea7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 484, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.225261', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb8e9348-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': '6d6493d9006ba56e16725d427a32795142079fd8e82b6092b0b84e99605ed260'}]}, 'timestamp': '2025-10-08 16:42:36.225717', '_unique_id': '7b3c39bdcbd347e4a362f463bed10d9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.226 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.227 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.227 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.227 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '983f647a-174d-4214-af88-f4fdf15cbe62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.226804', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8ec656-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': 'f95f5ba48c637a750e8e06385a47806d200c68f1dcade08fb6f5b7133bfd6a46'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 
'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.226804', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8ecea8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': '655fdbce87fe48a37831148c85295e76ad9b86e258f9c0d6a83e625fea35f8bd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.226804', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8ed628-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': '3405382c2602f4747e4b0310ffcc62369f065a2e2864a5da6083ed329941fcda'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.226804', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8edfba-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': 'ff49a0fb16749ed8547fa93c4dda51469d4b01f87b8600fa10362324b955e536'}]}, 'timestamp': '2025-10-08 16:42:36.227669', '_unique_id': 'e11194743ef649e2bd29006e89dc8c37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.228 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.229 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.allocation volume: 169873408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.229 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.229 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.229 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41f983ee-a5c2-4680-aaf6-b078f1826b60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 169873408, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.229067', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8f1f7a-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': 'a3c0efb109e90f7595d67bd3f266068441779c6d70a2d90b587721898addbfa0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 
'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.229067', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8f2704-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.8837082, 'message_signature': '5baa9a8933927cdc361ad7f67e8e9777088e046b35f7965d35ca4a98d46e9071'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.229067', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8f3064-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': '6da62d93654194c2e0cb696f3a09d9f82a53785de2a7cbb5fedbaecff851dba3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.229067', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8f3776-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.898310775, 'message_signature': '35c6aca5e161d03c70dcc8e1627669010a898ef23a50827e4f6fb10f5ccaac37'}]}, 'timestamp': '2025-10-08 16:42:36.229912', '_unique_id': '66d3564c37c24099baea5da8d4d6b3d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.230 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.231 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.latency volume: 7257521600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.231 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.latency volume: 85699375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.231 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.latency volume: 8192693260 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.231 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.latency volume: 35021024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bf2503f-5ad0-4760-90c8-03c8694ad8a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7257521600, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.231076', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8f6dcc-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '80846262d1a269fbcf90b588be5871266bd32edbd896882c4fed64b9f675a532'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 85699375, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.231076', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8f754c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '7bd24cfa538767ec9f801335d36e43fc6ffa1781e4d1089c2190c55982bbb491'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8192693260, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.231076', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8f7c72-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': 'a4bcca98f99a80bd2517b8bf428daa640f46f9ce47303a3c8a12c7a6500e8cf2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35021024, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.231076', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb8f8500-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '4542cf824d821477b4bfae7ae8ca1755f56b64cdb0af7c48aeca7a76c9a3a349'}]}, 'timestamp': '2025-10-08 16:42:36.231897', '_unique_id': 'ffe929052abd4c4bb1e5ce570ae8d07c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.232 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.233 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.outgoing.packets volume: 1306 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.233 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.outgoing.packets volume: 508 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee732293-1675-4283-9939-08b4a073284d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1306, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.233033', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb8fba34-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '0a4760a14ba251d904bd4c698f84de1c25ebe8df28a3c022e09e55644ab76f07'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 508, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.233033', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb8fc222-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': 'b3d4b3381893e14bc013b974856144e66f712d64c00cf5e6c2b0f354d4598c59'}]}, 'timestamp': '2025-10-08 16:42:36.233491', '_unique_id': '9db1438b0dce484888add6d953c621b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.bytes volume: 153245696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.234 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.235 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.bytes volume: 136521728 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.235 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155e5423-44e1-4e35-8fda-3d6b4a86ca4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 153245696, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.234715', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb8ffb8e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '4e61f22ce38d948278b3b54a704f4e045b886b19887a4219be29d585852e04fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.234715', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb900336-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'd2f99b08a5950104a02f2dd83113a720656ce0f672a8f99615687e3a086e109b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136521728, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.234715', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb900b38-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '5215df910b6e4f1f1132b537733e8f2c62f85d0e8596e7723a815b974e9aefdb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.234715', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb90125e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '37805646c1e05e08162543fc993c0d0ac47bfc23d63c7f97f7e4eb43d5f95ae8'}]}, 'timestamp': '2025-10-08 16:42:36.235514', '_unique_id': '796a7133b085438290dadda107ba2910'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.237 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.237 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '866e422d-6ba2-4a22-8ee3-26a45b7ac8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.237002', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb905624-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '4986afe05af75129aedf5fd0c69cf427410c972536f3f9d543ab4a08499134a4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.237002', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb906150-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': '57c018912d2f960b9dffd0d0b3ec986f97dd9a7c446f1bb059384946641bc404'}]}, 'timestamp': '2025-10-08 16:42:36.237585', '_unique_id': 'cfce6c5f1de24832bff2e976a7a16fe7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.239 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.latency volume: 9565874026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.239 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.239 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.latency volume: 11086682875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.239 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba8d19d2-7027-41e6-8877-480ae941b084', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9565874026, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.239002', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb90a49e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'abebd64c883547b22602c310e5c8402a7980b776cb314ea51eaa107ddef4f5c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 
'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.239002', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb90ae62-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'de36d0c3cf53672f93cde05d2bc30b9e05426c2db81fd1d9b8eada27cca263d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11086682875, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.239002', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb90b592-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': 'bbd184ef8670363a9df77a0c58e0f460d9fd89b839641b100964561232bad4df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.239002', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb90bc72-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': 'e04205699f9bd089fbb3b49db68bb8c4406404d38aec0adb4e4aca0b1e6f2106'}]}, 'timestamp': '2025-10-08 16:42:36.239879', '_unique_id': 'd7a701fe647e43b58625e660af3d56b1'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.241 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.bytes volume: 329610752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.241 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.241 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.bytes volume: 331687424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.241 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '929196ea-a31f-4872-b2be-37796ac034cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 329610752, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.241011', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb90f1b0-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '5c3c57927f28ce99cb3a9a3bce264aaa7cb58ca4035a606cf265849b531fa66c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 
'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.241011', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb90f930-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'fc5606cde3ffe028e60a5d28b0d457843550ce6f320307b3e3b9aacea4fc411b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331687424, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.241011', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb91022c-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '779365653e70b9d3fed16a476f191041ad006fece28f428b38a58a6a674e7f45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.241011', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb91093e-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '53755d3af3c274a645c1107f2dd8b48824991b0cafc885946bdfcf097c1f5599'}]}, 'timestamp': '2025-10-08 16:42:36.241850', '_unique_id': '5ef40e6020294d549ff8665156778a8b'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.243 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.243 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14d728d0-7056-4622-86b4-b244f1ccf3a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000065-f5367a5f-b3ff-45c9-b61a-a87626cfcb03-tapf66cbaf5-fd', 'timestamp': '2025-10-08T16:42:36.243004', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'tapf66cbaf5-fd', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:e7:3a:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf66cbaf5-fd'}, 'message_id': 'cb914566-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.867267596, 'message_signature': '3fca4a6e6b3e505fd32c2223a32e373180c58b471f0bb42a197480dc78d6105c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'instance-00000066-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-tap39736fa3-77', 'timestamp': '2025-10-08T16:42:36.243004', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'tap39736fa3-77', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:2b:51:5d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap39736fa3-77'}, 'message_id': 'cb914dfe-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.871915353, 'message_signature': '381b0212633b174a6d25d079bf3840141e55364933c5fb2cffe96ca37c279689'}]}, 'timestamp': '2025-10-08 16:42:36.243604', '_unique_id': 'dece3c85d3344aec82c297a588a85ed3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.244 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.requests volume: 11670 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.245 12 DEBUG ceilometer.compute.pollsters [-] f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.245 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.requests volume: 11706 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.245 12 DEBUG ceilometer.compute.pollsters [-] 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5d2eb58-06e6-4465-8d34-374abe4fe54f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11670, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-vda', 'timestamp': '2025-10-08T16:42:36.244960', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb918ce2-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': 'e56adae1f4463c1712dfed63bf17004d8ee41738a909ee2b33c02cfedb97abae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 
'project_name': None, 'resource_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03-sda', 'timestamp': '2025-10-08T16:42:36.244960', 'resource_metadata': {'display_name': 'tempest-server-test-1528954241', 'name': 'instance-00000065', 'instance_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb9197c8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.808035007, 'message_signature': '6a66880f0916f3cf2b96b979f5ce33a06645ae145fde1d5e1855b2aebc98b1b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11706, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-vda', 'timestamp': '2025-10-08T16:42:36.244960', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'cb91a2b8-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '9619a0a86460a05dac72cb16c43e63733530f7b210cd7428fdbd0f42f6bfa23a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'de0012a12c1645bfb620caa34110c3f4', 'user_name': None, 'project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'project_name': None, 'resource_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-sda', 'timestamp': '2025-10-08T16:42:36.244960', 'resource_metadata': {'display_name': 'tempest-server-test-1820634934', 'name': 'instance-00000066', 'instance_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'instance_type': 'custom_neutron_guest', 'host': '628365113ec8d9ddb93a73623d169dfb5832311dbf1687cd09842462', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'cb91aaf6-a465-11f0-9274-fa163ef67048', 'monotonic_time': 8659.837562748, 'message_signature': '5520562cf4160c4864af3ebc6387bbe021c47bc9e3ba4bd79ee0d9b7bb4e251f'}]}, 'timestamp': '2025-10-08 16:42:36.246018', '_unique_id': 'c3dbe606fa244879b15db8ab8ebdd709'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.247 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:42:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:42:36.247 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-test-1820634934>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-test-1820634934>]
Oct  8 12:42:38 np0005476733 nova_compute[192580]: 2025-10-08 16:42:38.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:40 np0005476733 nova_compute[192580]: 2025-10-08 16:42:40.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:40 np0005476733 ovn_controller[263831]: 2025-10-08T16:42:40Z|00108|memory_trim|INFO|Detected inactivity (last active 30021 ms ago): trimming memory
Oct  8 12:42:41 np0005476733 podman[269979]: 2025-10-08 16:42:41.239914813 +0000 UTC m=+0.055543692 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:42:42 np0005476733 nova_compute[192580]: 2025-10-08 16:42:42.624 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:43 np0005476733 nova_compute[192580]: 2025-10-08 16:42:43.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:43 np0005476733 podman[269999]: 2025-10-08 16:42:43.32337571 +0000 UTC m=+0.138755935 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:42:45 np0005476733 nova_compute[192580]: 2025-10-08 16:42:45.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:46 np0005476733 podman[270026]: 2025-10-08 16:42:46.240037807 +0000 UTC m=+0.063468634 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:42:48 np0005476733 nova_compute[192580]: 2025-10-08 16:42:48.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:48 np0005476733 nova_compute[192580]: 2025-10-08 16:42:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:48 np0005476733 nova_compute[192580]: 2025-10-08 16:42:48.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:42:49 np0005476733 nova_compute[192580]: 2025-10-08 16:42:49.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:50 np0005476733 nova_compute[192580]: 2025-10-08 16:42:50.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:42:50Z|00109|pinctrl|WARN|Dropped 127 log messages in last 60 seconds (most recently, 7 seconds ago) due to excessive rate
Oct  8 12:42:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:42:50Z|00110|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:42:53 np0005476733 nova_compute[192580]: 2025-10-08 16:42:53.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:54 np0005476733 nova_compute[192580]: 2025-10-08 16:42:54.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:55 np0005476733 nova_compute[192580]: 2025-10-08 16:42:55.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:55 np0005476733 podman[270049]: 2025-10-08 16:42:55.251116995 +0000 UTC m=+0.075991194 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:42:55 np0005476733 podman[270050]: 2025-10-08 16:42:55.279289763 +0000 UTC m=+0.086589253 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct  8 12:42:55 np0005476733 podman[270048]: 2025-10-08 16:42:55.294227789 +0000 UTC m=+0.114346477 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:42:56 np0005476733 nova_compute[192580]: 2025-10-08 16:42:56.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:58 np0005476733 nova_compute[192580]: 2025-10-08 16:42:58.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:42:59 np0005476733 nova_compute[192580]: 2025-10-08 16:42:59.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:42:59 np0005476733 nova_compute[192580]: 2025-10-08 16:42:59.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:42:59 np0005476733 nova_compute[192580]: 2025-10-08 16:42:59.947 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:42:59 np0005476733 nova_compute[192580]: 2025-10-08 16:42:59.947 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:42:59 np0005476733 nova_compute[192580]: 2025-10-08 16:42:59.948 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:43:00 np0005476733 nova_compute[192580]: 2025-10-08 16:43:00.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:01 np0005476733 nova_compute[192580]: 2025-10-08 16:43:01.272 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updating instance_info_cache with network_info: [{"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:43:01 np0005476733 nova_compute[192580]: 2025-10-08 16:43:01.290 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:43:01 np0005476733 nova_compute[192580]: 2025-10-08 16:43:01.290 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:43:03 np0005476733 nova_compute[192580]: 2025-10-08 16:43:03.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:03 np0005476733 podman[270116]: 2025-10-08 16:43:03.237978301 +0000 UTC m=+0.062021659 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:43:03 np0005476733 podman[270115]: 2025-10-08 16:43:03.242419993 +0000 UTC m=+0.071012355 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:43:03 np0005476733 nova_compute[192580]: 2025-10-08 16:43:03.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:05 np0005476733 nova_compute[192580]: 2025-10-08 16:43:05.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:08 np0005476733 nova_compute[192580]: 2025-10-08 16:43:08.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.632 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.634 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.714 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.769 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.770 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.828 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.836 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.888 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.889 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:43:10 np0005476733 nova_compute[192580]: 2025-10-08 16:43:10.940 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.104 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.106 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12019MB free_disk=111.0108871459961GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.106 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.106 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.249 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.250 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.250 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.250 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.487 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.505 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.506 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:43:11 np0005476733 nova_compute[192580]: 2025-10-08 16:43:11.507 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:12 np0005476733 podman[270173]: 2025-10-08 16:43:12.218448073 +0000 UTC m=+0.043758386 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:43:13 np0005476733 nova_compute[192580]: 2025-10-08 16:43:13.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:14 np0005476733 podman[270192]: 2025-10-08 16:43:14.262439062 +0000 UTC m=+0.087468761 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 12:43:15 np0005476733 nova_compute[192580]: 2025-10-08 16:43:15.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:17 np0005476733 podman[270218]: 2025-10-08 16:43:17.219904279 +0000 UTC m=+0.054870960 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 12:43:18 np0005476733 nova_compute[192580]: 2025-10-08 16:43:18.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:18 np0005476733 nova_compute[192580]: 2025-10-08 16:43:18.506 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:20 np0005476733 nova_compute[192580]: 2025-10-08 16:43:20.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:21 np0005476733 nova_compute[192580]: 2025-10-08 16:43:21.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:23 np0005476733 nova_compute[192580]: 2025-10-08 16:43:23.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:25 np0005476733 nova_compute[192580]: 2025-10-08 16:43:25.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:26 np0005476733 podman[270244]: 2025-10-08 16:43:26.234771447 +0000 UTC m=+0.052804005 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Oct  8 12:43:26 np0005476733 podman[270243]: 2025-10-08 16:43:26.259830596 +0000 UTC m=+0.080007722 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:43:26 np0005476733 podman[270242]: 2025-10-08 16:43:26.265864668 +0000 UTC m=+0.089494924 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:26.412 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:26.412 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:26.412 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:28 np0005476733 nova_compute[192580]: 2025-10-08 16:43:28.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:30 np0005476733 nova_compute[192580]: 2025-10-08 16:43:30.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:33 np0005476733 nova_compute[192580]: 2025-10-08 16:43:33.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:34 np0005476733 podman[270308]: 2025-10-08 16:43:34.216957953 +0000 UTC m=+0.049084556 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 12:43:34 np0005476733 podman[270309]: 2025-10-08 16:43:34.246906928 +0000 UTC m=+0.075319323 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:43:35 np0005476733 nova_compute[192580]: 2025-10-08 16:43:35.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.885 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.885 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.885 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.886 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.886 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.887 2 INFO nova.compute.manager [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Terminating instance#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.888 2 DEBUG nova.compute.manager [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:43:36 np0005476733 kernel: tap39736fa3-77 (unregistering): left promiscuous mode
Oct  8 12:43:36 np0005476733 NetworkManager[51699]: <info>  [1759941816.9177] device (tap39736fa3-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:36 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:36Z|00111|binding|INFO|Releasing lport 39736fa3-776f-4ab7-8d41-b774edd28a9f from this chassis (sb_readonly=0)
Oct  8 12:43:36 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:36Z|00112|binding|INFO|Setting lport 39736fa3-776f-4ab7-8d41-b774edd28a9f down in Southbound
Oct  8 12:43:36 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:36Z|00113|binding|INFO|Removing iface tap39736fa3-77 ovn-installed in OVS
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.934 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:51:5d 10.100.0.29'], port_security=['fa:16:3e:2b:51:5d 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '8bb9fbe7-3f99-4a38-b5b6-2943a5dead35', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11840178-7e45-489c-af98-e0bcd5dab024, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=39736fa3-776f-4ab7-8d41-b774edd28a9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.935 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 39736fa3-776f-4ab7-8d41-b774edd28a9f in datapath aef581c1-ee97-4e74-a1f4-beb582e7a3d5 unbound from our chassis#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.937 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aef581c1-ee97-4e74-a1f4-beb582e7a3d5#033[00m
Oct  8 12:43:36 np0005476733 nova_compute[192580]: 2025-10-08 16:43:36.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.953 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[87aaf490-032c-41fb-a0b8-21d3a6c67816]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.984 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b8ec8e-bf7f-412f-bc54-11fd32b9c46c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:36 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:36.987 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0ecc56-c787-474a-9d32-1b82a11c110a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:36 np0005476733 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000066.scope: Deactivated successfully.
Oct  8 12:43:36 np0005476733 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000066.scope: Consumed 48.720s CPU time.
Oct  8 12:43:36 np0005476733 systemd-machined[152624]: Machine qemu-63-instance-00000066 terminated.
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.014 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[11ead65c-27d9-4408-a305-a0f9de000db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.032 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[72b73930-29e0-459d-956a-6b7ce43ec805]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaef581c1-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:43:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 8, 'rx_bytes': 1084, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 8, 'rx_bytes': 1084, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 854106, 'reachable_time': 31421, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270367, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.049 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c14d745f-d057-4253-a916-1c96c1974c2c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaef581c1-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 854117, 'tstamp': 854117}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270368, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.18'], ['IFA_LOCAL', '10.100.0.18'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapaef581c1-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 854120, 'tstamp': 854120}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270368, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.050 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef581c1-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.056 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaef581c1-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.056 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.057 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaef581c1-e0, col_values=(('external_ids', {'iface-id': '87673df0-8166-4107-b1eb-9c37811630b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:37.057 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.154 2 INFO nova.virt.libvirt.driver [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Instance destroyed successfully.#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.155 2 DEBUG nova.objects.instance [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'resources' on Instance uuid 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.173 2 DEBUG nova.virt.libvirt.vif [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:40:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1820634934',display_name='tempest-server-test-1820634934',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1820634934',id=102,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:41:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-l22g7rkv',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:41:00Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=8bb9fbe7-3f99-4a38-b5b6-2943a5dead35,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.174 2 DEBUG nova.network.os_vif_util [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "address": "fa:16:3e:2b:51:5d", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39736fa3-77", "ovs_interfaceid": "39736fa3-776f-4ab7-8d41-b774edd28a9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.174 2 DEBUG nova.network.os_vif_util [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.175 2 DEBUG os_vif [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.176 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39736fa3-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.182 2 INFO os_vif [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2b:51:5d,bridge_name='br-int',has_traffic_filtering=True,id=39736fa3-776f-4ab7-8d41-b774edd28a9f,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39736fa3-77')#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.183 2 INFO nova.virt.libvirt.driver [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Deleting instance files /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35_del#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.184 2 INFO nova.virt.libvirt.driver [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Deletion of /var/lib/nova/instances/8bb9fbe7-3f99-4a38-b5b6-2943a5dead35_del complete#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.239 2 INFO nova.compute.manager [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.240 2 DEBUG oslo.service.loopingcall [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.240 2 DEBUG nova.compute.manager [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:43:37 np0005476733 nova_compute[192580]: 2025-10-08 16:43:37.240 2 DEBUG nova.network.neutron [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.188 2 DEBUG nova.compute.manager [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-unplugged-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.188 2 DEBUG oslo_concurrency.lockutils [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.188 2 DEBUG oslo_concurrency.lockutils [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.188 2 DEBUG oslo_concurrency.lockutils [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.188 2 DEBUG nova.compute.manager [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] No waiting events found dispatching network-vif-unplugged-39736fa3-776f-4ab7-8d41-b774edd28a9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.189 2 DEBUG nova.compute.manager [req-d6e73b7f-c376-4f52-9d50-a41d22da111b req-51d68b3b-4035-42e4-9e21-cca2d3c337fb 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-unplugged-39736fa3-776f-4ab7-8d41-b774edd28a9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:40.518 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:43:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:40.519 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.568 2 DEBUG nova.network.neutron [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.593 2 INFO nova.compute.manager [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Took 3.35 seconds to deallocate network for instance.#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.646 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.647 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.741 2 DEBUG nova.compute.provider_tree [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.755 2 DEBUG nova.scheduler.client.report [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.777 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.800 2 INFO nova.scheduler.client.report [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Deleted allocations for instance 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35#033[00m
Oct  8 12:43:40 np0005476733 nova_compute[192580]: 2025-10-08 16:43:40.854 2 DEBUG oslo_concurrency.lockutils [None req-6b203b60-bb00-4c51-9519-a2bb502d8957 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.279 2 DEBUG nova.compute.manager [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-deleted-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.280 2 DEBUG nova.compute.manager [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.280 2 DEBUG oslo_concurrency.lockutils [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.280 2 DEBUG oslo_concurrency.lockutils [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.281 2 DEBUG oslo_concurrency.lockutils [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "8bb9fbe7-3f99-4a38-b5b6-2943a5dead35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.281 2 DEBUG nova.compute.manager [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] No waiting events found dispatching network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.281 2 WARNING nova.compute.manager [req-88b24aea-7841-4e93-919f-c424830f9077 req-1aa39e2c-4bfc-4ebe-8a1d-03d24120f88b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Received unexpected event network-vif-plugged-39736fa3-776f-4ab7-8d41-b774edd28a9f for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:43:42 np0005476733 nova_compute[192580]: 2025-10-08 16:43:42.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:43 np0005476733 podman[270389]: 2025-10-08 16:43:43.227960266 +0000 UTC m=+0.055140039 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.383 2 DEBUG nova.compute.manager [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.383 2 DEBUG nova.compute.manager [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing instance network info cache due to event network-changed-f66cbaf5-fd00-4780-a065-fdcf7175d68c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.384 2 DEBUG oslo_concurrency.lockutils [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.384 2 DEBUG oslo_concurrency.lockutils [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.384 2 DEBUG nova.network.neutron [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Refreshing network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.404 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.405 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.405 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.406 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.406 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.408 2 INFO nova.compute.manager [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Terminating instance#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.410 2 DEBUG nova.compute.manager [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:43:44 np0005476733 kernel: tapf66cbaf5-fd (unregistering): left promiscuous mode
Oct  8 12:43:44 np0005476733 NetworkManager[51699]: <info>  [1759941824.4514] device (tapf66cbaf5-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:43:44 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:44Z|00114|binding|INFO|Releasing lport f66cbaf5-fd00-4780-a065-fdcf7175d68c from this chassis (sb_readonly=0)
Oct  8 12:43:44 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:44Z|00115|binding|INFO|Setting lport f66cbaf5-fd00-4780-a065-fdcf7175d68c down in Southbound
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:44Z|00116|binding|INFO|Removing iface tapf66cbaf5-fd ovn-installed in OVS
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.468 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:3a:92 10.100.0.21'], port_security=['fa:16:3e:e7:3a:92 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'f5367a5f-b3ff-45c9-b61a-a87626cfcb03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4726c8b7a2a3405b9b2d689862918f5d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ae65408-c1fc-4a23-acb9-ead1616a73f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11840178-7e45-489c-af98-e0bcd5dab024, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=f66cbaf5-fd00-4780-a065-fdcf7175d68c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.469 103739 INFO neutron.agent.ovn.metadata.agent [-] Port f66cbaf5-fd00-4780-a065-fdcf7175d68c in datapath aef581c1-ee97-4e74-a1f4-beb582e7a3d5 unbound from our chassis#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.470 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aef581c1-ee97-4e74-a1f4-beb582e7a3d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.471 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4f82d8bf-47dc-4a8c-abdb-2f9c0f115020]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.471 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5 namespace which is not needed anymore#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000065.scope: Deactivated successfully.
Oct  8 12:43:44 np0005476733 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000065.scope: Consumed 48.755s CPU time.
Oct  8 12:43:44 np0005476733 systemd-machined[152624]: Machine qemu-62-instance-00000065 terminated.
Oct  8 12:43:44 np0005476733 podman[270411]: 2025-10-08 16:43:44.572168981 +0000 UTC m=+0.087806101 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [NOTICE]   (269121) : haproxy version is 2.8.14-c23fe91
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [NOTICE]   (269121) : path to executable is /usr/sbin/haproxy
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [WARNING]  (269121) : Exiting Master process...
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [WARNING]  (269121) : Exiting Master process...
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [ALERT]    (269121) : Current worker (269123) exited with code 143 (Terminated)
Oct  8 12:43:44 np0005476733 neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5[269117]: [WARNING]  (269121) : All workers exited. Exiting... (0)
Oct  8 12:43:44 np0005476733 systemd[1]: libpod-e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3.scope: Deactivated successfully.
Oct  8 12:43:44 np0005476733 podman[270454]: 2025-10-08 16:43:44.613764077 +0000 UTC m=+0.044450718 container died e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3-userdata-shm.mount: Deactivated successfully.
Oct  8 12:43:44 np0005476733 systemd[1]: var-lib-containers-storage-overlay-eaf71acde083c0b8cff0903864aa7dc194679dbc44a20b8775712248efa6be5e-merged.mount: Deactivated successfully.
Oct  8 12:43:44 np0005476733 podman[270454]: 2025-10-08 16:43:44.658474863 +0000 UTC m=+0.089161484 container cleanup e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:43:44 np0005476733 systemd[1]: libpod-conmon-e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3.scope: Deactivated successfully.
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.678 2 INFO nova.virt.libvirt.driver [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Instance destroyed successfully.#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.679 2 DEBUG nova.objects.instance [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lazy-loading 'resources' on Instance uuid f5367a5f-b3ff-45c9-b61a-a87626cfcb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.701 2 DEBUG nova.virt.libvirt.vif [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:40:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1528954241',display_name='tempest-server-test-1528954241',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1528954241',id=101,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGgF0yKypuL1wJPOTwdtr+zzq0qw+uwurXu01O/Ym5uWgfd00pr3GN1raply3ByKVO5hmmkfvydhY0zQSvT9dZbNRj3hL8c6L+eBag20GVWlTRMyq8EfPEfzsuER1PS2LQ==',key_name='tempest-keypair-test-224344080',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:40:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4726c8b7a2a3405b9b2d689862918f5d',ramdisk_id='',reservation_id='r-86pbza36',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',ima
ge_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-GatewayMtuTestUdp-187807839',owner_user_name='tempest-GatewayMtuTestUdp-187807839-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:40:38Z,user_data=None,user_id='de0012a12c1645bfb620caa34110c3f4',uuid=f5367a5f-b3ff-45c9-b61a-a87626cfcb03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.702 2 DEBUG nova.network.os_vif_util [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converting VIF {"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.703 2 DEBUG nova.network.os_vif_util [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.703 2 DEBUG os_vif [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf66cbaf5-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.711 2 INFO os_vif [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:3a:92,bridge_name='br-int',has_traffic_filtering=True,id=f66cbaf5-fd00-4780-a065-fdcf7175d68c,network=Network(aef581c1-ee97-4e74-a1f4-beb582e7a3d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66cbaf5-fd')#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.711 2 INFO nova.virt.libvirt.driver [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Deleting instance files /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03_del#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.712 2 INFO nova.virt.libvirt.driver [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Deletion of /var/lib/nova/instances/f5367a5f-b3ff-45c9-b61a-a87626cfcb03_del complete#033[00m
Oct  8 12:43:44 np0005476733 podman[270501]: 2025-10-08 16:43:44.720384407 +0000 UTC m=+0.038539420 container remove e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.725 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[93e93d9f-25c5-4a5d-860a-8b285f7a69b5]: (4, ('Wed Oct  8 04:43:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5 (e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3)\ne66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3\nWed Oct  8 04:43:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5 (e66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3)\ne66aedaed696fdcc10ac52eaa613d6022301e5e2cb178990668cf309e78160c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.726 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1bf20f-cd7e-4b97-8271-51360b0006c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.727 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaef581c1-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 kernel: tapaef581c1-e0: left promiscuous mode
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.744 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eecff13b-2246-4097-8942-71b2ec0ae038]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.768 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[58fac008-4913-44a5-94b4-51a2d57efb2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.770 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[91a68901-7d8a-4340-9ba2-50844645194f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.788 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9a0f33-7b75-4499-8645-eb5d01bb7aaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 854099, 'reachable_time': 40479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270515, 'error': None, 'target': 'ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.791 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aef581c1-ee97-4e74-a1f4-beb582e7a3d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:43:44 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:44.791 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[924f9cb4-04fa-494c-960e-e8b33887bf6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:43:44 np0005476733 systemd[1]: run-netns-ovnmeta\x2daef581c1\x2dee97\x2d4e74\x2da1f4\x2dbeb582e7a3d5.mount: Deactivated successfully.
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.800 2 INFO nova.compute.manager [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.800 2 DEBUG oslo.service.loopingcall [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.801 2 DEBUG nova.compute.manager [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:43:44 np0005476733 nova_compute[192580]: 2025-10-08 16:43:44.801 2 DEBUG nova.network.neutron [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:43:45.521 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.585 2 DEBUG nova.network.neutron [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.617 2 INFO nova.compute.manager [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Took 0.82 seconds to deallocate network for instance.#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.658 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.658 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.717 2 DEBUG nova.compute.manager [req-ba04b8ec-248d-4c01-92d9-96f452b8d383 req-e09833ab-856c-420e-ba6a-2fcb5858b6b5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-vif-deleted-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.733 2 DEBUG nova.compute.provider_tree [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.753 2 DEBUG nova.scheduler.client.report [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.776 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.804 2 INFO nova.scheduler.client.report [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Deleted allocations for instance f5367a5f-b3ff-45c9-b61a-a87626cfcb03#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.900 2 DEBUG nova.network.neutron [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updated VIF entry in instance network info cache for port f66cbaf5-fd00-4780-a065-fdcf7175d68c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.901 2 DEBUG nova.network.neutron [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Updating instance_info_cache with network_info: [{"id": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "address": "fa:16:3e:e7:3a:92", "network": {"id": "aef581c1-ee97-4e74-a1f4-beb582e7a3d5", "bridge": "br-int", "label": "tempest-test-network--848614175", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4726c8b7a2a3405b9b2d689862918f5d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66cbaf5-fd", "ovs_interfaceid": "f66cbaf5-fd00-4780-a065-fdcf7175d68c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.922 2 DEBUG oslo_concurrency.lockutils [None req-7d7f3e21-d9b6-434c-aa10-64dcae990144 de0012a12c1645bfb620caa34110c3f4 4726c8b7a2a3405b9b2d689862918f5d - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:45 np0005476733 nova_compute[192580]: 2025-10-08 16:43:45.934 2 DEBUG oslo_concurrency.lockutils [req-f99f488c-b807-46cf-b9d7-fd722886bb14 req-61a2a305-bd7d-4798-81ab-db76ed24e919 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-f5367a5f-b3ff-45c9-b61a-a87626cfcb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.599 2 DEBUG nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-vif-unplugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.600 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.600 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.600 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.600 2 DEBUG nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] No waiting events found dispatching network-vif-unplugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.600 2 WARNING nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received unexpected event network-vif-unplugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.601 2 DEBUG nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.601 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.601 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.601 2 DEBUG oslo_concurrency.lockutils [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "f5367a5f-b3ff-45c9-b61a-a87626cfcb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.601 2 DEBUG nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] No waiting events found dispatching network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:43:46 np0005476733 nova_compute[192580]: 2025-10-08 16:43:46.602 2 WARNING nova.compute.manager [req-218e2958-8f3f-451f-acaa-5c44ef78de1a req-97283f4c-03ba-4c2f-8b37-b0aa1663140a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Received unexpected event network-vif-plugged-f66cbaf5-fd00-4780-a065-fdcf7175d68c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:43:48 np0005476733 podman[270519]: 2025-10-08 16:43:48.263902554 +0000 UTC m=+0.088574436 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:43:49 np0005476733 nova_compute[192580]: 2025-10-08 16:43:49.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:49 np0005476733 nova_compute[192580]: 2025-10-08 16:43:49.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:50 np0005476733 nova_compute[192580]: 2025-10-08 16:43:50.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:50 np0005476733 nova_compute[192580]: 2025-10-08 16:43:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:50 np0005476733 nova_compute[192580]: 2025-10-08 16:43:50.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:43:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:50Z|00117|pinctrl|WARN|Dropped 447 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:43:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:43:50Z|00118|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:43:52 np0005476733 nova_compute[192580]: 2025-10-08 16:43:52.153 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941817.1516619, 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:43:52 np0005476733 nova_compute[192580]: 2025-10-08 16:43:52.153 2 INFO nova.compute.manager [-] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:43:52 np0005476733 nova_compute[192580]: 2025-10-08 16:43:52.174 2 DEBUG nova.compute.manager [None req-b7eed4ab-e59f-43ef-abbd-b70f2885fdef - - - - - -] [instance: 8bb9fbe7-3f99-4a38-b5b6-2943a5dead35] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:43:54 np0005476733 nova_compute[192580]: 2025-10-08 16:43:54.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:55 np0005476733 nova_compute[192580]: 2025-10-08 16:43:55.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:56 np0005476733 nova_compute[192580]: 2025-10-08 16:43:56.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:57 np0005476733 podman[270546]: 2025-10-08 16:43:57.246391398 +0000 UTC m=+0.063936870 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:43:57 np0005476733 podman[270547]: 2025-10-08 16:43:57.247761752 +0000 UTC m=+0.062218045 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public)
Oct  8 12:43:57 np0005476733 podman[270545]: 2025-10-08 16:43:57.272341106 +0000 UTC m=+0.088503594 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 12:43:57 np0005476733 nova_compute[192580]: 2025-10-08 16:43:57.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:57 np0005476733 nova_compute[192580]: 2025-10-08 16:43:57.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:43:57 np0005476733 nova_compute[192580]: 2025-10-08 16:43:57.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:43:59 np0005476733 nova_compute[192580]: 2025-10-08 16:43:59.677 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759941824.676044, f5367a5f-b3ff-45c9-b61a-a87626cfcb03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:43:59 np0005476733 nova_compute[192580]: 2025-10-08 16:43:59.677 2 INFO nova.compute.manager [-] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:43:59 np0005476733 nova_compute[192580]: 2025-10-08 16:43:59.697 2 DEBUG nova.compute.manager [None req-6b4ee2ef-d64c-4157-b0bf-8442539c3dd0 - - - - - -] [instance: f5367a5f-b3ff-45c9-b61a-a87626cfcb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:43:59 np0005476733 nova_compute[192580]: 2025-10-08 16:43:59.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:00 np0005476733 nova_compute[192580]: 2025-10-08 16:44:00.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:00 np0005476733 nova_compute[192580]: 2025-10-08 16:44:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:00 np0005476733 nova_compute[192580]: 2025-10-08 16:44:00.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:44:00 np0005476733 nova_compute[192580]: 2025-10-08 16:44:00.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:44:00 np0005476733 nova_compute[192580]: 2025-10-08 16:44:00.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:44:04 np0005476733 nova_compute[192580]: 2025-10-08 16:44:04.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:04 np0005476733 nova_compute[192580]: 2025-10-08 16:44:04.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:05 np0005476733 podman[270610]: 2025-10-08 16:44:05.217311935 +0000 UTC m=+0.048641673 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:44:05 np0005476733 podman[270609]: 2025-10-08 16:44:05.233001975 +0000 UTC m=+0.064949873 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:44:05 np0005476733 nova_compute[192580]: 2025-10-08 16:44:05.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:09 np0005476733 nova_compute[192580]: 2025-10-08 16:44:09.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:10 np0005476733 nova_compute[192580]: 2025-10-08 16:44:10.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.633 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.634 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.788 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.789 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13645MB free_disk=111.31275177001953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.789 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.790 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.857 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.858 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.881 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.898 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.924 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:44:11 np0005476733 nova_compute[192580]: 2025-10-08 16:44:11.924 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:14 np0005476733 podman[270652]: 2025-10-08 16:44:14.220006483 +0000 UTC m=+0.047605859 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:44:14 np0005476733 nova_compute[192580]: 2025-10-08 16:44:14.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:15 np0005476733 podman[270671]: 2025-10-08 16:44:15.265444771 +0000 UTC m=+0.089431333 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:44:15 np0005476733 nova_compute[192580]: 2025-10-08 16:44:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:17 np0005476733 nova_compute[192580]: 2025-10-08 16:44:17.928 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:19 np0005476733 podman[270697]: 2025-10-08 16:44:19.230194149 +0000 UTC m=+0.064222500 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:44:19 np0005476733 nova_compute[192580]: 2025-10-08 16:44:19.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:20 np0005476733 nova_compute[192580]: 2025-10-08 16:44:20.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:21 np0005476733 systemd-logind[827]: New session 162 of user zuul.
Oct  8 12:44:21 np0005476733 systemd[1]: Started Session 162 of User zuul.
Oct  8 12:44:21 np0005476733 systemd[1]: session-162.scope: Deactivated successfully.
Oct  8 12:44:21 np0005476733 systemd-logind[827]: Session 162 logged out. Waiting for processes to exit.
Oct  8 12:44:21 np0005476733 systemd-logind[827]: Removed session 162.
Oct  8 12:44:24 np0005476733 nova_compute[192580]: 2025-10-08 16:44:24.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:25 np0005476733 nova_compute[192580]: 2025-10-08 16:44:25.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:26.413 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:26.413 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:26.413 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:28 np0005476733 podman[270746]: 2025-10-08 16:44:28.237108624 +0000 UTC m=+0.066402728 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:44:28 np0005476733 podman[270748]: 2025-10-08 16:44:28.237190457 +0000 UTC m=+0.062715501 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, vcs-type=git, maintainer=Red 
Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public)
Oct  8 12:44:28 np0005476733 podman[270747]: 2025-10-08 16:44:28.255736279 +0000 UTC m=+0.083839335 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:44:29 np0005476733 nova_compute[192580]: 2025-10-08 16:44:29.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:30 np0005476733 nova_compute[192580]: 2025-10-08 16:44:30.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:34 np0005476733 nova_compute[192580]: 2025-10-08 16:44:34.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:35 np0005476733 nova_compute[192580]: 2025-10-08 16:44:35.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.073 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:44:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:44:36 np0005476733 podman[270805]: 2025-10-08 16:44:36.217736803 +0000 UTC m=+0.046988230 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:44:36 np0005476733 podman[270804]: 2025-10-08 16:44:36.240738136 +0000 UTC m=+0.063794386 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid)
Oct  8 12:44:39 np0005476733 nova_compute[192580]: 2025-10-08 16:44:39.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:40 np0005476733 nova_compute[192580]: 2025-10-08 16:44:40.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.693 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.694 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.717 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.835 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.836 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.845 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:44:41 np0005476733 nova_compute[192580]: 2025-10-08 16:44:41.845 2 INFO nova.compute.claims [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.005 2 DEBUG nova.compute.provider_tree [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.052 2 DEBUG nova.scheduler.client.report [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.079 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.080 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.133 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.134 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.159 2 INFO nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.180 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.314 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.315 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.316 2 INFO nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Creating image(s)#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.316 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.316 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.317 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.329 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.408 2 DEBUG nova.policy [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.425 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.425 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.426 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.437 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.517 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.519 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.556 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk 10737418240" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.558 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.558 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.616 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.617 2 DEBUG nova.objects.instance [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'migration_context' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.685 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.685 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Ensure instance console log exists: /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.686 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.686 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:42 np0005476733 nova_compute[192580]: 2025-10-08 16:44:42.687 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:44 np0005476733 nova_compute[192580]: 2025-10-08 16:44:44.116 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Successfully created port: 7ad2d12a-572a-4586-93cf-9104c8937c76 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:44:44 np0005476733 nova_compute[192580]: 2025-10-08 16:44:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:44 np0005476733 nova_compute[192580]: 2025-10-08 16:44:44.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:45.002 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:44:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:45.006 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:44:45 np0005476733 podman[270860]: 2025-10-08 16:44:45.223512542 +0000 UTC m=+0.055460520 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.396 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Successfully updated port: 7ad2d12a-572a-4586-93cf-9104c8937c76 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.411 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.412 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.412 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.600 2 DEBUG nova.compute.manager [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-changed-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.601 2 DEBUG nova.compute.manager [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Refreshing instance network info cache due to event network-changed-7ad2d12a-572a-4586-93cf-9104c8937c76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.601 2 DEBUG oslo_concurrency.lockutils [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:44:45 np0005476733 nova_compute[192580]: 2025-10-08 16:44:45.957 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:44:46 np0005476733 podman[270880]: 2025-10-08 16:44:46.282206822 +0000 UTC m=+0.111379494 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.182 2 DEBUG nova.network.neutron [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.265 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.265 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Instance network_info: |[{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.266 2 DEBUG oslo_concurrency.lockutils [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.266 2 DEBUG nova.network.neutron [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Refreshing network info cache for port 7ad2d12a-572a-4586-93cf-9104c8937c76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.269 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Start _get_guest_xml network_info=[{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.273 2 WARNING nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.373 2 DEBUG nova.virt.libvirt.host [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.374 2 DEBUG nova.virt.libvirt.host [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.383 2 DEBUG nova.virt.libvirt.host [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.384 2 DEBUG nova.virt.libvirt.host [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.384 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.384 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.385 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.385 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.385 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.385 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.385 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.386 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.386 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.386 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.386 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.387 2 DEBUG nova.virt.hardware [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.390 2 DEBUG nova.virt.libvirt.vif [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',display_name='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-after-openvswitch-restart-1066522754',id=103,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-92xxo6uk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:44:42Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=952a7e26-0b63-4309-a370-27b9715e05cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.390 2 DEBUG nova.network.os_vif_util [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.391 2 DEBUG nova.network.os_vif_util [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.391 2 DEBUG nova.objects.instance [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'pci_devices' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.498 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <uuid>952a7e26-0b63-4309-a370-27b9715e05cb</uuid>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <name>instance-00000067</name>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_igmp_snooping_after_openvswitch_restart-1066522754</nova:name>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:44:47</nova:creationTime>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:user uuid="bee18afeaf16419c98219491d4757b96">tempest-MulticastTestIPv4Common-2020042253-project-member</nova:user>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:project uuid="ddcd45556b9d4077968eee95f005487d">tempest-MulticastTestIPv4Common-2020042253</nova:project>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        <nova:port uuid="7ad2d12a-572a-4586-93cf-9104c8937c76">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="serial">952a7e26-0b63-4309-a370-27b9715e05cb</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="uuid">952a7e26-0b63-4309-a370-27b9715e05cb</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.config"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:ff:23:9d"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <target dev="tap7ad2d12a-57"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/console.log" append="off"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:44:47 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:44:47 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:44:47 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:44:47 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.499 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Preparing to wait for external event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.500 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.500 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.500 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.501 2 DEBUG nova.virt.libvirt.vif [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',display_name='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-after-openvswitch-restart-1066522754',id=103,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-92xxo6uk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:44:42Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=952a7e26-0b63-4309-a370-27b9715e05cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.501 2 DEBUG nova.network.os_vif_util [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.501 2 DEBUG nova.network.os_vif_util [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.502 2 DEBUG os_vif [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.502 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.503 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ad2d12a-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.509 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ad2d12a-57, col_values=(('external_ids', {'iface-id': '7ad2d12a-572a-4586-93cf-9104c8937c76', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:23:9d', 'vm-uuid': '952a7e26-0b63-4309-a370-27b9715e05cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:47 np0005476733 NetworkManager[51699]: <info>  [1759941887.5119] manager: (tap7ad2d12a-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.519 2 INFO os_vif [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57')#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.598 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.598 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.598 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No VIF found with MAC fa:16:3e:ff:23:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:44:47 np0005476733 nova_compute[192580]: 2025-10-08 16:44:47.599 2 INFO nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Using config drive#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.149 2 INFO nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Creating config drive at /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.config#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.158 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_fz219i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.290 2 DEBUG oslo_concurrency.processutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_fz219i" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:44:48 np0005476733 kernel: tap7ad2d12a-57: entered promiscuous mode
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.3697] manager: (tap7ad2d12a-57): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Oct  8 12:44:48 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:48Z|00119|binding|INFO|Claiming lport 7ad2d12a-572a-4586-93cf-9104c8937c76 for this chassis.
Oct  8 12:44:48 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:48Z|00120|binding|INFO|7ad2d12a-572a-4586-93cf-9104c8937c76: Claiming fa:16:3e:ff:23:9d 10.100.0.14
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:48 np0005476733 systemd-udevd[270924]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.439 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:23:9d 10.100.0.14'], port_security=['fa:16:3e:ff:23:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4acb787-c359-4e53-a697-a921451e586f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ba209ed-1f02-454a-a2cd-88fedbd343f3, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=7ad2d12a-572a-4586-93cf-9104c8937c76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.442 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad2d12a-572a-4586-93cf-9104c8937c76 in datapath e4acb787-c359-4e53-a697-a921451e586f bound to our chassis#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.445 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e4acb787-c359-4e53-a697-a921451e586f#033[00m
Oct  8 12:44:48 np0005476733 systemd-machined[152624]: New machine qemu-64-instance-00000067.
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.4548] device (tap7ad2d12a-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.4559] device (tap7ad2d12a-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.459 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1142a2-9db8-438f-8f8b-230b515b6bf1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.461 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape4acb787-c1 in ovnmeta-e4acb787-c359-4e53-a697-a921451e586f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.463 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape4acb787-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.463 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8608d598-5e84-43ea-8401-f6b6e49b2af9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.464 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[11f16420-9cc8-45d2-9fe9-30cbae6f16f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.475 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[db8d5a71-2ff1-485a-b469-86a2431129b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.511 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d04306a5-3802-4e6d-89e4-db40d5fe8fa0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:48Z|00121|binding|INFO|Setting lport 7ad2d12a-572a-4586-93cf-9104c8937c76 ovn-installed in OVS
Oct  8 12:44:48 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:48Z|00122|binding|INFO|Setting lport 7ad2d12a-572a-4586-93cf-9104c8937c76 up in Southbound
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:48 np0005476733 systemd[1]: Started Virtual Machine qemu-64-instance-00000067.
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.537 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[80c6aa14-001e-4e9b-9870-0d89b146a318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 systemd-udevd[270928]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.5447] manager: (tape4acb787-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/317)
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.543 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[038e7b44-56d9-4888-91c6-2bae687005dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.577 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3e890ca9-31cd-4158-9ccc-ee38bc951237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.584 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ffbf18-24f2-475f-9b45-41b52e377480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.6094] device (tape4acb787-c0): carrier: link connected
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.613 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb6ae76-1f4f-409b-9807-d83a22132e03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.633 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a14a83-bbc4-4c7a-a14a-c21f36c83259]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape4acb787-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:2e:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879227, 'reachable_time': 22928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270957, 'error': None, 'target': 'ovnmeta-e4acb787-c359-4e53-a697-a921451e586f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.650 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9903c089-595a-4ab3-a843-612a0393f577]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe62:2e8c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 879227, 'tstamp': 879227}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 270959, 'error': None, 'target': 'ovnmeta-e4acb787-c359-4e53-a697-a921451e586f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.671 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb3aadd-d5fc-4710-8aa1-6e1da4941848]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape4acb787-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:62:2e:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879227, 'reachable_time': 22928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 270960, 'error': None, 'target': 'ovnmeta-e4acb787-c359-4e53-a697-a921451e586f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.710 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6d37ab-41ca-4081-be4c-f815c59467c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.794 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1bce344d-3a4a-42ec-a232-186d628803c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.795 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4acb787-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.795 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.796 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4acb787-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:48 np0005476733 kernel: tape4acb787-c0: entered promiscuous mode
Oct  8 12:44:48 np0005476733 NetworkManager[51699]: <info>  [1759941888.7990] manager: (tape4acb787-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.801 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape4acb787-c0, col_values=(('external_ids', {'iface-id': '0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:48 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:48Z|00123|binding|INFO|Releasing lport 0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa from this chassis (sb_readonly=0)
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.806 2 DEBUG nova.compute.manager [req-873df826-2979-48cf-8bbd-25a1687377ff req-3e355fae-a1fa-425e-9e46-44321f4f87a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.807 2 DEBUG oslo_concurrency.lockutils [req-873df826-2979-48cf-8bbd-25a1687377ff req-3e355fae-a1fa-425e-9e46-44321f4f87a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.808 2 DEBUG oslo_concurrency.lockutils [req-873df826-2979-48cf-8bbd-25a1687377ff req-3e355fae-a1fa-425e-9e46-44321f4f87a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.808 2 DEBUG oslo_concurrency.lockutils [req-873df826-2979-48cf-8bbd-25a1687377ff req-3e355fae-a1fa-425e-9e46-44321f4f87a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.809 2 DEBUG nova.compute.manager [req-873df826-2979-48cf-8bbd-25a1687377ff req-3e355fae-a1fa-425e-9e46-44321f4f87a9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Processing event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.827 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e4acb787-c359-4e53-a697-a921451e586f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e4acb787-c359-4e53-a697-a921451e586f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.829 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a49c0557-01ff-4cf4-a5c5-b3fcd03cdfa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:44:48 np0005476733 nova_compute[192580]: 2025-10-08 16:44:48.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.830 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-e4acb787-c359-4e53-a697-a921451e586f
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/e4acb787-c359-4e53-a697-a921451e586f.pid.haproxy
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID e4acb787-c359-4e53-a697-a921451e586f
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:44:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:48.830 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e4acb787-c359-4e53-a697-a921451e586f', 'env', 'PROCESS_TAG=haproxy-e4acb787-c359-4e53-a697-a921451e586f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e4acb787-c359-4e53-a697-a921451e586f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.039 2 DEBUG nova.network.neutron [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updated VIF entry in instance network info cache for port 7ad2d12a-572a-4586-93cf-9104c8937c76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.039 2 DEBUG nova.network.neutron [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.063 2 DEBUG oslo_concurrency.lockutils [req-fcc5f0fd-8f51-43b9-82f3-9b535ba7b997 req-4a02d753-0a5a-41c9-9480-6ffca2763ef6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:44:49 np0005476733 podman[270997]: 2025-10-08 16:44:49.23754085 +0000 UTC m=+0.061571654 container create af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:44:49 np0005476733 systemd[1]: Started libpod-conmon-af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb.scope.
Oct  8 12:44:49 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:44:49 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dc529a684be04286bdf196eb7acc2aedf492c95aed3773f06dc055a10a273a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:44:49 np0005476733 podman[270997]: 2025-10-08 16:44:49.204514758 +0000 UTC m=+0.028545582 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:44:49 np0005476733 podman[270997]: 2025-10-08 16:44:49.31248253 +0000 UTC m=+0.136513344 container init af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 12:44:49 np0005476733 podman[270997]: 2025-10-08 16:44:49.319869296 +0000 UTC m=+0.143900090 container start af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:44:49 np0005476733 podman[271010]: 2025-10-08 16:44:49.330024 +0000 UTC m=+0.053296951 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:44:49 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [NOTICE]   (271035) : New worker (271038) forked
Oct  8 12:44:49 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [NOTICE]   (271035) : Loading success.
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.391 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941889.3907475, 952a7e26-0b63-4309-a370-27b9715e05cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.391 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] VM Started (Lifecycle Event)#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.393 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.398 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.402 2 INFO nova.virt.libvirt.driver [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Instance spawned successfully.#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.402 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.428 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.432 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.449 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.450 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.450 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.451 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.451 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.452 2 DEBUG nova.virt.libvirt.driver [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.504 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.505 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941889.3908768, 952a7e26-0b63-4309-a370-27b9715e05cb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.505 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.542 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.549 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759941889.3982205, 952a7e26-0b63-4309-a370-27b9715e05cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.550 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.554 2 INFO nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Took 7.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.554 2 DEBUG nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.588 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.593 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.623 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.640 2 INFO nova.compute.manager [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Took 7.86 seconds to build instance.#033[00m
Oct  8 12:44:49 np0005476733 nova_compute[192580]: 2025-10-08 16:44:49.662 2 DEBUG oslo_concurrency.lockutils [None req-f965c8c2-ae40-47e6-8373-3691098ead55 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:50 np0005476733 nova_compute[192580]: 2025-10-08 16:44:50.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:50 np0005476733 nova_compute[192580]: 2025-10-08 16:44:50.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:50 np0005476733 nova_compute[192580]: 2025-10-08 16:44:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:50 np0005476733 nova_compute[192580]: 2025-10-08 16:44:50.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:44:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:50Z|00124|pinctrl|WARN|Dropped 479 log messages in last 60 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 12:44:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:44:50Z|00125|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.006 2 DEBUG nova.compute.manager [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.008 2 DEBUG oslo_concurrency.lockutils [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.008 2 DEBUG oslo_concurrency.lockutils [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.009 2 DEBUG oslo_concurrency.lockutils [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.009 2 DEBUG nova.compute.manager [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] No waiting events found dispatching network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:44:51 np0005476733 nova_compute[192580]: 2025-10-08 16:44:51.010 2 WARNING nova.compute.manager [req-85bceac6-2c59-4d16-bdc3-1ebb766cd3df req-f2366377-af7a-4c50-b38d-effcb76ffb72 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received unexpected event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:44:52 np0005476733 nova_compute[192580]: 2025-10-08 16:44:52.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:53 np0005476733 nova_compute[192580]: 2025-10-08 16:44:53.121 2 INFO nova.compute.manager [None req-7d49eff8-39be-48aa-a6dc-3aef7b515957 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:44:53 np0005476733 nova_compute[192580]: 2025-10-08 16:44:53.126 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:44:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:44:54.011 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:44:55 np0005476733 nova_compute[192580]: 2025-10-08 16:44:55.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:57 np0005476733 nova_compute[192580]: 2025-10-08 16:44:57.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:44:58 np0005476733 nova_compute[192580]: 2025-10-08 16:44:58.277 2 INFO nova.compute.manager [None req-37c1c9f3-e0e4-4a31-b433-6d87c4ed2b54 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:44:58 np0005476733 nova_compute[192580]: 2025-10-08 16:44:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:58 np0005476733 nova_compute[192580]: 2025-10-08 16:44:58.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:44:59 np0005476733 podman[271051]: 2025-10-08 16:44:59.224827137 +0000 UTC m=+0.054537881 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:44:59 np0005476733 podman[271050]: 2025-10-08 16:44:59.230230869 +0000 UTC m=+0.061999128 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:44:59 np0005476733 podman[271052]: 2025-10-08 16:44:59.249175723 +0000 UTC m=+0.070395626 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:45:00 np0005476733 nova_compute[192580]: 2025-10-08 16:45:00.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.990 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.990 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.991 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:45:01 np0005476733 nova_compute[192580]: 2025-10-08 16:45:01.991 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:45:02 np0005476733 nova_compute[192580]: 2025-10-08 16:45:02.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:03 np0005476733 nova_compute[192580]: 2025-10-08 16:45:03.462 2 INFO nova.compute.manager [None req-1b532cab-a68e-4490-8a1b-d6fee3047ace bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:45:03 np0005476733 nova_compute[192580]: 2025-10-08 16:45:03.466 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:45:04 np0005476733 nova_compute[192580]: 2025-10-08 16:45:04.008 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:45:04 np0005476733 nova_compute[192580]: 2025-10-08 16:45:04.033 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:45:04 np0005476733 nova_compute[192580]: 2025-10-08 16:45:04.034 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:45:05 np0005476733 nova_compute[192580]: 2025-10-08 16:45:05.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:06 np0005476733 nova_compute[192580]: 2025-10-08 16:45:06.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:07 np0005476733 podman[271122]: 2025-10-08 16:45:07.229439609 +0000 UTC m=+0.051610146 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:45:07 np0005476733 podman[271123]: 2025-10-08 16:45:07.245964776 +0000 UTC m=+0.053904220 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:45:07 np0005476733 nova_compute[192580]: 2025-10-08 16:45:07.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:08 np0005476733 nova_compute[192580]: 2025-10-08 16:45:08.672 2 INFO nova.compute.manager [None req-39373e22-6334-453b-8820-fa4abf604b9b bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:45:08 np0005476733 nova_compute[192580]: 2025-10-08 16:45:08.676 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:45:10 np0005476733 nova_compute[192580]: 2025-10-08 16:45:10.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:12 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:12Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:23:9d 10.100.0.14
Oct  8 12:45:12 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:12Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:23:9d 10.100.0.14
Oct  8 12:45:12 np0005476733 nova_compute[192580]: 2025-10-08 16:45:12.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.640 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.723 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.799 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.801 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.864 2 INFO nova.compute.manager [None req-472c926c-4ada-4b68-8130-5058b5544881 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.873 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:45:13 np0005476733 nova_compute[192580]: 2025-10-08 16:45:13.883 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.037 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.038 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13095MB free_disk=111.29528045654297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.038 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.038 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.114 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 952a7e26-0b63-4309-a370-27b9715e05cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.114 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.115 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.152 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.168 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.193 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:45:14 np0005476733 nova_compute[192580]: 2025-10-08 16:45:14.194 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:45:15 np0005476733 nova_compute[192580]: 2025-10-08 16:45:15.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:16 np0005476733 podman[271172]: 2025-10-08 16:45:16.226718106 +0000 UTC m=+0.050815752 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:45:17 np0005476733 podman[271192]: 2025-10-08 16:45:17.256315128 +0000 UTC m=+0.085175548 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:45:17 np0005476733 nova_compute[192580]: 2025-10-08 16:45:17.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:18 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:18Z|00126|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 12:45:19 np0005476733 nova_compute[192580]: 2025-10-08 16:45:19.046 2 INFO nova.compute.manager [None req-a6b69522-e7a1-43bd-a370-0a28f1299296 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:45:19 np0005476733 nova_compute[192580]: 2025-10-08 16:45:19.051 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:45:19 np0005476733 nova_compute[192580]: 2025-10-08 16:45:19.054 2 INFO nova.virt.libvirt.driver [None req-a6b69522-e7a1-43bd-a370-0a28f1299296 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Truncated console log returned, 3197 bytes ignored#033[00m
Oct  8 12:45:19 np0005476733 nova_compute[192580]: 2025-10-08 16:45:19.194 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:20 np0005476733 podman[271222]: 2025-10-08 16:45:20.234387422 +0000 UTC m=+0.064250820 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:45:20 np0005476733 nova_compute[192580]: 2025-10-08 16:45:20.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:22 np0005476733 nova_compute[192580]: 2025-10-08 16:45:22.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:24 np0005476733 nova_compute[192580]: 2025-10-08 16:45:24.277 2 INFO nova.compute.manager [None req-5ceb6dc7-c480-411b-946f-58c82690363c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Get console output#033[00m
Oct  8 12:45:24 np0005476733 nova_compute[192580]: 2025-10-08 16:45:24.287 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:45:24 np0005476733 nova_compute[192580]: 2025-10-08 16:45:24.294 2 INFO nova.virt.libvirt.driver [None req-5ceb6dc7-c480-411b-946f-58c82690363c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Truncated console log returned, 3423 bytes ignored#033[00m
Oct  8 12:45:25 np0005476733 nova_compute[192580]: 2025-10-08 16:45:25.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:25 np0005476733 nova_compute[192580]: 2025-10-08 16:45:25.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:26.414 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:26.415 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:45:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:26.416 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:45:27 np0005476733 NetworkManager[51699]: <info>  [1759941927.4334] manager: (patch-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Oct  8 12:45:27 np0005476733 NetworkManager[51699]: <info>  [1759941927.4348] manager: (patch-br-int-to-provnet-20e9b335-697c-41bf-8f62-8813cb01ba99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Oct  8 12:45:27 np0005476733 nova_compute[192580]: 2025-10-08 16:45:27.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:27 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:27Z|00127|binding|INFO|Releasing lport 0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa from this chassis (sb_readonly=0)
Oct  8 12:45:27 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:27Z|00128|binding|INFO|Releasing lport 0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa from this chassis (sb_readonly=0)
Oct  8 12:45:27 np0005476733 nova_compute[192580]: 2025-10-08 16:45:27.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:27 np0005476733 nova_compute[192580]: 2025-10-08 16:45:27.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:28 np0005476733 nova_compute[192580]: 2025-10-08 16:45:28.041 2 DEBUG nova.compute.manager [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-changed-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:45:28 np0005476733 nova_compute[192580]: 2025-10-08 16:45:28.042 2 DEBUG nova.compute.manager [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Refreshing instance network info cache due to event network-changed-7ad2d12a-572a-4586-93cf-9104c8937c76. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:45:28 np0005476733 nova_compute[192580]: 2025-10-08 16:45:28.042 2 DEBUG oslo_concurrency.lockutils [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:45:28 np0005476733 nova_compute[192580]: 2025-10-08 16:45:28.042 2 DEBUG oslo_concurrency.lockutils [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:45:28 np0005476733 nova_compute[192580]: 2025-10-08 16:45:28.043 2 DEBUG nova.network.neutron [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Refreshing network info cache for port 7ad2d12a-572a-4586-93cf-9104c8937c76 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:45:29 np0005476733 nova_compute[192580]: 2025-10-08 16:45:29.257 2 DEBUG nova.network.neutron [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updated VIF entry in instance network info cache for port 7ad2d12a-572a-4586-93cf-9104c8937c76. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:45:29 np0005476733 nova_compute[192580]: 2025-10-08 16:45:29.258 2 DEBUG nova.network.neutron [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:45:29 np0005476733 nova_compute[192580]: 2025-10-08 16:45:29.299 2 DEBUG oslo_concurrency.lockutils [req-58e78605-064e-4140-941d-4ce2dc9fe320 req-731b89b9-d523-475e-ba05-994b9f555cd6 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:45:30 np0005476733 podman[271246]: 2025-10-08 16:45:30.227872595 +0000 UTC m=+0.056595934 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:45:30 np0005476733 podman[271247]: 2025-10-08 16:45:30.229594661 +0000 UTC m=+0.054731637 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:45:30 np0005476733 podman[271248]: 2025-10-08 16:45:30.234694403 +0000 UTC m=+0.056887455 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:45:30 np0005476733 nova_compute[192580]: 2025-10-08 16:45:30.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:32.082 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:45:32 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:32.084 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:45:32 np0005476733 nova_compute[192580]: 2025-10-08 16:45:32.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:32 np0005476733 nova_compute[192580]: 2025-10-08 16:45:32.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:35 np0005476733 nova_compute[192580]: 2025-10-08 16:45:35.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:37 np0005476733 nova_compute[192580]: 2025-10-08 16:45:37.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:37 np0005476733 podman[271317]: 2025-10-08 16:45:37.674108142 +0000 UTC m=+0.057201335 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:45:37 np0005476733 podman[271316]: 2025-10-08 16:45:37.717429473 +0000 UTC m=+0.101954093 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:45:40 np0005476733 nova_compute[192580]: 2025-10-08 16:45:40.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:45:42.088 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:45:42 np0005476733 nova_compute[192580]: 2025-10-08 16:45:42.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:45 np0005476733 nova_compute[192580]: 2025-10-08 16:45:45.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:45 np0005476733 nova_compute[192580]: 2025-10-08 16:45:45.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:47 np0005476733 podman[271414]: 2025-10-08 16:45:47.225988073 +0000 UTC m=+0.053704033 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 12:45:47 np0005476733 nova_compute[192580]: 2025-10-08 16:45:47.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:48 np0005476733 podman[271435]: 2025-10-08 16:45:48.247965812 +0000 UTC m=+0.077906736 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:45:50 np0005476733 nova_compute[192580]: 2025-10-08 16:45:50.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:50 np0005476733 nova_compute[192580]: 2025-10-08 16:45:50.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:50 np0005476733 nova_compute[192580]: 2025-10-08 16:45:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:45:50 np0005476733 nova_compute[192580]: 2025-10-08 16:45:50.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:45:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:50Z|00129|pinctrl|WARN|Dropped 209 log messages in last 60 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 12:45:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:45:50Z|00130|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:45:51 np0005476733 podman[271461]: 2025-10-08 16:45:51.227368469 +0000 UTC m=+0.056753260 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Oct  8 12:45:52 np0005476733 nova_compute[192580]: 2025-10-08 16:45:52.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:55 np0005476733 nova_compute[192580]: 2025-10-08 16:45:55.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:57 np0005476733 nova_compute[192580]: 2025-10-08 16:45:57.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:45:59 np0005476733 nova_compute[192580]: 2025-10-08 16:45:59.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:00 np0005476733 nova_compute[192580]: 2025-10-08 16:46:00.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:00 np0005476733 nova_compute[192580]: 2025-10-08 16:46:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:01 np0005476733 podman[271484]: 2025-10-08 16:46:01.227210446 +0000 UTC m=+0.058832707 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:46:01 np0005476733 podman[271486]: 2025-10-08 16:46:01.231613717 +0000 UTC m=+0.054550191 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct  8 12:46:01 np0005476733 podman[271485]: 2025-10-08 16:46:01.262989248 +0000 UTC m=+0.088237746 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:46:02 np0005476733 ovn_controller[263831]: 2025-10-08T16:46:02Z|00131|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Oct  8 12:46:02 np0005476733 nova_compute[192580]: 2025-10-08 16:46:02.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.797 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.797 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.798 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:46:03 np0005476733 nova_compute[192580]: 2025-10-08 16:46:03.798 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:46:05 np0005476733 nova_compute[192580]: 2025-10-08 16:46:05.098 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:46:05 np0005476733 nova_compute[192580]: 2025-10-08 16:46:05.117 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:46:05 np0005476733 nova_compute[192580]: 2025-10-08 16:46:05.118 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:46:05 np0005476733 nova_compute[192580]: 2025-10-08 16:46:05.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:07 np0005476733 nova_compute[192580]: 2025-10-08 16:46:07.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:07 np0005476733 nova_compute[192580]: 2025-10-08 16:46:07.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:08 np0005476733 podman[271550]: 2025-10-08 16:46:08.249287516 +0000 UTC m=+0.063511576 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:46:08 np0005476733 podman[271549]: 2025-10-08 16:46:08.27041334 +0000 UTC m=+0.089418302 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:46:10 np0005476733 nova_compute[192580]: 2025-10-08 16:46:10.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:12.432 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:46:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:12.433 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:46:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:12.434 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:46:12 np0005476733 nova_compute[192580]: 2025-10-08 16:46:12.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:12 np0005476733 nova_compute[192580]: 2025-10-08 16:46:12.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.621 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.690 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.753 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.754 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.814 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.955 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.956 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12835MB free_disk=111.16978073120117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.956 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:46:13 np0005476733 nova_compute[192580]: 2025-10-08 16:46:13.956 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.084 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 952a7e26-0b63-4309-a370-27b9715e05cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.085 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.085 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.142 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.160 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.162 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:46:14 np0005476733 nova_compute[192580]: 2025-10-08 16:46:14.162 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:46:15 np0005476733 nova_compute[192580]: 2025-10-08 16:46:15.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:17 np0005476733 nova_compute[192580]: 2025-10-08 16:46:17.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:17 np0005476733 podman[271611]: 2025-10-08 16:46:17.813249044 +0000 UTC m=+0.081117098 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 12:46:19 np0005476733 podman[271635]: 2025-10-08 16:46:19.277594049 +0000 UTC m=+0.109542925 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:46:20 np0005476733 nova_compute[192580]: 2025-10-08 16:46:20.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:20 np0005476733 nova_compute[192580]: 2025-10-08 16:46:20.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:20 np0005476733 nova_compute[192580]: 2025-10-08 16:46:20.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:20 np0005476733 nova_compute[192580]: 2025-10-08 16:46:20.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:46:22 np0005476733 podman[271659]: 2025-10-08 16:46:22.23863507 +0000 UTC m=+0.064902341 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:46:22 np0005476733 nova_compute[192580]: 2025-10-08 16:46:22.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:25 np0005476733 nova_compute[192580]: 2025-10-08 16:46:25.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:26.416 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:26.417 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:46:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:46:26.417 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:46:27 np0005476733 nova_compute[192580]: 2025-10-08 16:46:27.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:30 np0005476733 nova_compute[192580]: 2025-10-08 16:46:30.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:32 np0005476733 podman[271680]: 2025-10-08 16:46:32.219546102 +0000 UTC m=+0.046143652 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:46:32 np0005476733 podman[271681]: 2025-10-08 16:46:32.22544889 +0000 UTC m=+0.048826068 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 12:46:32 np0005476733 podman[271679]: 2025-10-08 16:46:32.250905902 +0000 UTC m=+0.080685974 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 12:46:32 np0005476733 nova_compute[192580]: 2025-10-08 16:46:32.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:35 np0005476733 nova_compute[192580]: 2025-10-08 16:46:35.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.076 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000067', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ddcd45556b9d4077968eee95f005487d', 'user_id': 'bee18afeaf16419c98219491d4757b96', 'hostId': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.077 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.077 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>]
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.082 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 952a7e26-0b63-4309-a370-27b9715e05cb / tap7ad2d12a-57 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.083 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.incoming.packets volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f79e7cc4-658b-4358-9175-fb08f036d2d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 19, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.077885', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a860054-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': 'd489b9d63013bed3e1f206146a14718a5bbfcf758a256c79a560b57ca65d22ec'}]}, 'timestamp': '2025-10-08 16:46:36.083856', '_unique_id': 'cb114f51bb2a4d71902d43888320354e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.086 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a352b2f-af45-4d7c-a246-70e76af3a945', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.086333', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a8671e2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '3308be77427d11973db93f03fd67eefae07a34048874cc337ee717c17ce05966'}]}, 'timestamp': '2025-10-08 16:46:36.086704', '_unique_id': '892eea51415f408792b64b1005af0aaf'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.102 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.usage volume: 152567808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.103 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24982483-2915-47b9-94fe-83d31400de3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152567808, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.088479', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a88f52a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': 'cd6e8162bc05525ec279d382da56d0a7d52b674f39caf9925a6a3525c2e80cdc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.088479', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a8903da-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': '68b067faf773b2c1c0809ae0e60295178d344625c8892bcbbae0a4c1c944d80b'}]}, 'timestamp': '2025-10-08 16:46:36.103519', '_unique_id': '1d56f5059ee647eeb6b8691d4637756e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>]
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.127 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.latency volume: 6540084502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.127 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.latency volume: 73179687 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3743632-7477-4745-a78f-860ea89d2aaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6540084502, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.106126', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a8cb822-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '7cd977377aa53f773f169f4fef1097ab36648ab2e7e26ee7b3d2076cdcdba38a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 73179687, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.106126', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a8cc510-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': 'dd732324cf53bf80f7f255bc59a6386756284d24071d7001a67bf5a501ae23f0'}]}, 'timestamp': '2025-10-08 16:46:36.128153', '_unique_id': 'eb3ab31aacbe409b8157b21d9a075a9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.130 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.bytes volume: 135738880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.130 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '764223ff-3200-4e0a-8e07-86de8e5ce449', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135738880, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.130303', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a8d267c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '319ad4c3eca0fbf9f94579f99f705e3fddc8b5fd1a45eee378a0926e2f4d0eee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.130303', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a8d3194-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '2cdd91de5d6d4a7371e7a13d750d266cea2a27ba9161ac21ba787458d7ca21eb'}]}, 'timestamp': '2025-10-08 16:46:36.130889', '_unique_id': 'fbc746c2b1b54148aaaa33cb4a1a3d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.132 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75036ce4-7ea1-4558-b1e4-b4f6108bf079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.132613', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a8d8220-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': 'fcbd0f5c826feea5410813f97e348e951f9a7820e3eba0ceb25d9c1ef81543fd'}]}, 'timestamp': '2025-10-08 16:46:36.132998', '_unique_id': 'd8a4f8d996c84142890d3bd5a04f3e78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.134 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34a3766f-4cbf-40b4-8b10-10b6d4213994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.134844', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a8dd806-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '1d8530b9c86f678d9f0129276f1aad835e2e6e140c2867e74edb207ee06294dd'}]}, 'timestamp': '2025-10-08 16:46:36.135196', '_unique_id': 'ab4f5838c63942d9b535831848ee502b'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.136 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.136 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>]
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.137 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.outgoing.packets volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee1f4846-111a-4c24-959a-9fed19c40935', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 48, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.137187', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a8e338c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '17cad4a10ea31379655e2b0dfaf5083f0caabba7f2acb30725e630b25950bdca'}]}, 'timestamp': '2025-10-08 16:46:36.137522', '_unique_id': 'f87aa706eb8b44409472906b620db586'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.139 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.139 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e87fca0-2f8c-4e83-8667-90479f660030', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.139129', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a8e7f40-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': 'c1606ca98328f57e058afe805ba17242972287a979f2f9a9952fa3d8974d30be'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.139129', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a8e8a30-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': 'ebfdf97f9f37c078d80eb2c05be7609a79ad0b99709ef6879d6264039b7c2161'}]}, 'timestamp': '2025-10-08 16:46:36.139708', '_unique_id': '9d6876b5cc99483ea638851a2b801315'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.141 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.outgoing.bytes volume: 4844 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f05c54bc-3833-425d-bdb6-eeffa198f548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4844, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.141255', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a8ed1ca-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': 'e2261ad0d360cf90bd138e437309456f76cc366351e330b75ec3b79678240810'}]}, 'timestamp': '2025-10-08 16:46:36.141557', '_unique_id': '81f8f9658ad54dfd9d96b191b489bb10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.158 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/memory.usage volume: 227.75 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a329a656-76b0-4da0-94b3-15523520dadc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 227.75, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'timestamp': '2025-10-08T16:46:36.143047', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '5a918884-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.881627894, 'message_signature': '5e1d99c252b4a98b5b274b2cf1f49be3c6223a40be411bc587482f1dda8027c8'}]}, 'timestamp': '2025-10-08 16:46:36.159407', '_unique_id': 'bf559f9c720147d6a73e8a3fb0aaa414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.161 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.incoming.bytes volume: 2540 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '933bf45d-3f21-4ba9-8528-3c3564c7b504', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2540, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.161523', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a91eb12-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '203778480a0e2bb327679dca804e7d19dd445e0fc66c9b5c0ede1101212c51d8'}]}, 'timestamp': '2025-10-08 16:46:36.161889', '_unique_id': 'b649211dc4714892a91016686abfe2c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.163 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.164 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_after_openvswitch_restart-1066522754>]
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.164 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.requests volume: 11521 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.164 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9d2bd9f-0d1a-44e9-8a69-1cbd0b719d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11521, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.164417', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a925bb0-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': 'e5a79bf3f74ef0707b6c9e5d7cb552d794346caba87adf0163a28d227b5aa66a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.164417', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a926786-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '6d9bb3f78d8b5f0de5d47f757dab87477de2afc441608dfea793bea79d34e5f6'}]}, 'timestamp': '2025-10-08 16:46:36.165041', '_unique_id': '4e3ad73bc56247d88b316358f9ee49ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.166 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff21530d-9f95-4923-b0e4-38430fefe0e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.166864', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a92bb1e-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '1afcf67a1e52605eb95a4764669bf321baeb8b56b531b86d375e7b6cfbcf29d0'}]}, 'timestamp': '2025-10-08 16:46:36.167233', '_unique_id': 'e27bed6b7acd471a87973d30580ff476'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.167 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.168 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.169 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34b97164-7982-42a8-bc55-73ede07c4110', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.168889', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a9309ac-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '107870a5e957983c9ccf166618a479f2c236d71fd1dcf62126cd7fea19b0a2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.168889', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a9316c2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '918fe3e9a7dcc3e8cbc6ff1432f17542707264f33edafaa31cf606fdcc57b1ee'}]}, 'timestamp': '2025-10-08 16:46:36.169522', '_unique_id': '0779a4172d4748b8949fd9637c72fc2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.171 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98c5e9e1-d130-44c5-ae5b-e936f0a195f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.171389', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a936b90-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': '87d7a1850a7ffb93db6a8743b3e862854b1e51130113b03a1a2de9b993288a6c'}]}, 'timestamp': '2025-10-08 16:46:36.171716', '_unique_id': '4e8a054a298f413da4e7251eb79cb08b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.172 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.173 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.latency volume: 10022137604 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.173 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fc0b3e2-7489-4c2a-bb2b-c5d4a7063dd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10022137604, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.173326', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a93b726-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': 'c6fc831f0247a58a069398eb71ea1e5642a0c81bfc652f5e1a95c970d8e07365'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.173326', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a93c27a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '465a467d0708549c0b6dce3c27283d5edf7553052f57a593f7d54559103c2f27'}]}, 'timestamp': '2025-10-08 16:46:36.173920', '_unique_id': 'dae1cc478c484e7a8bb244eb1158dacc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.175 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.175 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebe741ad-a06d-47e0-b9a1-7751a6876835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.175583', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a940eba-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': '17670e9f7bded8b83de8cdad7d97e568af9a9203609b8100af8ed493e470b810'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.175583', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a941a04-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.811472377, 'message_signature': '3463baf6b46d7e6b47c0c13695e9eb6ec12c9edb7cf824a6441d8d1436627f74'}]}, 'timestamp': '2025-10-08 16:46:36.176185', '_unique_id': '21f86028405549f49cfa81b3fb85ebeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.176 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.177 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.177 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/cpu volume: 38840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f14820af-d25a-4e3a-b84e-c436e70dc8b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38840000000, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'timestamp': '2025-10-08T16:46:36.177670', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '5a946040-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.881627894, 'message_signature': '7e0237d704b204b1b92daa37ab1eaaba0668f675f14267910cb4f78d75a9f328'}]}, 'timestamp': '2025-10-08 16:46:36.177966', '_unique_id': 'e7aec0aa44c34b5fbcad3821324c626b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.178 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.179 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d00e2f9-b9ac-4f30-a80c-1f306170b8ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-00000067-952a7e26-0b63-4309-a370-27b9715e05cb-tap7ad2d12a-57', 'timestamp': '2025-10-08T16:46:36.179641', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'tap7ad2d12a-57', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:ff:23:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7ad2d12a-57'}, 'message_id': '5a94adf2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.800851808, 'message_signature': 'b756e1e558c78084843e6873bb6667316f677f33a6ec66435202eda801e01acb'}]}, 'timestamp': '2025-10-08 16:46:36.179969', '_unique_id': '24f26641084a4fe4b1dd745e36244c5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.181 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.requests volume: 720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.181 12 DEBUG ceilometer.compute.pollsters [-] 952a7e26-0b63-4309-a370-27b9715e05cb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a1311df-a6ab-45fb-9ea5-e348a38b94be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 720, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-vda', 'timestamp': '2025-10-08T16:46:36.181490', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '5a94f5c8-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': '7d0749f84d2636106bcb572575fdf07fabe48b31bd86577061135cd744871dd5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '952a7e26-0b63-4309-a370-27b9715e05cb-sda', 'timestamp': '2025-10-08T16:46:36.181490', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_after_openvswitch_restart-1066522754', 'name': 'instance-00000067', 'instance_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '5a950072-a466-11f0-9274-fa163ef67048', 'monotonic_time': 8899.82912164, 'message_signature': 'eda2422aa62d7c08a524dd9fa4404755bc597f3baa20509b655cf6f0f4391831'}]}, 'timestamp': '2025-10-08 16:46:36.182059', '_unique_id': 'db2f8390d73442a48f5941a96b2f1fd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:46:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:46:36.182 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:46:37 np0005476733 nova_compute[192580]: 2025-10-08 16:46:37.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:39 np0005476733 podman[271747]: 2025-10-08 16:46:39.224852408 +0000 UTC m=+0.053619081 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 12:46:39 np0005476733 podman[271748]: 2025-10-08 16:46:39.231206381 +0000 UTC m=+0.053421174 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:46:40 np0005476733 nova_compute[192580]: 2025-10-08 16:46:40.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:42 np0005476733 nova_compute[192580]: 2025-10-08 16:46:42.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:45 np0005476733 nova_compute[192580]: 2025-10-08 16:46:45.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:46 np0005476733 nova_compute[192580]: 2025-10-08 16:46:46.681 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:47 np0005476733 nova_compute[192580]: 2025-10-08 16:46:47.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:48 np0005476733 podman[271793]: 2025-10-08 16:46:48.224186651 +0000 UTC m=+0.050204452 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:46:48 np0005476733 nova_compute[192580]: 2025-10-08 16:46:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:50 np0005476733 podman[271812]: 2025-10-08 16:46:50.279886564 +0000 UTC m=+0.113870552 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 12:46:50 np0005476733 nova_compute[192580]: 2025-10-08 16:46:50.594 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:50 np0005476733 nova_compute[192580]: 2025-10-08 16:46:50.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:46:50Z|00132|pinctrl|WARN|Dropped 149 log messages in last 60 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:46:50 np0005476733 ovn_controller[263831]: 2025-10-08T16:46:50Z|00133|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:46:52 np0005476733 nova_compute[192580]: 2025-10-08 16:46:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:46:52 np0005476733 nova_compute[192580]: 2025-10-08 16:46:52.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:46:52 np0005476733 nova_compute[192580]: 2025-10-08 16:46:52.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:53 np0005476733 podman[271842]: 2025-10-08 16:46:53.275877801 +0000 UTC m=+0.100554358 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  8 12:46:55 np0005476733 nova_compute[192580]: 2025-10-08 16:46:55.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:46:57Z|00134|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 12:46:57 np0005476733 nova_compute[192580]: 2025-10-08 16:46:57.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:46:59 np0005476733 nova_compute[192580]: 2025-10-08 16:46:59.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:00 np0005476733 nova_compute[192580]: 2025-10-08 16:47:00.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:01 np0005476733 nova_compute[192580]: 2025-10-08 16:47:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:02 np0005476733 nova_compute[192580]: 2025-10-08 16:47:02.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:03 np0005476733 podman[271870]: 2025-10-08 16:47:03.243046356 +0000 UTC m=+0.059580361 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 12:47:03 np0005476733 podman[271869]: 2025-10-08 16:47:03.260013756 +0000 UTC m=+0.078337919 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:47:03 np0005476733 podman[271868]: 2025-10-08 16:47:03.266021658 +0000 UTC m=+0.083826034 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  8 12:47:03 np0005476733 nova_compute[192580]: 2025-10-08 16:47:03.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:03 np0005476733 nova_compute[192580]: 2025-10-08 16:47:03.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:47:03 np0005476733 nova_compute[192580]: 2025-10-08 16:47:03.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:47:04 np0005476733 nova_compute[192580]: 2025-10-08 16:47:04.109 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:47:04 np0005476733 nova_compute[192580]: 2025-10-08 16:47:04.109 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:47:04 np0005476733 nova_compute[192580]: 2025-10-08 16:47:04.109 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:47:04 np0005476733 nova_compute[192580]: 2025-10-08 16:47:04.109 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:47:05 np0005476733 nova_compute[192580]: 2025-10-08 16:47:05.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:05 np0005476733 nova_compute[192580]: 2025-10-08 16:47:05.738 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [{"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:47:05 np0005476733 nova_compute[192580]: 2025-10-08 16:47:05.753 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-952a7e26-0b63-4309-a370-27b9715e05cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:47:05 np0005476733 nova_compute[192580]: 2025-10-08 16:47:05.753 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:47:07 np0005476733 nova_compute[192580]: 2025-10-08 16:47:07.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:09 np0005476733 nova_compute[192580]: 2025-10-08 16:47:09.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:10 np0005476733 podman[271932]: 2025-10-08 16:47:10.238141976 +0000 UTC m=+0.062423372 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:47:10 np0005476733 podman[271933]: 2025-10-08 16:47:10.238371044 +0000 UTC m=+0.061705889 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:47:10 np0005476733 nova_compute[192580]: 2025-10-08 16:47:10.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:11 np0005476733 nova_compute[192580]: 2025-10-08 16:47:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:11 np0005476733 nova_compute[192580]: 2025-10-08 16:47:11.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:47:11 np0005476733 nova_compute[192580]: 2025-10-08 16:47:11.603 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:47:12 np0005476733 nova_compute[192580]: 2025-10-08 16:47:12.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.628 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.628 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.628 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.628 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.689 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.764 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.765 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.821 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.957 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.959 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12844MB free_disk=111.16978073120117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:47:13 np0005476733 nova_compute[192580]: 2025-10-08 16:47:13.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.045 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 952a7e26-0b63-4309-a370-27b9715e05cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.046 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.046 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.068 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.105 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.105 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.128 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.154 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.219 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.236 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.238 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:47:14 np0005476733 nova_compute[192580]: 2025-10-08 16:47:14.238 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:47:15 np0005476733 nova_compute[192580]: 2025-10-08 16:47:15.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:17 np0005476733 nova_compute[192580]: 2025-10-08 16:47:17.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:19 np0005476733 podman[271985]: 2025-10-08 16:47:19.228222012 +0000 UTC m=+0.058362352 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:47:20 np0005476733 nova_compute[192580]: 2025-10-08 16:47:20.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:21 np0005476733 podman[272006]: 2025-10-08 16:47:21.279072499 +0000 UTC m=+0.108490060 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:47:22 np0005476733 nova_compute[192580]: 2025-10-08 16:47:22.224 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:22 np0005476733 nova_compute[192580]: 2025-10-08 16:47:22.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:24 np0005476733 podman[272033]: 2025-10-08 16:47:24.220779984 +0000 UTC m=+0.053118255 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:47:24 np0005476733 systemd-logind[827]: New session 163 of user zuul.
Oct  8 12:47:24 np0005476733 systemd[1]: Started Session 163 of User zuul.
Oct  8 12:47:24 np0005476733 systemd[1]: Stopping Open vSwitch...
Oct  8 12:47:24 np0005476733 systemd[1]: openvswitch.service: Deactivated successfully.
Oct  8 12:47:24 np0005476733 systemd[1]: Stopped Open vSwitch.
Oct  8 12:47:24 np0005476733 systemd[1]: Stopping Open vSwitch Forwarding Unit...
Oct  8 12:47:24 np0005476733 ovs-vswitchd[49995]: ovs|00982|ofproto_dpif_rid|ERR|recirc_id 3032 left allocated when ofproto (br-int) is destructed
Oct  8 12:47:24 np0005476733 ovs-vswitchd[49995]: ovs|00983|ofproto_dpif_rid|ERR|recirc_id 3028 left allocated when ofproto (br-int) is destructed
Oct  8 12:47:24 np0005476733 ovs-vswitchd[49995]: ovs|00984|ofproto_dpif_rid|ERR|recirc_id 3029 left allocated when ofproto (br-int) is destructed
Oct  8 12:47:24 np0005476733 ovs-vswitchd[49995]: ovs|00985|ofproto_dpif_rid|ERR|recirc_id 3030 left allocated when ofproto (br-int) is destructed
Oct  8 12:47:24 np0005476733 ovs-vswitchd[49995]: ovs|00986|ofproto_dpif_rid|ERR|recirc_id 3031 left allocated when ofproto (br-int) is destructed
Oct  8 12:47:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:24Z|00004|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connection closed by peer
Oct  8 12:47:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:24Z|00018|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connection closed by peer
Oct  8 12:47:25 np0005476733 ovs-ctl[272084]: Exiting ovs-vswitchd (49995) [  OK  ]
Oct  8 12:47:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:25Z|00135|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connection closed by peer
Oct  8 12:47:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:25Z|00136|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connection closed by peer
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00005|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00006|rconn(ovn_statctrl3)|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00007|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 2 seconds before reconnect
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00019|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00137|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00020|rconn(ovn_pinctrl0)|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00138|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00021|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 2 seconds before reconnect
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00139|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 2 seconds before reconnect
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00140|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00141|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:26Z|00142|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 2 seconds before reconnect
Oct  8 12:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:26.417 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:26.417 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:47:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:26.418 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:47:27 np0005476733 systemd[1]: ovs-vswitchd.service: Deactivated successfully.
Oct  8 12:47:27 np0005476733 systemd[1]: Stopped Open vSwitch Forwarding Unit.
Oct  8 12:47:27 np0005476733 systemd[1]: ovs-vswitchd.service: Consumed 43.927s CPU time.
Oct  8 12:47:27 np0005476733 systemd[1]: Stopping Open vSwitch Database Unit...
Oct  8 12:47:27 np0005476733 NetworkManager[51699]: <warn>  [1759942047.1932] ovsdb: short read from ovsdb: Success
Oct  8 12:47:27 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:27Z|00143|jsonrpc|WARN|unix:/run/openvswitch/db.sock: receive error: Connection reset by peer
Oct  8 12:47:27 np0005476733 ovs-ctl[272109]: Exiting ovsdb-server (49913) [  OK  ]
Oct  8 12:47:27 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:27Z|00144|reconnect|WARN|unix:/run/openvswitch/db.sock: connection dropped (Connection reset by peer)
Oct  8 12:47:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:27.193 103739 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: send error: Connection reset by peer#033[00m
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.194 2 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: send error: Broken pipe#033[00m
Oct  8 12:47:27 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:27.195 103739 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connection dropped (Connection reset by peer)#033[00m
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.196 2 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connection dropped (Broken pipe)#033[00m
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering BACKOFF _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:47:27 np0005476733 systemd[1]: Starting NetworkManager Privileged Helper...
Oct  8 12:47:27 np0005476733 dbus-broker[817]: A security policy denied :1.314 to send method call /org/freedesktop/nm_priv_helper:org.freedesktop.nm_priv_helper.GetFD to org.freedesktop.nm_priv_helper.
Oct  8 12:47:27 np0005476733 systemd[1]: Started NetworkManager Privileged Helper.
Oct  8 12:47:27 np0005476733 NetworkManager[51699]: <info>  [1759942047.3332] ovsdb: disconnected from ovsdb
Oct  8 12:47:27 np0005476733 nova_compute[192580]: 2025-10-08 16:47:27.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00008|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00022|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00145|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00009|rconn(ovn_statctrl3)|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00010|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 4 seconds before reconnect
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00023|rconn(ovn_pinctrl0)|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00024|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 4 seconds before reconnect
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00146|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (Connection refused)
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00147|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 4 seconds before reconnect
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00148|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00149|reconnect|INFO|unix:/run/openvswitch/db.sock: connection attempt failed (Connection refused)
Oct  8 12:47:28 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:28Z|00150|reconnect|INFO|unix:/run/openvswitch/db.sock: waiting 2 seconds before reconnect
Oct  8 12:47:28 np0005476733 nova_compute[192580]: 2025-10-08 16:47:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:47:28 np0005476733 nova_compute[192580]: 2025-10-08 16:47:28.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:47:28 np0005476733 nova_compute[192580]: 2025-10-08 16:47:28.198 2 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connection attempt failed (Connection refused)#033[00m
Oct  8 12:47:28 np0005476733 nova_compute[192580]: 2025-10-08 16:47:28.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering BACKOFF _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:47:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:28.198 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  8 12:47:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:28.198 103739 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connection attempt failed (Connection refused)#033[00m
Oct  8 12:47:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:28.198 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: waiting 2 seconds before reconnect#033[00m
Oct  8 12:47:29 np0005476733 systemd[1]: ovsdb-server.service: Deactivated successfully.
Oct  8 12:47:29 np0005476733 systemd[1]: Stopped Open vSwitch Database Unit.
Oct  8 12:47:29 np0005476733 systemd[1]: ovsdb-server.service: Consumed 28.852s CPU time.
Oct  8 12:47:29 np0005476733 systemd[1]: Starting Open vSwitch Database Unit...
Oct  8 12:47:29 np0005476733 systemd[1]: nm-priv-helper.service: Deactivated successfully.
Oct  8 12:47:29 np0005476733 ovs-ctl[272147]: Starting ovsdb-server [  OK  ]
Oct  8 12:47:29 np0005476733 ovs-vsctl[272195]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  8 12:47:29 np0005476733 ovs-vsctl[272200]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ec52a299-e5bc-4227-a88e-e241833eebb2\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  8 12:47:29 np0005476733 ovs-ctl[272147]: Configuring Open vSwitch system IDs [  OK  ]
Oct  8 12:47:29 np0005476733 ovs-ctl[272147]: Enabling remote OVSDB managers [  OK  ]
Oct  8 12:47:29 np0005476733 systemd[1]: Started Open vSwitch Database Unit.
Oct  8 12:47:29 np0005476733 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  8 12:47:29 np0005476733 kernel: No such timeout policy "ovs_test_tp"
Oct  8 12:47:29 np0005476733 ovs-ctl[272208]: Starting ovs-vswitchd [  OK  ]
Oct  8 12:47:29 np0005476733 ovs-ctl[272208]: Enabling remote OVSDB managers [  OK  ]
Oct  8 12:47:29 np0005476733 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  8 12:47:29 np0005476733 ovs-vsctl[272267]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1.ctlplane.example.com
Oct  8 12:47:29 np0005476733 systemd[1]: Starting Open vSwitch...
Oct  8 12:47:29 np0005476733 systemd[1]: Finished Open vSwitch.
Oct  8 12:47:29 np0005476733 systemd-logind[827]: New session 164 of user zuul.
Oct  8 12:47:29 np0005476733 systemd[1]: Started Session 164 of User zuul.
Oct  8 12:47:29 np0005476733 systemd[1]: session-163.scope: Deactivated successfully.
Oct  8 12:47:29 np0005476733 systemd-logind[827]: Session 163 logged out. Waiting for processes to exit.
Oct  8 12:47:29 np0005476733 systemd-logind[827]: Removed session 163.
Oct  8 12:47:30 np0005476733 systemd[1]: session-164.scope: Deactivated successfully.
Oct  8 12:47:30 np0005476733 systemd-logind[827]: Session 164 logged out. Waiting for processes to exit.
Oct  8 12:47:30 np0005476733 systemd-logind[827]: Removed session 164.
Oct  8 12:47:30 np0005476733 systemd-logind[827]: New session 165 of user zuul.
Oct  8 12:47:30 np0005476733 systemd[1]: Started Session 165 of User zuul.
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00151|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00152|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 1999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:47:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:30.200 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  8 12:47:30 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:30.200 103739 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00153|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00154|jsonrpc|WARN|unix:/run/openvswitch/db.sock: receive error: Connection reset by peer
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00155|reconnect|WARN|unix:/run/openvswitch/db.sock: connection dropped (Connection reset by peer)
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00156|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:30 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:30Z|00157|binding|INFO|Releasing lport 0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa from this chassis (sb_readonly=0)
Oct  8 12:47:30 np0005476733 nova_compute[192580]: 2025-10-08 16:47:30.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:30 np0005476733 systemd[1]: session-165.scope: Deactivated successfully.
Oct  8 12:47:30 np0005476733 systemd-logind[827]: Session 165 logged out. Waiting for processes to exit.
Oct  8 12:47:30 np0005476733 systemd-logind[827]: Removed session 165.
Oct  8 12:47:31 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:31Z|00158|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 12:47:31 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:31Z|00159|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00025|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00160|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00161|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00162|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00026|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00163|binding|INFO|Releasing lport 0f63c3a5-0e7e-4b46-b5a7-733b076dd1fa from this chassis (sb_readonly=0)
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00011|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 12:47:32 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:32Z|00012|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 12:47:34 np0005476733 podman[272331]: 2025-10-08 16:47:34.064215383 +0000 UTC m=+0.068285398 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 12:47:34 np0005476733 podman[272333]: 2025-10-08 16:47:34.064272135 +0000 UTC m=+0.068390001 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Oct  8 12:47:34 np0005476733 podman[272332]: 2025-10-08 16:47:34.082929621 +0000 UTC m=+0.086820800 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:47:34 np0005476733 ovs-vsctl[272395]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1.ctlplane.example.com
Oct  8 12:47:34 np0005476733 nova_compute[192580]: 2025-10-08 16:47:34.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:35 np0005476733 nova_compute[192580]: 2025-10-08 16:47:35.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:39 np0005476733 nova_compute[192580]: 2025-10-08 16:47:39.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:40 np0005476733 nova_compute[192580]: 2025-10-08 16:47:40.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:40 np0005476733 podman[272417]: 2025-10-08 16:47:40.470140756 +0000 UTC m=+0.052664060 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 12:47:40 np0005476733 podman[272418]: 2025-10-08 16:47:40.476875501 +0000 UTC m=+0.055711227 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:47:44 np0005476733 nova_compute[192580]: 2025-10-08 16:47:44.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:45 np0005476733 nova_compute[192580]: 2025-10-08 16:47:45.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:46 np0005476733 nova_compute[192580]: 2025-10-08 16:47:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:49 np0005476733 nova_compute[192580]: 2025-10-08 16:47:49.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:50 np0005476733 nova_compute[192580]: 2025-10-08 16:47:50.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:50 np0005476733 podman[272469]: 2025-10-08 16:47:50.237922004 +0000 UTC m=+0.067947998 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 12:47:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:51Z|00164|pinctrl|WARN|Dropped 221 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 12:47:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:51Z|00165|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:47:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:51.266 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:47:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:51.266 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:47:51 np0005476733 nova_compute[192580]: 2025-10-08 16:47:51.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:52 np0005476733 podman[272488]: 2025-10-08 16:47:52.265396517 +0000 UTC m=+0.098926456 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:47:52 np0005476733 nova_compute[192580]: 2025-10-08 16:47:52.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:53 np0005476733 nova_compute[192580]: 2025-10-08 16:47:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:47:53 np0005476733 nova_compute[192580]: 2025-10-08 16:47:53.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:47:54 np0005476733 podman[272516]: 2025-10-08 16:47:54.403774146 +0000 UTC m=+0.061831153 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 12:47:54 np0005476733 nova_compute[192580]: 2025-10-08 16:47:54.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.783 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.783 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.784 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.784 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.784 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.785 2 INFO nova.compute.manager [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Terminating instance#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.786 2 DEBUG nova.compute.manager [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:47:55 np0005476733 kernel: tap7ad2d12a-57 (unregistering): left promiscuous mode
Oct  8 12:47:55 np0005476733 NetworkManager[51699]: <info>  [1759942075.8517] device (tap7ad2d12a-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:47:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:55Z|00166|binding|INFO|Releasing lport 7ad2d12a-572a-4586-93cf-9104c8937c76 from this chassis (sb_readonly=0)
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:55Z|00167|if_status|WARN|Trying to release unknown interface 7ad2d12a-572a-4586-93cf-9104c8937c76
Oct  8 12:47:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:55Z|00168|binding|INFO|Setting lport 7ad2d12a-572a-4586-93cf-9104c8937c76 down in Southbound
Oct  8 12:47:55 np0005476733 ovn_controller[263831]: 2025-10-08T16:47:55Z|00169|binding|INFO|Removing iface tap7ad2d12a-57 ovn-installed in OVS
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:55.870 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:23:9d 10.100.0.14'], port_security=['fa:16:3e:ff:23:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '952a7e26-0b63-4309-a370-27b9715e05cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4acb787-c359-4e53-a697-a921451e586f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ba209ed-1f02-454a-a2cd-88fedbd343f3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=7ad2d12a-572a-4586-93cf-9104c8937c76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:47:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:55.872 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad2d12a-572a-4586-93cf-9104c8937c76 in datapath e4acb787-c359-4e53-a697-a921451e586f unbound from our chassis#033[00m
Oct  8 12:47:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:55.873 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e4acb787-c359-4e53-a697-a921451e586f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:47:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:55.874 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed85b86-346c-4e8c-b317-47448b22fec0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:47:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:55.874 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e4acb787-c359-4e53-a697-a921451e586f namespace which is not needed anymore#033[00m
Oct  8 12:47:55 np0005476733 nova_compute[192580]: 2025-10-08 16:47:55.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:55 np0005476733 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000067.scope: Deactivated successfully.
Oct  8 12:47:55 np0005476733 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000067.scope: Consumed 43.662s CPU time.
Oct  8 12:47:55 np0005476733 systemd-machined[152624]: Machine qemu-64-instance-00000067 terminated.
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.062 2 INFO nova.virt.libvirt.driver [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Instance destroyed successfully.#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.063 2 DEBUG nova.objects.instance [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'resources' on Instance uuid 952a7e26-0b63-4309-a370-27b9715e05cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.084 2 DEBUG nova.virt.libvirt.vif [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:44:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',display_name='tempest-test_igmp_snooping_after_openvswitch_restart-1066522754',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-after-openvswitch-restart-1066522754',id=103,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:44:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-92xxo6uk',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:44:49Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=952a7e26-0b63-4309-a370-27b9715e05cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.084 2 DEBUG nova.network.os_vif_util [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "7ad2d12a-572a-4586-93cf-9104c8937c76", "address": "fa:16:3e:ff:23:9d", "network": {"id": "e4acb787-c359-4e53-a697-a921451e586f", "bridge": "br-int", "label": "tempest-test-network--1114541047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ddcd45556b9d4077968eee95f005487d", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ad2d12a-57", "ovs_interfaceid": "7ad2d12a-572a-4586-93cf-9104c8937c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.085 2 DEBUG nova.network.os_vif_util [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.085 2 DEBUG os_vif [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.087 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ad2d12a-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.093 2 INFO os_vif [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:9d,bridge_name='br-int',has_traffic_filtering=True,id=7ad2d12a-572a-4586-93cf-9104c8937c76,network=Network(e4acb787-c359-4e53-a697-a921451e586f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ad2d12a-57')#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.094 2 INFO nova.virt.libvirt.driver [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Deleting instance files /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb_del#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.094 2 INFO nova.virt.libvirt.driver [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Deletion of /var/lib/nova/instances/952a7e26-0b63-4309-a370-27b9715e05cb_del complete#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.154 2 INFO nova.compute.manager [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.155 2 DEBUG oslo.service.loopingcall [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.155 2 DEBUG nova.compute.manager [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:47:56 np0005476733 nova_compute[192580]: 2025-10-08 16:47:56.155 2 DEBUG nova.network.neutron [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:47:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:56.268 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:47:56 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [NOTICE]   (271035) : haproxy version is 2.8.14-c23fe91
Oct  8 12:47:56 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [NOTICE]   (271035) : path to executable is /usr/sbin/haproxy
Oct  8 12:47:56 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [WARNING]  (271035) : Exiting Master process...
Oct  8 12:47:56 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [ALERT]    (271035) : Current worker (271038) exited with code 143 (Terminated)
Oct  8 12:47:56 np0005476733 neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f[271018]: [WARNING]  (271035) : All workers exited. Exiting... (0)
Oct  8 12:47:56 np0005476733 systemd[1]: libpod-af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb.scope: Deactivated successfully.
Oct  8 12:47:56 np0005476733 conmon[271018]: conmon af3c891bf04bf58f538d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb.scope/container/memory.events
Oct  8 12:47:56 np0005476733 podman[272561]: 2025-10-08 16:47:56.346079502 +0000 UTC m=+0.385345698 container died af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:47:56 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb-userdata-shm.mount: Deactivated successfully.
Oct  8 12:47:56 np0005476733 systemd[1]: var-lib-containers-storage-overlay-3dc529a684be04286bdf196eb7acc2aedf492c95aed3773f06dc055a10a273a6-merged.mount: Deactivated successfully.
Oct  8 12:47:57 np0005476733 podman[272561]: 2025-10-08 16:47:57.644435765 +0000 UTC m=+1.683701981 container cleanup af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:47:57 np0005476733 systemd[1]: libpod-conmon-af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb.scope: Deactivated successfully.
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.782 2 DEBUG nova.compute.manager [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-unplugged-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.783 2 DEBUG oslo_concurrency.lockutils [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.783 2 DEBUG oslo_concurrency.lockutils [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.784 2 DEBUG oslo_concurrency.lockutils [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.785 2 DEBUG nova.compute.manager [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] No waiting events found dispatching network-vif-unplugged-7ad2d12a-572a-4586-93cf-9104c8937c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 12:47:57 np0005476733 nova_compute[192580]: 2025-10-08 16:47:57.785 2 DEBUG nova.compute.manager [req-a276a73d-bd3e-4222-8488-386b3b0a9216 req-da2bc72d-788d-4fa8-a229-ecda42fe73ad 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-unplugged-7ad2d12a-572a-4586-93cf-9104c8937c76 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  8 12:47:57 np0005476733 podman[272609]: 2025-10-08 16:47:57.941683213 +0000 UTC m=+0.273831183 container remove af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:47:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:57.948 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0afd365d-dbbe-47eb-9288-e8ed62cf07f7]: (4, ('Wed Oct  8 04:47:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f (af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb)\naf3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb\nWed Oct  8 04:47:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e4acb787-c359-4e53-a697-a921451e586f (af3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb)\naf3c891bf04bf58f538d1dde13667292d003dde741ef694f771b86d2fe93dfeb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:57.950 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ff3c1d-c9a7-4e64-87ad-246ccdc2211f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:57.952 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4acb787-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:47:58 np0005476733 nova_compute[192580]: 2025-10-08 16:47:58.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:47:58 np0005476733 kernel: tape4acb787-c0: left promiscuous mode
Oct  8 12:47:58 np0005476733 nova_compute[192580]: 2025-10-08 16:47:58.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.029 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe33571-0680-41c4-b5ef-94ce34625646]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.064 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[17233f9a-db3d-4fce-b06f-689cb1926663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.065 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[903c1ff4-1739-47d0-8a18-61a09eefece5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.083 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2b6346-8964-433e-99ee-d0845c2a30cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879219, 'reachable_time': 21303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272627, 'error': None, 'target': 'ovnmeta-e4acb787-c359-4e53-a697-a921451e586f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:58 np0005476733 systemd[1]: run-netns-ovnmeta\x2de4acb787\x2dc359\x2d4e53\x2da697\x2da921451e586f.mount: Deactivated successfully.
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.087 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e4acb787-c359-4e53-a697-a921451e586f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct  8 12:47:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:47:58.087 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[f3071a2a-adbd-4588-901e-5ad667898107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.542 2 DEBUG nova.network.neutron [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.576 2 INFO nova.compute.manager [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Took 3.42 seconds to deallocate network for instance.
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.629 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.630 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.694 2 DEBUG nova.compute.provider_tree [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.712 2 DEBUG nova.scheduler.client.report [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.746 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.778 2 INFO nova.scheduler.client.report [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Deleted allocations for instance 952a7e26-0b63-4309-a370-27b9715e05cb
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.873 2 DEBUG oslo_concurrency.lockutils [None req-01fedf84-cfd8-4bca-af1e-6d1a696457f5 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.903 2 DEBUG nova.compute.manager [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.904 2 DEBUG oslo_concurrency.lockutils [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.904 2 DEBUG oslo_concurrency.lockutils [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.904 2 DEBUG oslo_concurrency.lockutils [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "952a7e26-0b63-4309-a370-27b9715e05cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.904 2 DEBUG nova.compute.manager [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] No waiting events found dispatching network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.905 2 WARNING nova.compute.manager [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received unexpected event network-vif-plugged-7ad2d12a-572a-4586-93cf-9104c8937c76 for instance with vm_state deleted and task_state None.
Oct  8 12:47:59 np0005476733 nova_compute[192580]: 2025-10-08 16:47:59.905 2 DEBUG nova.compute.manager [req-0a0d0b85-a03a-4999-8925-01c7c2629869 req-8e88843b-6378-4a01-89bd-b606490ff0df 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Received event network-vif-deleted-7ad2d12a-572a-4586-93cf-9104c8937c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 12:48:00 np0005476733 nova_compute[192580]: 2025-10-08 16:48:00.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:48:01 np0005476733 nova_compute[192580]: 2025-10-08 16:48:01.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:48:02 np0005476733 nova_compute[192580]: 2025-10-08 16:48:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:48:04 np0005476733 podman[272631]: 2025-10-08 16:48:04.22103904 +0000 UTC m=+0.044596073 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:48:04 np0005476733 podman[272632]: 2025-10-08 16:48:04.230724269 +0000 UTC m=+0.051343238 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:48:04 np0005476733 podman[272630]: 2025-10-08 16:48:04.230969357 +0000 UTC m=+0.057470184 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct  8 12:48:04 np0005476733 nova_compute[192580]: 2025-10-08 16:48:04.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:48:05 np0005476733 nova_compute[192580]: 2025-10-08 16:48:05.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:48:05 np0005476733 nova_compute[192580]: 2025-10-08 16:48:05.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  8 12:48:05 np0005476733 nova_compute[192580]: 2025-10-08 16:48:05.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  8 12:48:05 np0005476733 nova_compute[192580]: 2025-10-08 16:48:05.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  8 12:48:06 np0005476733 nova_compute[192580]: 2025-10-08 16:48:06.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:48:09 np0005476733 nova_compute[192580]: 2025-10-08 16:48:09.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:48:11 np0005476733 nova_compute[192580]: 2025-10-08 16:48:11.062 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759942076.060114, 952a7e26-0b63-4309-a370-27b9715e05cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 12:48:11 np0005476733 nova_compute[192580]: 2025-10-08 16:48:11.062 2 INFO nova.compute.manager [-] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] VM Stopped (Lifecycle Event)
Oct  8 12:48:11 np0005476733 nova_compute[192580]: 2025-10-08 16:48:11.096 2 DEBUG nova.compute.manager [None req-65c6af50-633c-4084-a212-807fdbf4866b - - - - - -] [instance: 952a7e26-0b63-4309-a370-27b9715e05cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 12:48:11 np0005476733 nova_compute[192580]: 2025-10-08 16:48:11.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:48:11 np0005476733 podman[272689]: 2025-10-08 16:48:11.227839214 +0000 UTC m=+0.055217172 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 12:48:11 np0005476733 podman[272690]: 2025-10-08 16:48:11.227929727 +0000 UTC m=+0.050862153 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:48:11 np0005476733 nova_compute[192580]: 2025-10-08 16:48:11.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.760 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.762 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13647MB free_disk=111.31255340576172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.762 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.762 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.841 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.842 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:48:14 np0005476733 nova_compute[192580]: 2025-10-08 16:48:14.953 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:48:15 np0005476733 nova_compute[192580]: 2025-10-08 16:48:15.209 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:48:15 np0005476733 nova_compute[192580]: 2025-10-08 16:48:15.234 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:48:15 np0005476733 nova_compute[192580]: 2025-10-08 16:48:15.234 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:16 np0005476733 nova_compute[192580]: 2025-10-08 16:48:16.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:19 np0005476733 nova_compute[192580]: 2025-10-08 16:48:19.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:21 np0005476733 nova_compute[192580]: 2025-10-08 16:48:21.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:21 np0005476733 podman[272732]: 2025-10-08 16:48:21.225229453 +0000 UTC m=+0.050509011 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 12:48:23 np0005476733 nova_compute[192580]: 2025-10-08 16:48:23.235 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:23 np0005476733 podman[272751]: 2025-10-08 16:48:23.252073536 +0000 UTC m=+0.081710967 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:48:24 np0005476733 nova_compute[192580]: 2025-10-08 16:48:24.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:25 np0005476733 podman[272777]: 2025-10-08 16:48:25.221834107 +0000 UTC m=+0.057096932 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:48:26 np0005476733 nova_compute[192580]: 2025-10-08 16:48:26.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:26.419 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:26.419 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:26.420 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:29 np0005476733 nova_compute[192580]: 2025-10-08 16:48:29.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:31 np0005476733 nova_compute[192580]: 2025-10-08 16:48:31.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:34 np0005476733 nova_compute[192580]: 2025-10-08 16:48:34.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:35 np0005476733 podman[272800]: 2025-10-08 16:48:35.233004326 +0000 UTC m=+0.056410520 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:48:35 np0005476733 podman[272799]: 2025-10-08 16:48:35.250177624 +0000 UTC m=+0.078709641 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:48:35 np0005476733 podman[272798]: 2025-10-08 16:48:35.251192825 +0000 UTC m=+0.081146638 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:48:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:48:36 np0005476733 nova_compute[192580]: 2025-10-08 16:48:36.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:36 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:36Z|00170|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 12:48:39 np0005476733 nova_compute[192580]: 2025-10-08 16:48:39.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:41 np0005476733 nova_compute[192580]: 2025-10-08 16:48:41.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:42 np0005476733 podman[272863]: 2025-10-08 16:48:42.227897529 +0000 UTC m=+0.048958032 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:48:42 np0005476733 podman[272864]: 2025-10-08 16:48:42.228324273 +0000 UTC m=+0.044938144 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:48:44 np0005476733 nova_compute[192580]: 2025-10-08 16:48:44.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:46 np0005476733 nova_compute[192580]: 2025-10-08 16:48:46.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.492 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.493 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.507 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.580 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.580 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.587 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.587 2 INFO nova.compute.claims [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.742 2 DEBUG nova.compute.provider_tree [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.757 2 DEBUG nova.scheduler.client.report [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.789 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.790 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.839 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.840 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.855 2 INFO nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:48:47 np0005476733 nova_compute[192580]: 2025-10-08 16:48:47.875 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.003 2 DEBUG nova.policy [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.049 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.050 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.051 2 INFO nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Creating image(s)#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.051 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.052 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.052 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.064 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.143 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.144 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.145 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.158 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.217 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.218 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.294 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk 10737418240" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.295 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.296 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.350 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.352 2 DEBUG nova.objects.instance [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'migration_context' on Instance uuid 59ce9674-8997-4f60-b278-fca63264b284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.374 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.374 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Ensure instance console log exists: /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.375 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.375 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.375 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:48 np0005476733 nova_compute[192580]: 2025-10-08 16:48:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:49 np0005476733 nova_compute[192580]: 2025-10-08 16:48:49.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:50 np0005476733 nova_compute[192580]: 2025-10-08 16:48:50.650 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Successfully created port: 4d9e95a4-6e11-4c93-b8e6-862a11093b1c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:48:51 np0005476733 nova_compute[192580]: 2025-10-08 16:48:51.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:51Z|00171|pinctrl|WARN|Dropped 353 log messages in last 60 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 12:48:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:51Z|00172|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:48:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:51.316 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:48:51 np0005476733 nova_compute[192580]: 2025-10-08 16:48:51.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:51 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:51.318 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.124 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Successfully updated port: 4d9e95a4-6e11-4c93-b8e6-862a11093b1c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.152 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.152 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquired lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.153 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.266 2 DEBUG nova.compute.manager [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-changed-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.266 2 DEBUG nova.compute.manager [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Refreshing instance network info cache due to event network-changed-4d9e95a4-6e11-4c93-b8e6-862a11093b1c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.267 2 DEBUG oslo_concurrency.lockutils [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:48:52 np0005476733 podman[272919]: 2025-10-08 16:48:52.284940839 +0000 UTC m=+0.092095588 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  8 12:48:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:52.321 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:52 np0005476733 nova_compute[192580]: 2025-10-08 16:48:52.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:53 np0005476733 nova_compute[192580]: 2025-10-08 16:48:53.131 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:48:54 np0005476733 podman[272939]: 2025-10-08 16:48:54.308555268 +0000 UTC m=+0.134266093 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 12:48:54 np0005476733 nova_compute[192580]: 2025-10-08 16:48:54.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:48:54 np0005476733 nova_compute[192580]: 2025-10-08 16:48:54.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:48:54 np0005476733 nova_compute[192580]: 2025-10-08 16:48:54.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.356 2 DEBUG nova.network.neutron [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updating instance_info_cache with network_info: [{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.388 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Releasing lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.390 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Instance network_info: |[{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.391 2 DEBUG oslo_concurrency.lockutils [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.391 2 DEBUG nova.network.neutron [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Refreshing network info cache for port 4d9e95a4-6e11-4c93-b8e6-862a11093b1c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.398 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Start _get_guest_xml network_info=[{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.406 2 WARNING nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.415 2 DEBUG nova.virt.libvirt.host [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.416 2 DEBUG nova.virt.libvirt.host [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.420 2 DEBUG nova.virt.libvirt.host [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.421 2 DEBUG nova.virt.libvirt.host [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.421 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.422 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.422 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.423 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.423 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.423 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.423 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.424 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.424 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.424 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.425 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.425 2 DEBUG nova.virt.hardware [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.430 2 DEBUG nova.virt.libvirt.vif [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-14650699',id=107,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-fon0b8sx',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:48:47Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=59ce9674-8997-4f60-b278-fca63264b284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.431 2 DEBUG nova.network.os_vif_util [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.431 2 DEBUG nova.network.os_vif_util [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.433 2 DEBUG nova.objects.instance [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'pci_devices' on Instance uuid 59ce9674-8997-4f60-b278-fca63264b284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.455 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <uuid>59ce9674-8997-4f60-b278-fca63264b284</uuid>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <name>instance-0000006b</name>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962</nova:name>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:48:55</nova:creationTime>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:user uuid="bee18afeaf16419c98219491d4757b96">tempest-MulticastTestIPv4Common-2020042253-project-member</nova:user>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:project uuid="ddcd45556b9d4077968eee95f005487d">tempest-MulticastTestIPv4Common-2020042253</nova:project>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        <nova:port uuid="4d9e95a4-6e11-4c93-b8e6-862a11093b1c">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.224" ipVersion="4"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="serial">59ce9674-8997-4f60-b278-fca63264b284</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="uuid">59ce9674-8997-4f60-b278-fca63264b284</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.config"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:57:29:6c"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <target dev="tap4d9e95a4-6e"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/console.log" append="off"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:48:55 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:48:55 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:48:55 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:48:55 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.460 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Preparing to wait for external event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.461 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.462 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.462 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.463 2 DEBUG nova.virt.libvirt.vif [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-14650699',id=107,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-fon0b8sx',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:48:47Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=59ce9674-8997-4f60-b278-fca63264b284,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.463 2 DEBUG nova.network.os_vif_util [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.464 2 DEBUG nova.network.os_vif_util [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.464 2 DEBUG os_vif [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.470 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d9e95a4-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.470 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d9e95a4-6e, col_values=(('external_ids', {'iface-id': '4d9e95a4-6e11-4c93-b8e6-862a11093b1c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:29:6c', 'vm-uuid': '59ce9674-8997-4f60-b278-fca63264b284'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.483 2 INFO os_vif [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e')#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.536 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.536 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.536 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No VIF found with MAC fa:16:3e:57:29:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.537 2 INFO nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Using config drive#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.946 2 INFO nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Creating config drive at /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.config#033[00m
Oct  8 12:48:55 np0005476733 nova_compute[192580]: 2025-10-08 16:48:55.953 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfe3dpdt7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.085 2 DEBUG oslo_concurrency.processutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfe3dpdt7" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:48:56 np0005476733 kernel: tap4d9e95a4-6e: entered promiscuous mode
Oct  8 12:48:56 np0005476733 NetworkManager[51699]: <info>  [1759942136.1951] manager: (tap4d9e95a4-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Oct  8 12:48:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:56Z|00173|binding|INFO|Claiming lport 4d9e95a4-6e11-4c93-b8e6-862a11093b1c for this chassis.
Oct  8 12:48:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:56Z|00174|binding|INFO|4d9e95a4-6e11-4c93-b8e6-862a11093b1c: Claiming fa:16:3e:57:29:6c 192.168.122.224
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.204 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:29:6c 192.168.122.224'], port_security=['fa:16:3e:57:29:6c 192.168.122.224'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.224/24', 'neutron:device_id': '59ce9674-8997-4f60-b278-fca63264b284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4d9e95a4-6e11-4c93-b8e6-862a11093b1c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.205 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4d9e95a4-6e11-4c93-b8e6-862a11093b1c in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.207 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.224 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3396459a-3578-4e7e-a741-252a50028f22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.225 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:48:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:56Z|00175|binding|INFO|Setting lport 4d9e95a4-6e11-4c93-b8e6-862a11093b1c up in Southbound
Oct  8 12:48:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:56Z|00176|binding|INFO|Setting lport 4d9e95a4-6e11-4c93-b8e6-862a11093b1c ovn-installed in OVS
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.230 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.230 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9332e9cd-fc37-4c6f-8936-48fc8f4bc611]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.231 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c52b33ec-e670-4801-89bc-a5b8a5198b5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 systemd-udevd[272991]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.246 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[861ac222-83fc-4323-8eb7-759665980147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 systemd-machined[152624]: New machine qemu-65-instance-0000006b.
Oct  8 12:48:56 np0005476733 NetworkManager[51699]: <info>  [1759942136.2666] device (tap4d9e95a4-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:48:56 np0005476733 NetworkManager[51699]: <info>  [1759942136.2678] device (tap4d9e95a4-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.270 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[df2e49c9-4bae-4110-b77f-89383a799806]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 systemd[1]: Started Virtual Machine qemu-65-instance-0000006b.
Oct  8 12:48:56 np0005476733 podman[272975]: 2025-10-08 16:48:56.298502973 +0000 UTC m=+0.107522709 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.313 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[ad72a7cb-fb2b-4b5d-86da-60f2033a36a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.320 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2416b881-82f7-4b69-8f11-3a4de697b1fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 systemd-udevd[273005]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:48:56 np0005476733 NetworkManager[51699]: <info>  [1759942136.3218] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/322)
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.360 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e2bfbc-b45f-43e0-987c-9e03b063d0d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.365 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8fec17dd-f786-45f3-a36e-a524f2d35531]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 NetworkManager[51699]: <info>  [1759942136.3907] device (tap81c575b5-a0): carrier: link connected
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.397 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f4ce75-98d3-4b04-a4bb-61daf53dffcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.425 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e7de8bf9-5261-4c81-b2a1-d0afc3c1c41e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 904005, 'reachable_time': 43569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273036, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.434 2 DEBUG nova.compute.manager [req-a51faa4d-04aa-46f6-8597-3f0ac33fe74d req-d6a5cb55-869c-40f8-9bc9-256d4afb7081 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.435 2 DEBUG oslo_concurrency.lockutils [req-a51faa4d-04aa-46f6-8597-3f0ac33fe74d req-d6a5cb55-869c-40f8-9bc9-256d4afb7081 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.435 2 DEBUG oslo_concurrency.lockutils [req-a51faa4d-04aa-46f6-8597-3f0ac33fe74d req-d6a5cb55-869c-40f8-9bc9-256d4afb7081 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.435 2 DEBUG oslo_concurrency.lockutils [req-a51faa4d-04aa-46f6-8597-3f0ac33fe74d req-d6a5cb55-869c-40f8-9bc9-256d4afb7081 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.436 2 DEBUG nova.compute.manager [req-a51faa4d-04aa-46f6-8597-3f0ac33fe74d req-d6a5cb55-869c-40f8-9bc9-256d4afb7081 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Processing event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.444 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ce89c8-7988-450f-9fe3-c20e25d54b18]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 904005, 'tstamp': 904005}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273037, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.472 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0401c3-ea10-4b5f-8f82-66a16c3dc9bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 904005, 'reachable_time': 43569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273038, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.511 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d8c58d-4528-4e75-bba4-54aa690e9a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.592 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6250fa-76d0-4137-9d28-7668ce5a72cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.595 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.595 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.595 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:56 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.601 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:48:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:48:56Z|00177|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.604 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.605 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[40e9803e-0995-4cb7-a735-76a0d84839c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.606 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:48:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:48:56.607 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:48:56 np0005476733 nova_compute[192580]: 2025-10-08 16:48:56.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:48:56 np0005476733 podman[273077]: 2025-10-08 16:48:56.973427725 +0000 UTC m=+0.052140343 container create d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:48:57 np0005476733 systemd[1]: Started libpod-conmon-d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f.scope.
Oct  8 12:48:57 np0005476733 podman[273077]: 2025-10-08 16:48:56.944221305 +0000 UTC m=+0.022933893 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:48:57 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:48:57 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82a46e2803d8a90c05526d872210107cad72011c40ba11292030ffdfadec30b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:48:57 np0005476733 podman[273077]: 2025-10-08 16:48:57.084411165 +0000 UTC m=+0.163123773 container init d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  8 12:48:57 np0005476733 podman[273077]: 2025-10-08 16:48:57.089880909 +0000 UTC m=+0.168593487 container start d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.116 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942137.1159086, 59ce9674-8997-4f60-b278-fca63264b284 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.118 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] VM Started (Lifecycle Event)#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.120 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.127 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.131 2 INFO nova.virt.libvirt.driver [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Instance spawned successfully.#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.132 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:48:57 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [NOTICE]   (273096) : New worker (273098) forked
Oct  8 12:48:57 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [NOTICE]   (273096) : Loading success.
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.156 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.164 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.167 2 DEBUG nova.network.neutron [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updated VIF entry in instance network info cache for port 4d9e95a4-6e11-4c93-b8e6-862a11093b1c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.168 2 DEBUG nova.network.neutron [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updating instance_info_cache with network_info: [{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.171 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.172 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.173 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.173 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.174 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.174 2 DEBUG nova.virt.libvirt.driver [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.208 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.209 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942137.1162274, 59ce9674-8997-4f60-b278-fca63264b284 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.209 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.212 2 DEBUG oslo_concurrency.lockutils [req-043dc586-b57c-40ea-a463-b45d8fefbd5f req-0f4f6113-d1a6-4cef-b694-f6299c94f735 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.240 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.244 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942137.1232853, 59ce9674-8997-4f60-b278-fca63264b284 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.244 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.252 2 INFO nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Took 9.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.253 2 DEBUG nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.264 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.267 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.306 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.354 2 INFO nova.compute.manager [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Took 9.80 seconds to build instance.#033[00m
Oct  8 12:48:57 np0005476733 nova_compute[192580]: 2025-10-08 16:48:57.372 2 DEBUG oslo_concurrency.lockutils [None req-a850efd4-4828-4dc1-bbff-1971113c3acf bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.546 2 DEBUG nova.compute.manager [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.548 2 DEBUG oslo_concurrency.lockutils [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.548 2 DEBUG oslo_concurrency.lockutils [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.549 2 DEBUG oslo_concurrency.lockutils [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.549 2 DEBUG nova.compute.manager [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] No waiting events found dispatching network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:48:58 np0005476733 nova_compute[192580]: 2025-10-08 16:48:58.549 2 WARNING nova.compute.manager [req-1ad212d5-ef75-4e14-a522-cbbf784b6fbc req-17d17c81-4d32-4c32-9e42-6a068ae8c5bd 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received unexpected event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c for instance with vm_state active and task_state None.#033[00m
Oct  8 12:48:59 np0005476733 nova_compute[192580]: 2025-10-08 16:48:59.240 2 INFO nova.compute.manager [None req-9a034b4b-789a-428b-b091-1a95edcc7883 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:48:59 np0005476733 nova_compute[192580]: 2025-10-08 16:48:59.245 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:48:59 np0005476733 nova_compute[192580]: 2025-10-08 16:48:59.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:00 np0005476733 nova_compute[192580]: 2025-10-08 16:49:00.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:01 np0005476733 nova_compute[192580]: 2025-10-08 16:49:01.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:03 np0005476733 nova_compute[192580]: 2025-10-08 16:49:03.593 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:04 np0005476733 nova_compute[192580]: 2025-10-08 16:49:04.386 2 INFO nova.compute.manager [None req-5150b335-99d0-499a-9891-f0adc1d6abf3 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:04 np0005476733 nova_compute[192580]: 2025-10-08 16:49:04.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.813 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.814 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.815 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:49:05 np0005476733 nova_compute[192580]: 2025-10-08 16:49:05.815 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 59ce9674-8997-4f60-b278-fca63264b284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:49:06 np0005476733 podman[273107]: 2025-10-08 16:49:06.250945658 +0000 UTC m=+0.081293203 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 12:49:06 np0005476733 podman[273109]: 2025-10-08 16:49:06.279194759 +0000 UTC m=+0.092482690 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6)
Oct  8 12:49:06 np0005476733 podman[273108]: 2025-10-08 16:49:06.289302152 +0000 UTC m=+0.111299870 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:49:08 np0005476733 nova_compute[192580]: 2025-10-08 16:49:08.161 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updating instance_info_cache with network_info: [{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:49:08 np0005476733 nova_compute[192580]: 2025-10-08 16:49:08.180 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:49:08 np0005476733 nova_compute[192580]: 2025-10-08 16:49:08.181 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:49:09 np0005476733 nova_compute[192580]: 2025-10-08 16:49:09.571 2 INFO nova.compute.manager [None req-e9ac9036-8119-4473-ba36-7ca6b5a8ca98 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:09 np0005476733 nova_compute[192580]: 2025-10-08 16:49:09.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:10 np0005476733 nova_compute[192580]: 2025-10-08 16:49:10.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:12 np0005476733 nova_compute[192580]: 2025-10-08 16:49:12.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:13 np0005476733 podman[273179]: 2025-10-08 16:49:13.270915441 +0000 UTC m=+0.078098931 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:49:13 np0005476733 podman[273178]: 2025-10-08 16:49:13.281332863 +0000 UTC m=+0.089603978 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.624 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.709 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.740 2 INFO nova.compute.manager [None req-d4597d7a-db33-43e8-8ce7-86208f30a4aa bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.746 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.773 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.774 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.840 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.989 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.991 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13243MB free_disk=111.31079864501953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.991 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:14 np0005476733 nova_compute[192580]: 2025-10-08 16:49:14.992 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.090 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 59ce9674-8997-4f60-b278-fca63264b284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.091 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.091 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.143 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.160 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.194 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.195 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:15 np0005476733 nova_compute[192580]: 2025-10-08 16:49:15.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:19 np0005476733 nova_compute[192580]: 2025-10-08 16:49:19.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:19 np0005476733 nova_compute[192580]: 2025-10-08 16:49:19.895 2 INFO nova.compute.manager [None req-07037260-dcab-4abe-bd2e-d0a5328b8f91 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:19 np0005476733 nova_compute[192580]: 2025-10-08 16:49:19.900 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:20 np0005476733 nova_compute[192580]: 2025-10-08 16:49:20.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:21 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:21Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:29:6c 192.168.122.224
Oct  8 12:49:21 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:21Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:29:6c 192.168.122.224
Oct  8 12:49:23 np0005476733 nova_compute[192580]: 2025-10-08 16:49:23.194 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:23 np0005476733 podman[273228]: 2025-10-08 16:49:23.23634883 +0000 UTC m=+0.060768399 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  8 12:49:24 np0005476733 nova_compute[192580]: 2025-10-08 16:49:24.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:25 np0005476733 nova_compute[192580]: 2025-10-08 16:49:25.074 2 INFO nova.compute.manager [None req-5071d5fb-2b43-4191-bc4d-bc2dbc348f32 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:25 np0005476733 nova_compute[192580]: 2025-10-08 16:49:25.079 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:25 np0005476733 podman[273247]: 2025-10-08 16:49:25.293293633 +0000 UTC m=+0.116496387 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:49:25 np0005476733 nova_compute[192580]: 2025-10-08 16:49:25.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:26 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:26Z|00178|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 12:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:26.422 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:26.423 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:26.423 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:27 np0005476733 podman[273273]: 2025-10-08 16:49:27.2839338 +0000 UTC m=+0.097397267 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 12:49:29 np0005476733 nova_compute[192580]: 2025-10-08 16:49:29.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:30 np0005476733 nova_compute[192580]: 2025-10-08 16:49:30.311 2 INFO nova.compute.manager [None req-e7542e0d-8819-4668-a6a0-f722505f777c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:30 np0005476733 nova_compute[192580]: 2025-10-08 16:49:30.318 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:30 np0005476733 nova_compute[192580]: 2025-10-08 16:49:30.323 2 INFO nova.virt.libvirt.driver [None req-e7542e0d-8819-4668-a6a0-f722505f777c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Truncated console log returned, 3162 bytes ignored#033[00m
Oct  8 12:49:30 np0005476733 nova_compute[192580]: 2025-10-08 16:49:30.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:30 np0005476733 nova_compute[192580]: 2025-10-08 16:49:30.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:34 np0005476733 nova_compute[192580]: 2025-10-08 16:49:34.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:35 np0005476733 nova_compute[192580]: 2025-10-08 16:49:35.506 2 INFO nova.compute.manager [None req-a55a968f-3c6b-4215-bf05-348a2c448999 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Get console output#033[00m
Oct  8 12:49:35 np0005476733 nova_compute[192580]: 2025-10-08 16:49:35.513 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:35 np0005476733 nova_compute[192580]: 2025-10-08 16:49:35.516 2 INFO nova.virt.libvirt.driver [None req-a55a968f-3c6b-4215-bf05-348a2c448999 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Truncated console log returned, 3388 bytes ignored#033[00m
Oct  8 12:49:35 np0005476733 nova_compute[192580]: 2025-10-08 16:49:35.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:37 np0005476733 podman[273308]: 2025-10-08 16:49:37.263183551 +0000 UTC m=+0.083287477 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 12:49:37 np0005476733 podman[273310]: 2025-10-08 16:49:37.263507531 +0000 UTC m=+0.070895522 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, distribution-scope=public)
Oct  8 12:49:37 np0005476733 podman[273309]: 2025-10-08 16:49:37.271128194 +0000 UTC m=+0.086450108 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:49:39 np0005476733 nova_compute[192580]: 2025-10-08 16:49:39.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.624 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.625 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.657 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.769 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.770 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.782 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.783 2 INFO nova.compute.claims [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.952 2 DEBUG nova.compute.provider_tree [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.969 2 DEBUG nova.scheduler.client.report [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.997 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:40 np0005476733 nova_compute[192580]: 2025-10-08 16:49:40.998 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.058 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.059 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.086 2 INFO nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.114 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.260 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.262 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.263 2 INFO nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Creating image(s)#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.264 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.264 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.265 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.285 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.364 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.365 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.366 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.379 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.447 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.449 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.490 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk 10737418240" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.492 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.492 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.567 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.569 2 DEBUG nova.objects.instance [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'migration_context' on Instance uuid cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.602 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.603 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Ensure instance console log exists: /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.603 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.604 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:41 np0005476733 nova_compute[192580]: 2025-10-08 16:49:41.604 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:42 np0005476733 nova_compute[192580]: 2025-10-08 16:49:42.210 2 DEBUG nova.policy [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:49:44 np0005476733 podman[273387]: 2025-10-08 16:49:44.245170152 +0000 UTC m=+0.066336916 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:49:44 np0005476733 podman[273388]: 2025-10-08 16:49:44.2479374 +0000 UTC m=+0.058325600 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:49:44 np0005476733 nova_compute[192580]: 2025-10-08 16:49:44.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:45.200 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:49:45 np0005476733 nova_compute[192580]: 2025-10-08 16:49:45.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:45.203 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:49:45 np0005476733 nova_compute[192580]: 2025-10-08 16:49:45.369 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Successfully created port: 9f60af9b-fd44-4445-8f3e-2548a718dac7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:49:45 np0005476733 nova_compute[192580]: 2025-10-08 16:49:45.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.367 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Successfully updated port: 9f60af9b-fd44-4445-8f3e-2548a718dac7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.383 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.383 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquired lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.384 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.466 2 DEBUG nova.compute.manager [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-changed-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.467 2 DEBUG nova.compute.manager [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Refreshing instance network info cache due to event network-changed-9f60af9b-fd44-4445-8f3e-2548a718dac7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:49:46 np0005476733 nova_compute[192580]: 2025-10-08 16:49:46.467 2 DEBUG oslo_concurrency.lockutils [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:49:47 np0005476733 nova_compute[192580]: 2025-10-08 16:49:47.178 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:49:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:48.206 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.520 2 DEBUG nova.network.neutron [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updating instance_info_cache with network_info: [{"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.543 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Releasing lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.544 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Instance network_info: |[{"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.544 2 DEBUG oslo_concurrency.lockutils [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.545 2 DEBUG nova.network.neutron [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Refreshing network info cache for port 9f60af9b-fd44-4445-8f3e-2548a718dac7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.548 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Start _get_guest_xml network_info=[{"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.555 2 WARNING nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.565 2 DEBUG nova.virt.libvirt.host [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.566 2 DEBUG nova.virt.libvirt.host [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.570 2 DEBUG nova.virt.libvirt.host [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.571 2 DEBUG nova.virt.libvirt.host [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.572 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.572 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.572 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.573 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.573 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.573 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.573 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.573 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.574 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.574 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.574 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.574 2 DEBUG nova.virt.hardware [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.579 2 DEBUG nova.virt.libvirt.vif [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:49:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-33195906',id=108,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-lzinfukz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:49:41Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.579 2 DEBUG nova.network.os_vif_util [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.580 2 DEBUG nova.network.os_vif_util [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.581 2 DEBUG nova.objects.instance [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'pci_devices' on Instance uuid cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.599 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <uuid>cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44</uuid>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <name>instance-0000006c</name>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068</nova:name>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:49:48</nova:creationTime>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:user uuid="bee18afeaf16419c98219491d4757b96">tempest-MulticastTestIPv4Common-2020042253-project-member</nova:user>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:project uuid="ddcd45556b9d4077968eee95f005487d">tempest-MulticastTestIPv4Common-2020042253</nova:project>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        <nova:port uuid="9f60af9b-fd44-4445-8f3e-2548a718dac7">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.226" ipVersion="4"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="serial">cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="uuid">cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.config"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:f8:3f:1b"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <target dev="tap9f60af9b-fd"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/console.log" append="off"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:49:48 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:49:48 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:49:48 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:49:48 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.601 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Preparing to wait for external event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.602 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.602 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.603 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.604 2 DEBUG nova.virt.libvirt.vif [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:49:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-33195906',id=108,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-lzinfukz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:49:41Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.605 2 DEBUG nova.network.os_vif_util [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.606 2 DEBUG nova.network.os_vif_util [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.607 2 DEBUG os_vif [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.609 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.614 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f60af9b-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.615 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f60af9b-fd, col_values=(('external_ids', {'iface-id': '9f60af9b-fd44-4445-8f3e-2548a718dac7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:3f:1b', 'vm-uuid': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.629 2 INFO os_vif [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd')
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.689 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.689 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.689 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] No VIF found with MAC fa:16:3e:f8:3f:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  8 12:49:48 np0005476733 nova_compute[192580]: 2025-10-08 16:49:48.690 2 INFO nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Using config drive
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.149 2 INFO nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Creating config drive at /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.config
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.157 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx_c8omru execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.287 2 DEBUG oslo_concurrency.processutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx_c8omru" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:49:49 np0005476733 kernel: tap9f60af9b-fd: entered promiscuous mode
Oct  8 12:49:49 np0005476733 NetworkManager[51699]: <info>  [1759942189.3773] manager: (tap9f60af9b-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Oct  8 12:49:49 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:49Z|00179|binding|INFO|Claiming lport 9f60af9b-fd44-4445-8f3e-2548a718dac7 for this chassis.
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:49 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:49Z|00180|binding|INFO|9f60af9b-fd44-4445-8f3e-2548a718dac7: Claiming fa:16:3e:f8:3f:1b 192.168.122.226
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.387 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:3f:1b 192.168.122.226'], port_security=['fa:16:3e:f8:3f:1b 192.168.122.226'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.226/24', 'neutron:device_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=9f60af9b-fd44-4445-8f3e-2548a718dac7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.388 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 9f60af9b-fd44-4445-8f3e-2548a718dac7 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.389 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:49:49 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:49Z|00181|binding|INFO|Setting lport 9f60af9b-fd44-4445-8f3e-2548a718dac7 ovn-installed in OVS
Oct  8 12:49:49 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:49Z|00182|binding|INFO|Setting lport 9f60af9b-fd44-4445-8f3e-2548a718dac7 up in Southbound
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:49:49 np0005476733 systemd-udevd[273460]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.417 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[305876a9-ca74-4b22-8028-46085b7fa0f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:49:49 np0005476733 systemd-machined[152624]: New machine qemu-66-instance-0000006c.
Oct  8 12:49:49 np0005476733 NetworkManager[51699]: <info>  [1759942189.4301] device (tap9f60af9b-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:49:49 np0005476733 NetworkManager[51699]: <info>  [1759942189.4312] device (tap9f60af9b-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:49:49 np0005476733 systemd[1]: Started Virtual Machine qemu-66-instance-0000006c.
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.453 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[66e79b9d-56a3-4a16-a2c9-809f678d75f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.457 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[95f06f44-b3ec-44a1-bd59-6494fa09189e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.490 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[836cd531-7930-42bc-9c7e-dfcb3303d5fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.513 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[32547d08-7a7b-4023-9eb7-5b000ea03692]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 5, 'rx_bytes': 1210, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 5, 'rx_bytes': 1210, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 904005, 'reachable_time': 43569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273473, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.531 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[101fc78d-abf7-4682-bffd-9b54016f9ba0]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 904020, 'tstamp': 904020}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273474, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.122.171'], ['IFA_LOCAL', '192.168.122.171'], ['IFA_BROADCAST', '192.168.122.255'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 904024, 'tstamp': 904024}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273474, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.532 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.579 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.580 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.580 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:49:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:49:49.581 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.615 2 DEBUG nova.compute.manager [req-421a8d59-fb6d-4515-b40e-bbe1042341f9 req-8313eb7b-a3e2-461e-8fee-c6b6cc607dd3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.616 2 DEBUG oslo_concurrency.lockutils [req-421a8d59-fb6d-4515-b40e-bbe1042341f9 req-8313eb7b-a3e2-461e-8fee-c6b6cc607dd3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.616 2 DEBUG oslo_concurrency.lockutils [req-421a8d59-fb6d-4515-b40e-bbe1042341f9 req-8313eb7b-a3e2-461e-8fee-c6b6cc607dd3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.617 2 DEBUG oslo_concurrency.lockutils [req-421a8d59-fb6d-4515-b40e-bbe1042341f9 req-8313eb7b-a3e2-461e-8fee-c6b6cc607dd3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.617 2 DEBUG nova.compute.manager [req-421a8d59-fb6d-4515-b40e-bbe1042341f9 req-8313eb7b-a3e2-461e-8fee-c6b6cc607dd3 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Processing event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:49:49 np0005476733 nova_compute[192580]: 2025-10-08 16:49:49.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.290 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942190.2890599, cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.290 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] VM Started (Lifecycle Event)#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.293 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.298 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.303 2 INFO nova.virt.libvirt.driver [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Instance spawned successfully.#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.303 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.318 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.325 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.328 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.329 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.329 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.330 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.330 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.331 2 DEBUG nova.virt.libvirt.driver [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.358 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.358 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942190.289711, cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.358 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.387 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.391 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942190.296814, cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.392 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.403 2 INFO nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Took 9.14 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.404 2 DEBUG nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.436 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.441 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.470 2 DEBUG nova.network.neutron [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updated VIF entry in instance network info cache for port 9f60af9b-fd44-4445-8f3e-2548a718dac7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.471 2 DEBUG nova.network.neutron [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updating instance_info_cache with network_info: [{"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.493 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.503 2 DEBUG oslo_concurrency.lockutils [req-b98ae7f1-69cb-4c69-9c94-09df7f62a244 req-258f4ba6-de97-4e66-ba05-5b1995bbe044 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.505 2 INFO nova.compute.manager [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Took 9.79 seconds to build instance.#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.530 2 DEBUG oslo_concurrency.lockutils [None req-27ad0665-52b2-4762-957d-6a4c518d0147 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:50 np0005476733 nova_compute[192580]: 2025-10-08 16:49:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.699 2 DEBUG nova.compute.manager [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.700 2 DEBUG oslo_concurrency.lockutils [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.700 2 DEBUG oslo_concurrency.lockutils [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.700 2 DEBUG oslo_concurrency.lockutils [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.700 2 DEBUG nova.compute.manager [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] No waiting events found dispatching network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:49:51 np0005476733 nova_compute[192580]: 2025-10-08 16:49:51.701 2 WARNING nova.compute.manager [req-06d51921-7784-4099-bea2-553909647543 req-6ef9ace2-7632-4648-9723-ae5658114b75 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received unexpected event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:49:52 np0005476733 nova_compute[192580]: 2025-10-08 16:49:52.467 2 INFO nova.compute.manager [None req-78d7bd59-9a10-4c89-8f40-77f3eca97104 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:49:52 np0005476733 nova_compute[192580]: 2025-10-08 16:49:52.475 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:52 np0005476733 nova_compute[192580]: 2025-10-08 16:49:52.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:53 np0005476733 nova_compute[192580]: 2025-10-08 16:49:53.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:54 np0005476733 podman[273484]: 2025-10-08 16:49:54.291688905 +0000 UTC m=+0.097531721 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:49:54 np0005476733 nova_compute[192580]: 2025-10-08 16:49:54.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:55 np0005476733 nova_compute[192580]: 2025-10-08 16:49:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:49:55 np0005476733 nova_compute[192580]: 2025-10-08 16:49:55.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:49:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:56Z|00183|pinctrl|WARN|Dropped 275 log messages in last 65 seconds (most recently, 5 seconds ago) due to excessive rate
Oct  8 12:49:56 np0005476733 ovn_controller[263831]: 2025-10-08T16:49:56Z|00184|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:49:56 np0005476733 podman[273507]: 2025-10-08 16:49:56.281841158 +0000 UTC m=+0.108426469 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:49:57 np0005476733 nova_compute[192580]: 2025-10-08 16:49:57.639 2 INFO nova.compute.manager [None req-437a1ca3-d4d0-4a25-bee7-226217f081cb bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:49:57 np0005476733 nova_compute[192580]: 2025-10-08 16:49:57.645 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:49:58 np0005476733 podman[273533]: 2025-10-08 16:49:58.234455843 +0000 UTC m=+0.068038111 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm)
Oct  8 12:49:58 np0005476733 nova_compute[192580]: 2025-10-08 16:49:58.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:49:59 np0005476733 nova_compute[192580]: 2025-10-08 16:49:59.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:02 np0005476733 nova_compute[192580]: 2025-10-08 16:50:02.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:02 np0005476733 nova_compute[192580]: 2025-10-08 16:50:02.803 2 INFO nova.compute.manager [None req-c5ec1da8-1c00-4c25-a204-6f05521484e1 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:02 np0005476733 nova_compute[192580]: 2025-10-08 16:50:02.810 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:03 np0005476733 nova_compute[192580]: 2025-10-08 16:50:03.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:04 np0005476733 nova_compute[192580]: 2025-10-08 16:50:04.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:05 np0005476733 nova_compute[192580]: 2025-10-08 16:50:05.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:06 np0005476733 nova_compute[192580]: 2025-10-08 16:50:06.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:06 np0005476733 nova_compute[192580]: 2025-10-08 16:50:06.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:50:06 np0005476733 nova_compute[192580]: 2025-10-08 16:50:06.592 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.196 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.197 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.197 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.198 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 59ce9674-8997-4f60-b278-fca63264b284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.981 2 INFO nova.compute.manager [None req-6ee38fc6-da93-41eb-a065-9657c5ebbbe7 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:07 np0005476733 nova_compute[192580]: 2025-10-08 16:50:07.986 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:08 np0005476733 podman[273562]: 2025-10-08 16:50:08.231528671 +0000 UTC m=+0.056713740 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:50:08 np0005476733 podman[273563]: 2025-10-08 16:50:08.234251578 +0000 UTC m=+0.055874293 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Oct  8 12:50:08 np0005476733 podman[273561]: 2025-10-08 16:50:08.245887919 +0000 UTC m=+0.072641077 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  8 12:50:08 np0005476733 nova_compute[192580]: 2025-10-08 16:50:08.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:09 np0005476733 nova_compute[192580]: 2025-10-08 16:50:09.149 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updating instance_info_cache with network_info: [{"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:50:09 np0005476733 nova_compute[192580]: 2025-10-08 16:50:09.176 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-59ce9674-8997-4f60-b278-fca63264b284" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:50:09 np0005476733 nova_compute[192580]: 2025-10-08 16:50:09.176 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:50:09 np0005476733 nova_compute[192580]: 2025-10-08 16:50:09.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:13 np0005476733 nova_compute[192580]: 2025-10-08 16:50:13.131 2 INFO nova.compute.manager [None req-10298e09-8c42-4b7a-8f5d-e47a999ca3f4 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:13 np0005476733 nova_compute[192580]: 2025-10-08 16:50:13.136 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:13 np0005476733 nova_compute[192580]: 2025-10-08 16:50:13.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:14 np0005476733 nova_compute[192580]: 2025-10-08 16:50:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:14 np0005476733 nova_compute[192580]: 2025-10-08 16:50:14.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:15 np0005476733 podman[273624]: 2025-10-08 16:50:15.225172855 +0000 UTC m=+0.054623232 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:50:15 np0005476733 podman[273625]: 2025-10-08 16:50:15.235520625 +0000 UTC m=+0.058486286 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:50:15 np0005476733 ovn_controller[263831]: 2025-10-08T16:50:15Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:3f:1b 192.168.122.226
Oct  8 12:50:15 np0005476733 ovn_controller[263831]: 2025-10-08T16:50:15Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:3f:1b 192.168.122.226
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.621 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.719 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.775 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.777 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.841 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.848 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.910 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.911 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:50:15 np0005476733 nova_compute[192580]: 2025-10-08 16:50:15.989 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.137 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.140 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12317MB free_disk=111.1520004272461GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.140 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.141 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.231 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 59ce9674-8997-4f60-b278-fca63264b284 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.231 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.232 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.232 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=2560MB phys_disk=119GB used_disk=20GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.290 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.314 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.338 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:50:16 np0005476733 nova_compute[192580]: 2025-10-08 16:50:16.338 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:50:18 np0005476733 nova_compute[192580]: 2025-10-08 16:50:18.322 2 INFO nova.compute.manager [None req-8c6d4c72-4f6f-4d89-8743-990d9efc2da7 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:18 np0005476733 nova_compute[192580]: 2025-10-08 16:50:18.328 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:18 np0005476733 nova_compute[192580]: 2025-10-08 16:50:18.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:19 np0005476733 ovn_controller[263831]: 2025-10-08T16:50:19Z|00185|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Oct  8 12:50:19 np0005476733 nova_compute[192580]: 2025-10-08 16:50:19.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:23 np0005476733 nova_compute[192580]: 2025-10-08 16:50:23.340 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:23 np0005476733 nova_compute[192580]: 2025-10-08 16:50:23.547 2 INFO nova.compute.manager [None req-43fdb295-c2fd-4fdc-b678-d271c19c5e8e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:23 np0005476733 nova_compute[192580]: 2025-10-08 16:50:23.552 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:23 np0005476733 nova_compute[192580]: 2025-10-08 16:50:23.554 2 INFO nova.virt.libvirt.driver [None req-43fdb295-c2fd-4fdc-b678-d271c19c5e8e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Truncated console log returned, 3218 bytes ignored#033[00m
Oct  8 12:50:23 np0005476733 nova_compute[192580]: 2025-10-08 16:50:23.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:24 np0005476733 nova_compute[192580]: 2025-10-08 16:50:24.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:25 np0005476733 podman[273683]: 2025-10-08 16:50:25.222866804 +0000 UTC m=+0.051961969 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:50:26.425 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:50:26.426 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:50:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:50:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:50:27 np0005476733 podman[273711]: 2025-10-08 16:50:27.255836251 +0000 UTC m=+0.082366628 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 12:50:28 np0005476733 nova_compute[192580]: 2025-10-08 16:50:28.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:28 np0005476733 nova_compute[192580]: 2025-10-08 16:50:28.727 2 INFO nova.compute.manager [None req-14a46398-0769-499d-a039-e694e0aa46d0 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Get console output#033[00m
Oct  8 12:50:28 np0005476733 nova_compute[192580]: 2025-10-08 16:50:28.732 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:50:28 np0005476733 nova_compute[192580]: 2025-10-08 16:50:28.736 2 INFO nova.virt.libvirt.driver [None req-14a46398-0769-499d-a039-e694e0aa46d0 bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Truncated console log returned, 3444 bytes ignored#033[00m
Oct  8 12:50:29 np0005476733 podman[273739]: 2025-10-08 16:50:29.229759566 +0000 UTC m=+0.061300065 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:50:29 np0005476733 nova_compute[192580]: 2025-10-08 16:50:29.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:33 np0005476733 nova_compute[192580]: 2025-10-08 16:50:33.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:34 np0005476733 nova_compute[192580]: 2025-10-08 16:50:34.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.080 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '59ce9674-8997-4f60-b278-fca63264b284', 'name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006b', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ddcd45556b9d4077968eee95f005487d', 'user_id': 'bee18afeaf16419c98219491d4757b96', 'hostId': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.085 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ddcd45556b9d4077968eee95f005487d', 'user_id': 'bee18afeaf16419c98219491d4757b96', 'hostId': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.102 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.102 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.115 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.116 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eeeb22f0-5625-4568-bd50-4a1ca08fc0a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.086104', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e995fae2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': '782482e9b7cc921ffded75311022c02cf954466fe0b49220b4a189db3cfb05d7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.086104', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9960d48-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': '689c0599e06d7c9913e3e9d9c91c62020906cae67918df68fba1949047f35e37'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.086104', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9981098-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': '67d62e75549d2dc2a921476f2e3126a0ef8b3bf1f983203373e3f55d62959f3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.086104', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e99820c4-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': 'd8455b0f8a44867072dc15168796ea8a39d93ec4bf850a951f14080b81c6bac4'}]}, 'timestamp': '2025-10-08 
16:50:36.116767', '_unique_id': 'aca0f27cb4b04844b733b161ec67d998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.120 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.120 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>]
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.120 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.120 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>]
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.121 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.121 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>]
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.125 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 59ce9674-8997-4f60-b278-fca63264b284 / tap4d9e95a4-6e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.125 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.outgoing.bytes volume: 36051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.128 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 / tap9f60af9b-fd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.128 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.outgoing.bytes volume: 26313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9396483e-7bc5-440c-a4ea-5cc7470dbb92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 36051, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.121801', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9998fb8-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '5081453bcbd0d412e817881ebc521a7e8e679b5acf24f19e94e5bbe5f6b079ae'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 
'counter_unit': 'B', 'counter_volume': 26313, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.121801', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e999f282-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '597bc273c4f041cf86283efc0fb5224fd713dc33eedaf178b3ae5d08e9f6a66c'}]}, 'timestamp': '2025-10-08 16:50:36.128696', '_unique_id': 'c195419990c64ac68a304eecaa1408a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.131 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.131 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '993d5003-1725-4357-b276-8456f405a52c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.131370', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e99a6a82-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '727a5eb71033479ee99ede487ffeca1fec5523bcb8fff1a746587dcc1ef87f93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 
'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.131370', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e99a782e-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': 'f741f53e9908b98670fb6d1918cd2d16a51015028c0706e93c35a31eac88426b'}]}, 'timestamp': '2025-10-08 16:50:36.132117', '_unique_id': '4594efcb4efb4560bf034085431cfcbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.132 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.134 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.134 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e5dbcac-71f4-4b44-8e24-eb42ca08d1af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.134169', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e99ad6f2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '50f10b8e1a2bac3f3364a0115bd2d357b527a54271731d58039018754cf5fc1c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 
'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.134169', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e99ae4a8-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '92b4277d2fbd9b9e6690c26776ffee080a73020dcea8284639fedd45288e8dc1'}]}, 'timestamp': '2025-10-08 16:50:36.134879', '_unique_id': 'fe4280aad5a84612a205000ffddc5f0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.157 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.latency volume: 7524177171 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.158 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.latency volume: 176426087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.177 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.latency volume: 9200449888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.178 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.latency volume: 59296941 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36c0043c-0335-41f8-9e32-04507138f60c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7524177171, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.136895', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e99e714a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '48d8578f6d1b2516842e2e0b9297a08b06d855a166e6f48232adfb85d882b4fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 176426087, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.136895', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e99e7ee2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '37fd6b85826778fe34bac59238eb47f30a21aa616f42140eac3fedea93b033d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9200449888, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.136895', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a17a34-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 'f1905817129e8c9a489e5c898c6d999568aed0785461a410b01e89e5c768b4e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59296941, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.136895', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a186a0-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'f08e8946aea153febc2298fff0e793ba3bff022117a2c22cce0fcf4becf99258'}]}, 'timestamp': '2025-10-08 16:50:36.178304', '_unique_id': 'aa2cdb5297fd42aabbd0d7788dc8d8f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.180 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.requests volume: 732 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.180 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.180 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.requests volume: 485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b677f1c-1a85-4dda-b239-af2bb65f97b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 732, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.180442', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a1e528-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': 'e64590c88e0108d96275c5460d648d4c3dfbf3807be5bd1c5c60ac5809c1bcbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.180442', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a1eee2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '6896c3a064466a2fc09572a450747037026152bfdc4075099cda4063749db1b6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 485, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.180442', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a1f89c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': '03a5dbb3e9c41220d2d8c17eabfe131955ec712e41997e852dcbe9d7b2a20a21'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.180442', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a201c0-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'897a9e5c49076e9e67c61f07f8c752c0067502887b94ad5a236317d8b07c7074'}]}, 'timestamp': '2025-10-08 16:50:36.181443', '_unique_id': '2c150f56e5ef43d59e6c12d342d41413'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.181 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.182 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.bytes volume: 135758336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.183 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.183 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.bytes volume: 144878592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.183 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c871c71b-35aa-4e1c-ad7d-c7e1eabd1f08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135758336, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.182963', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a2486a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '7327330d2916ac2caef6c79a1b50767d0dcc4ed1ac4e2c5311d5f086d83e96aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.182963', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a2526a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': 'a8311ccbc9280f590f17254b8044986ebad8e11791c26c4c9d1385f36e790b35'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 144878592, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.182963', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a25b7a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': '1fa3d67fe688f4b18ae077fb6c87f6a1a9dee29f8dc7b5bfbd6eb3f2c8b9d898'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.182963', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a26408-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'0528c9450e392b731ae06e19c1ba3fa19d0bf08a5d7166fcc2df73a235f1bf35'}]}, 'timestamp': '2025-10-08 16:50:36.183952', '_unique_id': 'eaa2abd87bbb48299aa53ad6bbad185f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.184 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.185 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.latency volume: 7262900872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.185 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.185 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.latency volume: 5981922609 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.186 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e913089-2a43-488b-a2e6-aa82d7f4c653', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7262900872, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.185461', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a2a882-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '8ba9b5a16c14c5d88deb1aab26b91e9a75a430c19efaa2851d2ffd76d7609703'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.185461', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a2b1a6-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '18b76cea9a7f88fd70576794c206d8b66a12c4d9f47106396e3e2921b4d83003'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5981922609, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.185461', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a2bb74-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 'bc35460347e348962cc88dac60321b45f480ceaf340c69e115995f3c73ac5ef7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.185461', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a2c45c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'90e735735d844264751d3c805f3e4a6dc5efa209bdac221d30f8ca92edb406bf'}]}, 'timestamp': '2025-10-08 16:50:36.186420', '_unique_id': 'e36da4442ee149e58a28289919bb7f96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.187 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.188 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.incoming.packets volume: 161 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.188 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.incoming.packets volume: 113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59055f41-f36b-4152-a009-82e167c17df2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 161, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.188171', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9a31308-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': 'e081d757a46ebdd2cb870cad21eba1c10a869e75af2084d121ab4896aa9fbb3b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 113, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.188171', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9a31c72-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '3ab6485bf960ea60ed08f3071d6aa2a8c8c555bd75e740fbdde51b5076c2c307'}]}, 'timestamp': '2025-10-08 16:50:36.188680', '_unique_id': 'c99fe1b559be4df7831ad7e4ec94a969'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.190 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.190 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.190 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.190 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fff86642-1218-4168-88ef-18bcc30bf835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.190097', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a3624a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': '55fd6a6e09e9b07f87f3c2e9d542847f5f57e8d1a1cc8758c53b847dd5dd0769'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.190097', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a36b64-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': 'd8094768975d33a2b51eb21d1b14a3d2853b1fc16c7bb98c8251b7c2c1e71584'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.190097', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9a3741a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': '85904e59f3d77a767fe95c9ae850e2d48946d82b63980ee28f65429d3c9cc6c6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.190097', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9a37da2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': 'cacdef9487bb4c4b52115f0feaa57426d586631826e05558ba0ba6c026a70f0e'}]}, 'timestamp': 
'2025-10-08 16:50:36.191186', '_unique_id': '8743406db70449eb883f1d6004974146'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.192 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.192 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dadf901-e5ce-4249-8aa4-2a8db6f4d257', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.192636', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9a3c0d2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': 'b68553898a25dd7c415a9b84ff488d45ac4a8511df152c95aa2c76c809f25935'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 
'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.192636', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9a3ca1e-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '689a57cc564c7526917816a8c5313b8851e26616ea037a0e6cab4688af8188a6'}]}, 'timestamp': '2025-10-08 16:50:36.193141', '_unique_id': 'ab9d25a1eb52404689fbba40695e7c06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.194 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.194 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cd1ca44-6c44-4b31-a06a-0115e34f5b45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.194695', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9a4112c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': 'f4535c1d42f404ff55b7d5531c1517af2f57bb23e585401d61aa6a5f8d81ae4f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 
'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.194695', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9a41c30-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': 'f64c1f38d63ac29b74f6a9413d1fae894caedab94ea3077218c4631465c71d68'}]}, 'timestamp': '2025-10-08 16:50:36.195228', '_unique_id': '1cc3fee4f7dc4cfe9a70eaee7cf167d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.195 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.196 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.225 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/memory.usage volume: 229.8125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.241 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/memory.usage volume: 318.13671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffe20c5d-e092-43fe-8302-fb9f7c885e41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 229.8125, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284', 'timestamp': '2025-10-08T16:50:36.196684', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'e9a8be52-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.94789165, 'message_signature': '07790883bd037cbe43f43b4c1a23a9d7f30b85a5f800618510c9ccda3a11a792'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 318.13671875, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 
'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'timestamp': '2025-10-08T16:50:36.196684', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'e9ab3268-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.964025704, 'message_signature': '4415e7eda6e2d009298f103ebc985e640dea6cc330f1a5370c5172af00a8d5ff'}]}, 'timestamp': '2025-10-08 16:50:36.241735', '_unique_id': 'd54d0c9284d04557aa8aa76e2b2f7fd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.243 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.bytes volume: 330892800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.244 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.244 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.bytes volume: 332064256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.244 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0ce6e0e-9871-477b-af25-73ad94951f86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 330892800, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.243824', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9ab9122-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '25968cc048ffbc871b0e86c134317087ac017220aa141dbab37fd13ef67850aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.243824', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9ab9cd0-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '5883022af26fc1ad6f81ff199cecb6e5a78c10a9179d3f3fac7a4131a799cbd0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 332064256, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.243824', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9aba626-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': '9c3bb9eeae555ffdfb0cf1742afb80b4e0e1e446e97bacc399d2b93d437c1706'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.243824', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9abaf04-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'398cc462300a7f8a80a67a0233660af3389fab42d6ccef89a98d173c4d2174fa'}]}, 'timestamp': '2025-10-08 16:50:36.244858', '_unique_id': '00b1228a2e234f379c1961208772af15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.245 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.247 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.outgoing.packets volume: 194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.247 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.outgoing.packets volume: 148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e78785d7-9c0f-4311-8368-9fd3df6c878d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 194, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.246987', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9ac1052-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '4ae89210fbda820fe338e6fb64442c420a6aa00c0b1fbcc657dfb3542f2f3911'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 148, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.246987', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9ac24ca-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '1716e8e64fcffe0dbf78b8853398d0a2040dd68fd9831ddd7a736cc09f0895b6'}]}, 'timestamp': '2025-10-08 16:50:36.247928', '_unique_id': '80410bbb6c1d4490818e67c5b56ed23c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.248 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.250 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.250 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f6f39bc-ec3f-44e8-8f0e-ff0afc5a303c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.250375', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9ac91ee-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '913c6204c539b6f6e453be555fd77194991032785d41e3f42a37f940745474cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 
'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.250375', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9ac9d9c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '358112369e5363a838bb88432d212ed5dbcf5065e537fd137fedc0b270a64a67'}]}, 'timestamp': '2025-10-08 16:50:36.250982', '_unique_id': '3d4a9faa183b4787a6f2ab7a780b616d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.251 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.252 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962>, <NovaLikeServer: tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068>]
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.253 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.requests volume: 11691 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.253 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.253 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.requests volume: 11709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.254 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '015c99e3-4a77-4fb8-9077-73cd934c1d8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11691, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.253377', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9ad05f2-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': 'ae231c8ee47f8326b42122f4ab94546b11a9a5e22098b00ea60155061ed0ead8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.253377', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9ad106a-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.859881303, 'message_signature': '0177c6b9af65cc5e2be689a358eb46c42b263aa8ddb22e909603149d68c31b1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11709, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.253377', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 
'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9ad1966-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': '60063811aa4a9b6015ffa7eb7a2d2e884a958db7b2b300308f47337f1999e8fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.253377', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9ad230c-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.881394289, 'message_signature': 
'96197d711b47fb36495ff2ddf0207a9b83c70c3bcd5f610f5729f2ccacdb11c1'}]}, 'timestamp': '2025-10-08 16:50:36.254383', '_unique_id': '8511c6debcc643589d93a6790e772a8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.255 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.256 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.usage volume: 152371200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.256 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.256 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.usage volume: 152502272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.256 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2026a84e-1907-4e7f-8c60-e20bdddf1fb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152371200, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-vda', 'timestamp': '2025-10-08T16:50:36.256165', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9ad72a8-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': '79f5be77707cddcea2516516153f36c22812b0b37a4be7df5267c843063abdca'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 
'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284-sda', 'timestamp': '2025-10-08T16:50:36.256165', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9ad7bf4-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.809081904, 'message_signature': 'c4afa186b2b3f9fc02bd59f203e20229ffbe9f3868446f4b83a456817f557d1b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152502272, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-vda', 'timestamp': '2025-10-08T16:50:36.256165', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 
'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'e9ad85ae-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': '3d44d60db860c233ae548283f63cb20602880bac0f7a940c21ef8f3c0f8881be'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-sda', 'timestamp': '2025-10-08T16:50:36.256165', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'e9ad8f22-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.826133747, 'message_signature': '8950d9f065f0c331a29a842080a386f402872e4541536042c8daddcbb83d04a6'}]}, 'timestamp': '2025-10-08 
16:50:36.257183', '_unique_id': '29ab8ec3ffde4e5692ac827247cdc689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.257 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.259 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.incoming.bytes volume: 29967 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.259 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.incoming.bytes volume: 19901 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2432bf05-4745-4f71-9c51-174eb684ce29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29967, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.259139', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9ade742-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': '883d0c9b87f0944c725dcc99f3a547f1be9fba110c7b549eb56c543008708458'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 
'counter_unit': 'B', 'counter_volume': 19901, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.259139', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9adf142-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': '304d355d393620ffc0cd714f496e3b5038164a49fd7fca1faf4fb27432d9b4fc'}]}, 'timestamp': '2025-10-08 16:50:36.259694', '_unique_id': '39a6c6e10dd447e6a8c472f639e17e2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.260 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.261 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.261 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df7d21d-9855-49d0-80ef-f59ea3514174', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006b-59ce9674-8997-4f60-b278-fca63264b284-tap4d9e95a4-6e', 'timestamp': '2025-10-08T16:50:36.261323', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'tap4d9e95a4-6e', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:57:29:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4d9e95a4-6e'}, 'message_id': 'e9ae3d14-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.844808163, 'message_signature': 'a920363af2fef48f655a3d9c455dce6ff9cc54ce117415bc2f4bb227e7d4ce4b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 
'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': 'instance-0000006c-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-tap9f60af9b-fd', 'timestamp': '2025-10-08T16:50:36.261323', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'tap9f60af9b-fd', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:f8:3f:1b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9f60af9b-fd'}, 'message_id': 'e9ae46ba-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.849156871, 'message_signature': 'a6d8a011a281c83eaa2f94a4418e588a2dd8525fcb228c0c4f7b4ad862e38945'}]}, 'timestamp': '2025-10-08 16:50:36.261853', '_unique_id': '2fbbd60832ec4e46a2a2645add7060fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.262 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.263 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.263 12 DEBUG ceilometer.compute.pollsters [-] 59ce9674-8997-4f60-b278-fca63264b284/cpu volume: 41430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.263 12 DEBUG ceilometer.compute.pollsters [-] cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44/cpu volume: 42520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88130c7b-d7f9-46b0-a832-ffa374f7299b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41430000000, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 'project_name': None, 'resource_id': '59ce9674-8997-4f60-b278-fca63264b284', 'timestamp': '2025-10-08T16:50:36.263373', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962', 'name': 'instance-0000006b', 'instance_id': '59ce9674-8997-4f60-b278-fca63264b284', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e9ae8cb0-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.94789165, 'message_signature': '431ffeeb477c63e583b080c93d4356fdd0f216757d0f2ee00e1e1931974ff62d'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42520000000, 'user_id': 'bee18afeaf16419c98219491d4757b96', 'user_name': None, 'project_id': 'ddcd45556b9d4077968eee95f005487d', 
'project_name': None, 'resource_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'timestamp': '2025-10-08T16:50:36.263373', 'resource_metadata': {'display_name': 'tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068', 'name': 'instance-0000006c', 'instance_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'instance_type': 'custom_neutron_guest', 'host': '8dcbbd82292ede28dc63e25b3fd362983e4a6210d4f8991656d7a15d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'e9ae98ae-a466-11f0-9274-fa163ef67048', 'monotonic_time': 9139.964025704, 'message_signature': 'd52ec911a1650e2d96c8d48dde086486f6f66e38cad032e2dc08019890610e25'}]}, 'timestamp': '2025-10-08 16:50:36.263967', '_unique_id': '126c1c357d0446f3a25a8d4c6d70e62d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:50:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:50:36.264 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:50:38 np0005476733 nova_compute[192580]: 2025-10-08 16:50:38.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:39 np0005476733 podman[273760]: 2025-10-08 16:50:39.246826831 +0000 UTC m=+0.063495456 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:50:39 np0005476733 podman[273759]: 2025-10-08 16:50:39.253869515 +0000 UTC m=+0.070421746 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:50:39 np0005476733 podman[273761]: 2025-10-08 16:50:39.284008326 +0000 UTC m=+0.100786555 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct  8 12:50:39 np0005476733 nova_compute[192580]: 2025-10-08 16:50:39.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:43 np0005476733 nova_compute[192580]: 2025-10-08 16:50:43.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:44 np0005476733 nova_compute[192580]: 2025-10-08 16:50:44.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:46 np0005476733 podman[273874]: 2025-10-08 16:50:46.251359616 +0000 UTC m=+0.067571676 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:50:46 np0005476733 podman[273873]: 2025-10-08 16:50:46.262957565 +0000 UTC m=+0.075459937 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 12:50:48 np0005476733 nova_compute[192580]: 2025-10-08 16:50:48.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:49 np0005476733 nova_compute[192580]: 2025-10-08 16:50:49.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:52 np0005476733 nova_compute[192580]: 2025-10-08 16:50:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:53 np0005476733 nova_compute[192580]: 2025-10-08 16:50:53.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:54 np0005476733 nova_compute[192580]: 2025-10-08 16:50:54.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:54 np0005476733 nova_compute[192580]: 2025-10-08 16:50:54.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:56 np0005476733 podman[273926]: 2025-10-08 16:50:56.24521088 +0000 UTC m=+0.067846785 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 12:50:56 np0005476733 nova_compute[192580]: 2025-10-08 16:50:56.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:50:56 np0005476733 nova_compute[192580]: 2025-10-08 16:50:56.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:50:58 np0005476733 ovn_controller[263831]: 2025-10-08T16:50:58Z|00186|pinctrl|WARN|Dropped 125 log messages in last 62 seconds (most recently, 13 seconds ago) due to excessive rate
Oct  8 12:50:58 np0005476733 ovn_controller[263831]: 2025-10-08T16:50:58Z|00187|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:50:58 np0005476733 podman[273945]: 2025-10-08 16:50:58.330047461 +0000 UTC m=+0.142262057 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:50:58 np0005476733 nova_compute[192580]: 2025-10-08 16:50:58.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:50:59 np0005476733 nova_compute[192580]: 2025-10-08 16:50:59.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:00 np0005476733 podman[273971]: 2025-10-08 16:51:00.256129871 +0000 UTC m=+0.079952290 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:51:02 np0005476733 nova_compute[192580]: 2025-10-08 16:51:02.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:03 np0005476733 nova_compute[192580]: 2025-10-08 16:51:03.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:04 np0005476733 nova_compute[192580]: 2025-10-08 16:51:04.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:06 np0005476733 nova_compute[192580]: 2025-10-08 16:51:06.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:06 np0005476733 nova_compute[192580]: 2025-10-08 16:51:06.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:51:07 np0005476733 nova_compute[192580]: 2025-10-08 16:51:07.216 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:51:07 np0005476733 nova_compute[192580]: 2025-10-08 16:51:07.216 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:51:07 np0005476733 nova_compute[192580]: 2025-10-08 16:51:07.216 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:51:08 np0005476733 nova_compute[192580]: 2025-10-08 16:51:08.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:08 np0005476733 nova_compute[192580]: 2025-10-08 16:51:08.953 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updating instance_info_cache with network_info: [{"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:51:08 np0005476733 nova_compute[192580]: 2025-10-08 16:51:08.972 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:51:08 np0005476733 nova_compute[192580]: 2025-10-08 16:51:08.973 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:51:08 np0005476733 nova_compute[192580]: 2025-10-08 16:51:08.973 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:09 np0005476733 nova_compute[192580]: 2025-10-08 16:51:09.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:10 np0005476733 podman[273990]: 2025-10-08 16:51:10.227480588 +0000 UTC m=+0.060810860 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:51:10 np0005476733 podman[273991]: 2025-10-08 16:51:10.235501454 +0000 UTC m=+0.064055784 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:51:10 np0005476733 podman[273992]: 2025-10-08 16:51:10.239310526 +0000 UTC m=+0.065669135 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.759 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.759 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.759 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.759 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.760 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.761 2 INFO nova.compute.manager [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Terminating instance#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.761 2 DEBUG nova.compute.manager [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:51:11 np0005476733 kernel: tap9f60af9b-fd (unregistering): left promiscuous mode
Oct  8 12:51:11 np0005476733 NetworkManager[51699]: <info>  [1759942271.9071] device (tap9f60af9b-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:51:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:11Z|00188|binding|INFO|Releasing lport 9f60af9b-fd44-4445-8f3e-2548a718dac7 from this chassis (sb_readonly=0)
Oct  8 12:51:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:11Z|00189|binding|INFO|Setting lport 9f60af9b-fd44-4445-8f3e-2548a718dac7 down in Southbound
Oct  8 12:51:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:11Z|00190|binding|INFO|Removing iface tap9f60af9b-fd ovn-installed in OVS
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.929 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:3f:1b 192.168.122.226'], port_security=['fa:16:3e:f8:3f:1b 192.168.122.226'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.226/24', 'neutron:device_id': 'cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=9f60af9b-fd44-4445-8f3e-2548a718dac7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.930 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 9f60af9b-fd44-4445-8f3e-2548a718dac7 in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.931 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:51:11 np0005476733 nova_compute[192580]: 2025-10-08 16:51:11.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.958 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d9896e40-b384-450b-bd66-deb1bbd528d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:11 np0005476733 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Oct  8 12:51:11 np0005476733 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000006c.scope: Consumed 47.526s CPU time.
Oct  8 12:51:11 np0005476733 systemd-machined[152624]: Machine qemu-66-instance-0000006c terminated.
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.991 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5e85311e-ce20-4ff7-92ad-8497fe6c3135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:11 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:11.995 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c802169a-1440-4fa0-b3bc-23327324345b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.018 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d945df13-8f2a-4909-a5e3-f49d8be7d78e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.034 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[caad62b4-a7d4-4ebd-acf4-aa7b117024ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 29, 'tx_packets': 7, 'rx_bytes': 1714, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 29, 'tx_packets': 7, 'rx_bytes': 1714, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 904005, 'reachable_time': 43569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274066, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.054 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[581866cf-6993-4c09-addf-8d8f89a32c6e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 904020, 'tstamp': 904020}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274067, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.122.171'], ['IFA_LOCAL', '192.168.122.171'], ['IFA_BROADCAST', '192.168.122.255'], ['IFA_LABEL', 'tap81c575b5-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 904024, 'tstamp': 904024}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274067, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.056 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.064 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.064 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.065 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.065 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.125 2 DEBUG nova.compute.manager [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-unplugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.125 2 DEBUG oslo_concurrency.lockutils [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.126 2 DEBUG oslo_concurrency.lockutils [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.126 2 DEBUG oslo_concurrency.lockutils [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.126 2 DEBUG nova.compute.manager [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] No waiting events found dispatching network-vif-unplugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.126 2 DEBUG nova.compute.manager [req-13992237-2fbc-496a-b237-0b05eed019f3 req-cf55c710-53a4-48be-a256-878c978a4758 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-unplugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.238 2 INFO nova.virt.libvirt.driver [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Instance destroyed successfully.#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.238 2 DEBUG nova.objects.instance [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'resources' on Instance uuid cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.257 2 DEBUG nova.virt.libvirt.vif [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:49:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-331959068',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-33195906',id=108,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:49:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-lzinfukz',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:49:50Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": 
[{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.258 2 DEBUG nova.network.os_vif_util [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "address": "fa:16:3e:f8:3f:1b", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.226", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f60af9b-fd", "ovs_interfaceid": "9f60af9b-fd44-4445-8f3e-2548a718dac7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.258 2 DEBUG nova.network.os_vif_util [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.259 2 DEBUG os_vif [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f60af9b-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.266 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:51:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:12.267 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.269 2 INFO os_vif [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:1b,bridge_name='br-int',has_traffic_filtering=True,id=9f60af9b-fd44-4445-8f3e-2548a718dac7,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f60af9b-fd')#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.270 2 INFO nova.virt.libvirt.driver [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Deleting instance files /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44_del#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.270 2 INFO nova.virt.libvirt.driver [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Deletion of /var/lib/nova/instances/cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44_del complete#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.349 2 INFO nova.compute.manager [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.350 2 DEBUG oslo.service.loopingcall [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.350 2 DEBUG nova.compute.manager [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:51:12 np0005476733 nova_compute[192580]: 2025-10-08 16:51:12.351 2 DEBUG nova.network.neutron [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.480 2 DEBUG nova.network.neutron [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.502 2 INFO nova.compute.manager [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Took 1.15 seconds to deallocate network for instance.#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.548 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.548 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.563 2 DEBUG nova.compute.manager [req-5e53acbb-26ec-46b0-a08b-2ffbb7b2f05b req-3d43baf4-a5bd-4385-99c1-fb0629a36082 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-deleted-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.628 2 DEBUG nova.compute.provider_tree [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.645 2 DEBUG nova.scheduler.client.report [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.667 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.695 2 INFO nova.scheduler.client.report [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Deleted allocations for instance cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44#033[00m
Oct  8 12:51:13 np0005476733 nova_compute[192580]: 2025-10-08 16:51:13.766 2 DEBUG oslo_concurrency.lockutils [None req-3a141cfd-6571-4792-a54a-6e6ed880b00c bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.256 2 DEBUG nova.compute.manager [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.257 2 DEBUG oslo_concurrency.lockutils [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.257 2 DEBUG oslo_concurrency.lockutils [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.258 2 DEBUG oslo_concurrency.lockutils [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.259 2 DEBUG nova.compute.manager [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] No waiting events found dispatching network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.259 2 WARNING nova.compute.manager [req-aa516709-5b66-4fc3-8284-a5b556eb0008 req-e4ce1d91-e732-465e-b6be-713d0036d149 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Received unexpected event network-vif-plugged-9f60af9b-fd44-4445-8f3e-2548a718dac7 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:14 np0005476733 nova_compute[192580]: 2025-10-08 16:51:14.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.514 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.515 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.515 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.515 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.516 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.517 2 INFO nova.compute.manager [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Terminating instance#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.519 2 DEBUG nova.compute.manager [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:51:15 np0005476733 kernel: tap4d9e95a4-6e (unregistering): left promiscuous mode
Oct  8 12:51:15 np0005476733 NetworkManager[51699]: <info>  [1759942275.5467] device (tap4d9e95a4-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:51:15 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:15Z|00191|binding|INFO|Releasing lport 4d9e95a4-6e11-4c93-b8e6-862a11093b1c from this chassis (sb_readonly=0)
Oct  8 12:51:15 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:15Z|00192|binding|INFO|Setting lport 4d9e95a4-6e11-4c93-b8e6-862a11093b1c down in Southbound
Oct  8 12:51:15 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:15Z|00193|binding|INFO|Removing iface tap4d9e95a4-6e ovn-installed in OVS
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.568 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:29:6c 192.168.122.224'], port_security=['fa:16:3e:57:29:6c 192.168.122.224'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.224/24', 'neutron:device_id': '59ce9674-8997-4f60-b278-fca63264b284', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ddcd45556b9d4077968eee95f005487d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3351433-d580-4e91-809c-2020321c7d00', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=4d9e95a4-6e11-4c93-b8e6-862a11093b1c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.569 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 4d9e95a4-6e11-4c93-b8e6-862a11093b1c in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.570 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.571 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dda292e2-bd57-4cd1-a9ab-639de5d2c0fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.571 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Oct  8 12:51:15 np0005476733 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000006b.scope: Consumed 46.754s CPU time.
Oct  8 12:51:15 np0005476733 systemd-machined[152624]: Machine qemu-65-instance-0000006b terminated.
Oct  8 12:51:15 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [NOTICE]   (273096) : haproxy version is 2.8.14-c23fe91
Oct  8 12:51:15 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [NOTICE]   (273096) : path to executable is /usr/sbin/haproxy
Oct  8 12:51:15 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [WARNING]  (273096) : Exiting Master process...
Oct  8 12:51:15 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [ALERT]    (273096) : Current worker (273098) exited with code 143 (Terminated)
Oct  8 12:51:15 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[273092]: [WARNING]  (273096) : All workers exited. Exiting... (0)
Oct  8 12:51:15 np0005476733 systemd[1]: libpod-d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f.scope: Deactivated successfully.
Oct  8 12:51:15 np0005476733 podman[274110]: 2025-10-08 16:51:15.72046811 +0000 UTC m=+0.058654311 container died d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:51:15 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f-userdata-shm.mount: Deactivated successfully.
Oct  8 12:51:15 np0005476733 systemd[1]: var-lib-containers-storage-overlay-c82a46e2803d8a90c05526d872210107cad72011c40ba11292030ffdfadec30b-merged.mount: Deactivated successfully.
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.807 2 INFO nova.virt.libvirt.driver [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Instance destroyed successfully.#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.808 2 DEBUG nova.objects.instance [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lazy-loading 'resources' on Instance uuid 59ce9674-8997-4f60-b278-fca63264b284 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:51:15 np0005476733 podman[274110]: 2025-10-08 16:51:15.820332615 +0000 UTC m=+0.158518806 container cleanup d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.826 2 DEBUG nova.virt.libvirt.vif [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:48:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',display_name='tempest-test_igmp_snooping_ext_network_and_unsubscribe-1465069962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-igmp-snooping-ext-network-and-unsubscribe-14650699',id=107,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNTrp4SSnfhKhWl5/vr+ysOFBxdgTslwc0H7TgTRWXihMtjd4e3hSjQ8BhgGRqYqjDbdOxfo/dIVr5KrHj2ewhMFyenbuZO+39j7i/Z4jwloPin+qTxJZUEv9/APVs6CqQ==',key_name='tempest-keypair-test-2124727488',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:48:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ddcd45556b9d4077968eee95f005487d',ramdisk_id='',reservation_id='r-fon0b8sx',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-MulticastTestIPv4Common-2020042253',owner_user_name='tempest-MulticastTestIPv4Common-2020042253-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:48:57Z,user_data=None,user_id='bee18afeaf16419c98219491d4757b96',uuid=59ce9674-8997-4f60-b278-fca63264b284,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.827 2 DEBUG nova.network.os_vif_util [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converting VIF {"id": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "address": "fa:16:3e:57:29:6c", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d9e95a4-6e", "ovs_interfaceid": "4d9e95a4-6e11-4c93-b8e6-862a11093b1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.827 2 DEBUG nova.network.os_vif_util [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.828 2 DEBUG os_vif [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:51:15 np0005476733 systemd[1]: libpod-conmon-d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f.scope: Deactivated successfully.
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d9e95a4-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.866 2 INFO os_vif [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:29:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d9e95a4-6e11-4c93-b8e6-862a11093b1c,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d9e95a4-6e')#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.867 2 INFO nova.virt.libvirt.driver [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Deleting instance files /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284_del#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.868 2 INFO nova.virt.libvirt.driver [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Deletion of /var/lib/nova/instances/59ce9674-8997-4f60-b278-fca63264b284_del complete#033[00m
Oct  8 12:51:15 np0005476733 podman[274157]: 2025-10-08 16:51:15.924470165 +0000 UTC m=+0.078544056 container remove d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.931 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7ceffd29-b01e-4be7-bb17-0ad9bcbe9c14]: (4, ('Wed Oct  8 04:51:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f)\nd80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f\nWed Oct  8 04:51:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (d80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f)\nd80a94e2bc2bce8cb857adf5df52babf20c2d0c0909dca449db231a0261ca75f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.932 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5d99c9-5a14-459d-961e-13d2ea5d06ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.933 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:15 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.940 2 INFO nova.compute.manager [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.941 2 DEBUG oslo.service.loopingcall [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.941 2 DEBUG nova.compute.manager [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.941 2 DEBUG nova.network.neutron [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.950 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fd6845-4718-4113-a2ed-be667cc08646]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.972 2 DEBUG nova.compute.manager [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-unplugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.972 2 DEBUG oslo_concurrency.lockutils [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.972 2 DEBUG oslo_concurrency.lockutils [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.973 2 DEBUG oslo_concurrency.lockutils [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.973 2 DEBUG nova.compute.manager [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] No waiting events found dispatching network-vif-unplugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:51:15 np0005476733 nova_compute[192580]: 2025-10-08 16:51:15.973 2 DEBUG nova.compute.manager [req-e94b4589-0485-47b6-8b32-f32ad194316a req-55822d48-5f84-4652-9630-772d2fefcf61 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-unplugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.993 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b904c6-5ecb-40be-9140-6ef5da20ec80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:15.995 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[479cfde8-458f-43fb-905c-148ae131f3cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:16.013 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d29fd1b4-ca78-450b-8229-a5f37449267f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903996, 'reachable_time': 33563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274172, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:16.015 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:51:16 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 12:51:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:16.016 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[42ed6a16-1116-471d-902e-8fa2c7fc69cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.474 2 DEBUG nova.network.neutron [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.494 2 INFO nova.compute.manager [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Took 0.55 seconds to deallocate network for instance.#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.547 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.547 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.603 2 DEBUG nova.compute.provider_tree [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.621 2 DEBUG nova.scheduler.client.report [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.642 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.644 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.644 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.644 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.731 2 INFO nova.scheduler.client.report [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Deleted allocations for instance 59ce9674-8997-4f60-b278-fca63264b284#033[00m
Oct  8 12:51:16 np0005476733 podman[274175]: 2025-10-08 16:51:16.749633248 +0000 UTC m=+0.058132835 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:51:16 np0005476733 podman[274174]: 2025-10-08 16:51:16.757150118 +0000 UTC m=+0.069757445 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.796 2 DEBUG oslo_concurrency.lockutils [None req-de5b0ff5-4a4f-481a-8fd9-c0b758a7499e bee18afeaf16419c98219491d4757b96 ddcd45556b9d4077968eee95f005487d - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.832 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.833 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13628MB free_disk=111.31253051757812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.833 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.834 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.883 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.884 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.913 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.928 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.949 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:51:16 np0005476733 nova_compute[192580]: 2025-10-08 16:51:16.949 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.062 2 DEBUG nova.compute.manager [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.062 2 DEBUG oslo_concurrency.lockutils [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "59ce9674-8997-4f60-b278-fca63264b284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.063 2 DEBUG oslo_concurrency.lockutils [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.063 2 DEBUG oslo_concurrency.lockutils [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "59ce9674-8997-4f60-b278-fca63264b284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.063 2 DEBUG nova.compute.manager [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] No waiting events found dispatching network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.063 2 WARNING nova.compute.manager [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received unexpected event network-vif-plugged-4d9e95a4-6e11-4c93-b8e6-862a11093b1c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:51:18 np0005476733 nova_compute[192580]: 2025-10-08 16:51:18.063 2 DEBUG nova.compute.manager [req-69e5fb9e-bef9-4c0a-9746-710c996cf46e req-d96ef1ba-4885-48df-ad04-56051f9b202a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Received event network-vif-deleted-4d9e95a4-6e11-4c93-b8e6-862a11093b1c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:51:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:19.269 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:51:19 np0005476733 nova_compute[192580]: 2025-10-08 16:51:19.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:20 np0005476733 nova_compute[192580]: 2025-10-08 16:51:20.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:22 np0005476733 nova_compute[192580]: 2025-10-08 16:51:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:22 np0005476733 nova_compute[192580]: 2025-10-08 16:51:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:51:23 np0005476733 nova_compute[192580]: 2025-10-08 16:51:23.609 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:24 np0005476733 nova_compute[192580]: 2025-10-08 16:51:24.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:25 np0005476733 nova_compute[192580]: 2025-10-08 16:51:25.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:26.425 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:26.425 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:51:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:51:26.425 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:51:27 np0005476733 podman[274221]: 2025-10-08 16:51:27.217069905 +0000 UTC m=+0.050441670 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:51:27 np0005476733 nova_compute[192580]: 2025-10-08 16:51:27.237 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759942272.2362497, cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:51:27 np0005476733 nova_compute[192580]: 2025-10-08 16:51:27.237 2 INFO nova.compute.manager [-] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:51:27 np0005476733 nova_compute[192580]: 2025-10-08 16:51:27.267 2 DEBUG nova.compute.manager [None req-767e2a9c-e717-452b-b4d3-6c71b7c33682 - - - - - -] [instance: cc8bd787-bfc5-43f3-8bf1-e48d3d1c3b44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:51:29 np0005476733 podman[274240]: 2025-10-08 16:51:29.275180694 +0000 UTC m=+0.107682415 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 12:51:29 np0005476733 nova_compute[192580]: 2025-10-08 16:51:29.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:30 np0005476733 nova_compute[192580]: 2025-10-08 16:51:30.805 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759942275.8045278, 59ce9674-8997-4f60-b278-fca63264b284 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:51:30 np0005476733 nova_compute[192580]: 2025-10-08 16:51:30.806 2 INFO nova.compute.manager [-] [instance: 59ce9674-8997-4f60-b278-fca63264b284] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:51:30 np0005476733 nova_compute[192580]: 2025-10-08 16:51:30.828 2 DEBUG nova.compute.manager [None req-d6303367-941f-4f64-90da-33d0251b3719 - - - - - -] [instance: 59ce9674-8997-4f60-b278-fca63264b284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:51:30 np0005476733 nova_compute[192580]: 2025-10-08 16:51:30.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:31 np0005476733 podman[274266]: 2025-10-08 16:51:31.241062831 +0000 UTC m=+0.065587952 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 12:51:31 np0005476733 nova_compute[192580]: 2025-10-08 16:51:31.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:31 np0005476733 nova_compute[192580]: 2025-10-08 16:51:31.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:32 np0005476733 nova_compute[192580]: 2025-10-08 16:51:32.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:34 np0005476733 nova_compute[192580]: 2025-10-08 16:51:34.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:35 np0005476733 nova_compute[192580]: 2025-10-08 16:51:35.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:39 np0005476733 nova_compute[192580]: 2025-10-08 16:51:39.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:40 np0005476733 nova_compute[192580]: 2025-10-08 16:51:40.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:41 np0005476733 podman[274289]: 2025-10-08 16:51:41.237074556 +0000 UTC m=+0.058351832 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6)
Oct  8 12:51:41 np0005476733 podman[274288]: 2025-10-08 16:51:41.240257838 +0000 UTC m=+0.063050022 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:51:41 np0005476733 podman[274287]: 2025-10-08 16:51:41.258285522 +0000 UTC m=+0.083716250 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:51:44 np0005476733 nova_compute[192580]: 2025-10-08 16:51:44.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:45 np0005476733 nova_compute[192580]: 2025-10-08 16:51:45.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:46 np0005476733 systemd-logind[827]: New session 166 of user zuul.
Oct  8 12:51:46 np0005476733 systemd[1]: Started Session 166 of User zuul.
Oct  8 12:51:46 np0005476733 systemd[1]: session-166.scope: Deactivated successfully.
Oct  8 12:51:46 np0005476733 systemd-logind[827]: Session 166 logged out. Waiting for processes to exit.
Oct  8 12:51:46 np0005476733 systemd-logind[827]: Removed session 166.
Oct  8 12:51:47 np0005476733 podman[274379]: 2025-10-08 16:51:47.240117213 +0000 UTC m=+0.061702619 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:51:47 np0005476733 podman[274378]: 2025-10-08 16:51:47.248284613 +0000 UTC m=+0.076665506 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:51:49 np0005476733 nova_compute[192580]: 2025-10-08 16:51:49.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:50 np0005476733 nova_compute[192580]: 2025-10-08 16:51:50.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:51Z|00194|pinctrl|WARN|Dropped 373 log messages in last 53 seconds (most recently, 0 seconds ago) due to excessive rate
Oct  8 12:51:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:51:51Z|00195|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:51:54 np0005476733 nova_compute[192580]: 2025-10-08 16:51:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:54 np0005476733 nova_compute[192580]: 2025-10-08 16:51:54.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:55 np0005476733 nova_compute[192580]: 2025-10-08 16:51:55.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:51:56 np0005476733 nova_compute[192580]: 2025-10-08 16:51:56.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:58 np0005476733 podman[274428]: 2025-10-08 16:51:58.245506782 +0000 UTC m=+0.078591006 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:51:58 np0005476733 nova_compute[192580]: 2025-10-08 16:51:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:58 np0005476733 nova_compute[192580]: 2025-10-08 16:51:58.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:51:58 np0005476733 nova_compute[192580]: 2025-10-08 16:51:58.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:51:59 np0005476733 nova_compute[192580]: 2025-10-08 16:51:59.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:00 np0005476733 podman[274451]: 2025-10-08 16:52:00.279385509 +0000 UTC m=+0.107517520 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 12:52:00 np0005476733 nova_compute[192580]: 2025-10-08 16:52:00.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:02 np0005476733 podman[274477]: 2025-10-08 16:52:02.242113307 +0000 UTC m=+0.071759360 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:52:04 np0005476733 nova_compute[192580]: 2025-10-08 16:52:04.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:04 np0005476733 nova_compute[192580]: 2025-10-08 16:52:04.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:05 np0005476733 nova_compute[192580]: 2025-10-08 16:52:05.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:07 np0005476733 nova_compute[192580]: 2025-10-08 16:52:07.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:07 np0005476733 nova_compute[192580]: 2025-10-08 16:52:07.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:52:07 np0005476733 nova_compute[192580]: 2025-10-08 16:52:07.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:52:07 np0005476733 nova_compute[192580]: 2025-10-08 16:52:07.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:52:09 np0005476733 nova_compute[192580]: 2025-10-08 16:52:09.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:09 np0005476733 nova_compute[192580]: 2025-10-08 16:52:09.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:10 np0005476733 nova_compute[192580]: 2025-10-08 16:52:10.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:12 np0005476733 podman[274505]: 2025-10-08 16:52:12.234906458 +0000 UTC m=+0.061053658 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 12:52:12 np0005476733 podman[274504]: 2025-10-08 16:52:12.241123057 +0000 UTC m=+0.067549026 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:52:12 np0005476733 podman[274503]: 2025-10-08 16:52:12.257094777 +0000 UTC m=+0.076324526 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd)
Oct  8 12:52:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:12.347 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:52:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:12.348 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:52:12 np0005476733 nova_compute[192580]: 2025-10-08 16:52:12.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:14 np0005476733 nova_compute[192580]: 2025-10-08 16:52:14.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:15 np0005476733 nova_compute[192580]: 2025-10-08 16:52:15.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:16 np0005476733 nova_compute[192580]: 2025-10-08 16:52:16.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:18 np0005476733 podman[274564]: 2025-10-08 16:52:18.22719039 +0000 UTC m=+0.050405688 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  8 12:52:18 np0005476733 podman[274565]: 2025-10-08 16:52:18.258066265 +0000 UTC m=+0.079131565 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.769 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.770 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13648MB free_disk=111.31253051757812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.770 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.771 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.872 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.872 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.890 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.910 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.911 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.939 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:52:18 np0005476733 nova_compute[192580]: 2025-10-08 16:52:18.975 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:52:19 np0005476733 nova_compute[192580]: 2025-10-08 16:52:19.003 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:52:19 np0005476733 nova_compute[192580]: 2025-10-08 16:52:19.019 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:52:19 np0005476733 nova_compute[192580]: 2025-10-08 16:52:19.021 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:52:19 np0005476733 nova_compute[192580]: 2025-10-08 16:52:19.021 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:52:19 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:19.350 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:52:19 np0005476733 nova_compute[192580]: 2025-10-08 16:52:19.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:20 np0005476733 nova_compute[192580]: 2025-10-08 16:52:20.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:23 np0005476733 nova_compute[192580]: 2025-10-08 16:52:23.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:23 np0005476733 nova_compute[192580]: 2025-10-08 16:52:23.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:23 np0005476733 nova_compute[192580]: 2025-10-08 16:52:23.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:52:23 np0005476733 nova_compute[192580]: 2025-10-08 16:52:23.604 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:52:24 np0005476733 nova_compute[192580]: 2025-10-08 16:52:24.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:25 np0005476733 ovn_controller[263831]: 2025-10-08T16:52:25Z|00196|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 12:52:25 np0005476733 nova_compute[192580]: 2025-10-08 16:52:25.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:26.426 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:52:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:52:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:52:29 np0005476733 podman[274605]: 2025-10-08 16:52:29.223980688 +0000 UTC m=+0.048990543 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 12:52:29 np0005476733 nova_compute[192580]: 2025-10-08 16:52:29.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:30 np0005476733 systemd-logind[827]: New session 167 of user zuul.
Oct  8 12:52:30 np0005476733 systemd[1]: Started Session 167 of User zuul.
Oct  8 12:52:30 np0005476733 systemd[1]: session-167.scope: Deactivated successfully.
Oct  8 12:52:30 np0005476733 systemd-logind[827]: Session 167 logged out. Waiting for processes to exit.
Oct  8 12:52:30 np0005476733 systemd-logind[827]: Removed session 167.
Oct  8 12:52:30 np0005476733 podman[274651]: 2025-10-08 16:52:30.476772458 +0000 UTC m=+0.069085465 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:52:30 np0005476733 nova_compute[192580]: 2025-10-08 16:52:30.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:33 np0005476733 podman[274677]: 2025-10-08 16:52:33.237288375 +0000 UTC m=+0.070922682 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:52:34 np0005476733 nova_compute[192580]: 2025-10-08 16:52:34.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:35 np0005476733 nova_compute[192580]: 2025-10-08 16:52:35.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:52:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:52:37 np0005476733 nova_compute[192580]: 2025-10-08 16:52:37.039 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:40 np0005476733 nova_compute[192580]: 2025-10-08 16:52:40.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:40 np0005476733 nova_compute[192580]: 2025-10-08 16:52:40.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:43 np0005476733 podman[274698]: 2025-10-08 16:52:43.237841473 +0000 UTC m=+0.054419615 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:52:43 np0005476733 podman[274699]: 2025-10-08 16:52:43.254922678 +0000 UTC m=+0.065504929 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Oct  8 12:52:43 np0005476733 podman[274697]: 2025-10-08 16:52:43.255248079 +0000 UTC m=+0.074660592 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:52:45 np0005476733 nova_compute[192580]: 2025-10-08 16:52:45.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:45 np0005476733 nova_compute[192580]: 2025-10-08 16:52:45.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:49 np0005476733 podman[274760]: 2025-10-08 16:52:49.216046438 +0000 UTC m=+0.044486699 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 12:52:49 np0005476733 podman[274759]: 2025-10-08 16:52:49.218879498 +0000 UTC m=+0.049055565 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 12:52:50 np0005476733 nova_compute[192580]: 2025-10-08 16:52:50.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:50 np0005476733 nova_compute[192580]: 2025-10-08 16:52:50.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:52:51Z|00197|pinctrl|WARN|Dropped 363 log messages in last 61 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 12:52:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:52:51Z|00198|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:52:55 np0005476733 nova_compute[192580]: 2025-10-08 16:52:55.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:55 np0005476733 nova_compute[192580]: 2025-10-08 16:52:55.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:55 np0005476733 nova_compute[192580]: 2025-10-08 16:52:55.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:52:56 np0005476733 nova_compute[192580]: 2025-10-08 16:52:56.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:59 np0005476733 nova_compute[192580]: 2025-10-08 16:52:59.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:52:59 np0005476733 nova_compute[192580]: 2025-10-08 16:52:59.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:53:00 np0005476733 nova_compute[192580]: 2025-10-08 16:53:00.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:00 np0005476733 podman[274805]: 2025-10-08 16:53:00.220118996 +0000 UTC m=+0.054311192 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct  8 12:53:00 np0005476733 nova_compute[192580]: 2025-10-08 16:53:00.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:01 np0005476733 podman[274825]: 2025-10-08 16:53:01.239836794 +0000 UTC m=+0.074307680 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 12:53:04 np0005476733 podman[274851]: 2025-10-08 16:53:04.246409277 +0000 UTC m=+0.066421428 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:53:04 np0005476733 nova_compute[192580]: 2025-10-08 16:53:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:05 np0005476733 nova_compute[192580]: 2025-10-08 16:53:05.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:05 np0005476733 nova_compute[192580]: 2025-10-08 16:53:05.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:09.088 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:09 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:09.091 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.603 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:53:09 np0005476733 nova_compute[192580]: 2025-10-08 16:53:09.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:10 np0005476733 nova_compute[192580]: 2025-10-08 16:53:10.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:10.093 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:10 np0005476733 nova_compute[192580]: 2025-10-08 16:53:10.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:14 np0005476733 podman[274871]: 2025-10-08 16:53:14.229441207 +0000 UTC m=+0.056545524 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 12:53:14 np0005476733 podman[274872]: 2025-10-08 16:53:14.237798613 +0000 UTC m=+0.057719461 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:53:14 np0005476733 podman[274877]: 2025-10-08 16:53:14.261766658 +0000 UTC m=+0.069228029 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct  8 12:53:15 np0005476733 nova_compute[192580]: 2025-10-08 16:53:15.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:15 np0005476733 nova_compute[192580]: 2025-10-08 16:53:15.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.630 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.765 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.766 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13645MB free_disk=111.3017463684082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.766 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.766 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.852 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:53:18 np0005476733 nova_compute[192580]: 2025-10-08 16:53:18.852 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:53:19 np0005476733 nova_compute[192580]: 2025-10-08 16:53:19.049 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:53:19 np0005476733 nova_compute[192580]: 2025-10-08 16:53:19.066 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:53:19 np0005476733 nova_compute[192580]: 2025-10-08 16:53:19.068 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:53:19 np0005476733 nova_compute[192580]: 2025-10-08 16:53:19.068 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:20 np0005476733 nova_compute[192580]: 2025-10-08 16:53:20.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:20 np0005476733 podman[274938]: 2025-10-08 16:53:20.217916848 +0000 UTC m=+0.048724905 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:53:20 np0005476733 podman[274937]: 2025-10-08 16:53:20.250957101 +0000 UTC m=+0.083268296 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:53:20 np0005476733 nova_compute[192580]: 2025-10-08 16:53:20.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:24 np0005476733 nova_compute[192580]: 2025-10-08 16:53:24.069 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:25 np0005476733 nova_compute[192580]: 2025-10-08 16:53:25.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:25 np0005476733 nova_compute[192580]: 2025-10-08 16:53:25.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:26.427 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:30 np0005476733 nova_compute[192580]: 2025-10-08 16:53:30.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:30 np0005476733 nova_compute[192580]: 2025-10-08 16:53:30.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:31 np0005476733 podman[274980]: 2025-10-08 16:53:31.21721117 +0000 UTC m=+0.047086324 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 12:53:32 np0005476733 podman[275000]: 2025-10-08 16:53:32.26199255 +0000 UTC m=+0.089385636 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 12:53:35 np0005476733 nova_compute[192580]: 2025-10-08 16:53:35.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:35 np0005476733 podman[275026]: 2025-10-08 16:53:35.221508515 +0000 UTC m=+0.051534578 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 12:53:35 np0005476733 nova_compute[192580]: 2025-10-08 16:53:35.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:35 np0005476733 nova_compute[192580]: 2025-10-08 16:53:35.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:40 np0005476733 nova_compute[192580]: 2025-10-08 16:53:40.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:40 np0005476733 nova_compute[192580]: 2025-10-08 16:53:40.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:45 np0005476733 nova_compute[192580]: 2025-10-08 16:53:45.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:45 np0005476733 podman[275047]: 2025-10-08 16:53:45.241139334 +0000 UTC m=+0.054463741 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:53:45 np0005476733 podman[275046]: 2025-10-08 16:53:45.249650666 +0000 UTC m=+0.064360187 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:53:45 np0005476733 podman[275048]: 2025-10-08 16:53:45.294918042 +0000 UTC m=+0.094184330 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7)
Oct  8 12:53:45 np0005476733 nova_compute[192580]: 2025-10-08 16:53:45.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.066 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.066 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.087 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.194 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.195 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.208 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.208 2 INFO nova.compute.claims [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.380 2 DEBUG nova.compute.provider_tree [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.399 2 DEBUG nova.scheduler.client.report [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.424 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.425 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.494 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.495 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.526 2 INFO nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.556 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.664 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.666 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.667 2 INFO nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Creating image(s)#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.668 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.668 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.669 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.682 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.744 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.745 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.746 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.763 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.833 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.835 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.876 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493,backing_fmt=raw /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
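The two lines above show Nova creating the instance disk as a copy-on-write qcow2 overlay on top of the cached base image. A minimal sketch of how that command line is assembled (paths and size copied from the log; `build_overlay_cmd` is a hypothetical helper for illustration, not part of Nova's code):

```python
# Sketch of the qcow2 overlay command seen in the log: a copy-on-write
# disk whose reads fall through to the shared, read-only base image.
# build_overlay_cmd is a hypothetical helper, not Nova's API.
def build_overlay_cmd(base_path, overlay_path, size_bytes):
    # -f qcow2: overlay format; backing_file/backing_fmt name the base image
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base_path},backing_fmt=raw",
        overlay_path, str(size_bytes),
    ]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493",
    "/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk",
    1073741824,  # 1 GiB, matching the size in the logged command
)
print(" ".join(cmd))
```

Because the overlay records only blocks the guest writes, creation is near-instant (0.041s in the log) regardless of base image size.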
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.878 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "8e8e2abd6632b8926f12ff2e0bba1de20acba493" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.879 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.969 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.971 2 DEBUG nova.virt.disk.api [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Checking if we can resize image /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 12:53:49 np0005476733 nova_compute[192580]: 2025-10-08 16:53:49.972 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.036 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.038 2 DEBUG nova.virt.disk.api [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Cannot resize image /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.039 2 DEBUG nova.objects.instance [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lazy-loading 'migration_context' on Instance uuid 646e343d-1440-4cfb-9002-e70d282e35b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.059 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.060 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Ensure instance console log exists: /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.061 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.062 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.062 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.586 2 DEBUG nova.policy [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce07ebb87c0b46b793d12f35ecf533a5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'afbfae2d6c3c47e5b3d7fb0bd7b2af50', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:53:50 np0005476733 nova_compute[192580]: 2025-10-08 16:53:50.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:51 np0005476733 podman[275124]: 2025-10-08 16:53:51.275179278 +0000 UTC m=+0.084531481 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:53:51 np0005476733 podman[275123]: 2025-10-08 16:53:51.282550843 +0000 UTC m=+0.108842247 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 12:53:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:52Z|00199|pinctrl|WARN|Dropped 303 log messages in last 61 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 12:53:52 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:52Z|00200|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:53:52 np0005476733 nova_compute[192580]: 2025-10-08 16:53:52.718 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Successfully created port: 68edde7c-cef7-4ea0-ac59-a9e1216687e9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.797 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Successfully updated port: 68edde7c-cef7-4ea0-ac59-a9e1216687e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.814 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.814 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquired lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.814 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.921 2 DEBUG nova.compute.manager [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-changed-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.922 2 DEBUG nova.compute.manager [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Refreshing instance network info cache due to event network-changed-68edde7c-cef7-4ea0-ac59-a9e1216687e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:53:54 np0005476733 nova_compute[192580]: 2025-10-08 16:53:54.922 2 DEBUG oslo_concurrency.lockutils [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.215 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.943 2 DEBUG nova.network.neutron [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updating instance_info_cache with network_info: [{"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.966 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Releasing lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.967 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Instance network_info: |[{"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.967 2 DEBUG oslo_concurrency.lockutils [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.968 2 DEBUG nova.network.neutron [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Refreshing network info cache for port 68edde7c-cef7-4ea0-ac59-a9e1216687e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.972 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Start _get_guest_xml network_info=[{"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'ec29a055-bb5f-49c2-94be-8574c5ea97ea'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.978 2 WARNING nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.984 2 DEBUG nova.virt.libvirt.host [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.986 2 DEBUG nova.virt.libvirt.host [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.995 2 DEBUG nova.virt.libvirt.host [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.996 2 DEBUG nova.virt.libvirt.host [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.997 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.997 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='987b2db7-1d21-4b59-831a-1e8ace40589b',id=2,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T15:17:39Z,direct_url=<?>,disk_format='qcow2',id=ec29a055-bb5f-49c2-94be-8574c5ea97ea,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T15:17:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.997 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.998 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.998 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.998 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.999 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:53:55 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.999 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:55.999 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.000 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.000 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.000 2 DEBUG nova.virt.hardware [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.005 2 DEBUG nova.virt.libvirt.vif [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1631303517',display_name='tempest-server-test-1631303517',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1631303517',id=109,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxZgmPYaI2Fw1XuvZstgNUsJwJMWfN6OBu90NIW+WLqlfo7lw4389fYbhLgftOBK/DJYU1BCRVs1g9GSwkyQpzgGL65LPyGM8k2BXpbc2nz7GgUjkKNUY6ZAUn6neqwqA==',key_name='tempest-keypair-test-970761234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='afbfae2d6c3c47e5b3d7fb0bd7b2af50',ramdisk_id='',reservation_id='r-2a0y4w3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='v
irtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDbsMonitoringTest-1742113068',owner_user_name='tempest-OvnDbsMonitoringTest-1742113068-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:53:49Z,user_data=None,user_id='ce07ebb87c0b46b793d12f35ecf533a5',uuid=646e343d-1440-4cfb-9002-e70d282e35b2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.006 2 DEBUG nova.network.os_vif_util [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converting VIF {"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.006 2 DEBUG nova.network.os_vif_util [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.007 2 DEBUG nova.objects.instance [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lazy-loading 'pci_devices' on Instance uuid 646e343d-1440-4cfb-9002-e70d282e35b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.023 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <uuid>646e343d-1440-4cfb-9002-e70d282e35b2</uuid>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <name>instance-0000006d</name>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <memory>131072</memory>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:name>tempest-server-test-1631303517</nova:name>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:53:55</nova:creationTime>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:flavor name="m1.nano">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:memory>128</nova:memory>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:disk>1</nova:disk>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:user uuid="ce07ebb87c0b46b793d12f35ecf533a5">tempest-OvnDbsMonitoringTest-1742113068-project-member</nova:user>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:project uuid="afbfae2d6c3c47e5b3d7fb0bd7b2af50">tempest-OvnDbsMonitoringTest-1742113068</nova:project>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="ec29a055-bb5f-49c2-94be-8574c5ea97ea"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        <nova:port uuid="68edde7c-cef7-4ea0-ac59-a9e1216687e9">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="serial">646e343d-1440-4cfb-9002-e70d282e35b2</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="uuid">646e343d-1440-4cfb-9002-e70d282e35b2</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.config"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:ce:b7:23"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <target dev="tap68edde7c-ce"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/console.log" append="off"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:53:56 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:53:56 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:53:56 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:53:56 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.025 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Preparing to wait for external event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.025 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.025 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.026 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.026 2 DEBUG nova.virt.libvirt.vif [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-test-1631303517',display_name='tempest-server-test-1631303517',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1631303517',id=109,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxZgmPYaI2Fw1XuvZstgNUsJwJMWfN6OBu90NIW+WLqlfo7lw4389fYbhLgftOBK/DJYU1BCRVs1g9GSwkyQpzgGL65LPyGM8k2BXpbc2nz7GgUjkKNUY6ZAUn6neqwqA==',key_name='tempest-keypair-test-970761234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='afbfae2d6c3c47e5b3d7fb0bd7b2af50',ramdisk_id='',reservation_id='r-2a0y4w3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rn
g_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-OvnDbsMonitoringTest-1742113068',owner_user_name='tempest-OvnDbsMonitoringTest-1742113068-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:53:49Z,user_data=None,user_id='ce07ebb87c0b46b793d12f35ecf533a5',uuid=646e343d-1440-4cfb-9002-e70d282e35b2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.027 2 DEBUG nova.network.os_vif_util [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converting VIF {"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.028 2 DEBUG nova.network.os_vif_util [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.028 2 DEBUG os_vif [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.029 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.030 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.032 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68edde7c-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.033 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68edde7c-ce, col_values=(('external_ids', {'iface-id': '68edde7c-cef7-4ea0-ac59-a9e1216687e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:b7:23', 'vm-uuid': '646e343d-1440-4cfb-9002-e70d282e35b2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.044 2 INFO os_vif [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce')#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.128 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.129 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.129 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] No VIF found with MAC fa:16:3e:ce:b7:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:53:56 np0005476733 nova_compute[192580]: 2025-10-08 16:53:56.129 2 INFO nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Using config drive#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.098 2 INFO nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Creating config drive at /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.config#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.104 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbx3ritze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.229 2 DEBUG oslo_concurrency.processutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbx3ritze" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
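Editor's note: the config drive is built by the `/usr/bin/mkisofs` invocation logged above; oslo.concurrency passes the argv list directly to a subprocess (no shell quoting), and the `config-2` volume label is what cloud-init probes for. A sketch assembling that argv from the logged values (helper name hypothetical):

```python
def config_drive_cmd(iso_path: str, staging_dir: str, publisher: str) -> list:
    # Mirrors the flag order of the logged mkisofs invocation; the volume
    # label 'config-2' identifies the image as a config drive to cloud-init.
    return [
        "/usr/bin/mkisofs",
        "-o", iso_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,
        "-quiet", "-J", "-r",
        "-V", "config-2",
        staging_dir,
    ]

cmd = config_drive_cmd(
    "/var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2/disk.config",
    "/tmp/tmpbx3ritze",
    "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9",
)
```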
Oct  8 12:53:57 np0005476733 kernel: tap68edde7c-ce: entered promiscuous mode
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 NetworkManager[51699]: <info>  [1759942437.3527] manager: (tap68edde7c-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Oct  8 12:53:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:57Z|00201|binding|INFO|Claiming lport 68edde7c-cef7-4ea0-ac59-a9e1216687e9 for this chassis.
Oct  8 12:53:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:57Z|00202|binding|INFO|68edde7c-cef7-4ea0-ac59-a9e1216687e9: Claiming fa:16:3e:ce:b7:23 10.100.0.5
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.365 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:b7:23 10.100.0.5'], port_security=['fa:16:3e:ce:b7:23 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '646e343d-1440-4cfb-9002-e70d282e35b2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'afbfae2d6c3c47e5b3d7fb0bd7b2af50', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c287c00c-b558-486a-9a7c-b556845fbdb7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=321fcb12-d7c4-459b-9a34-b02dd95d9940, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=68edde7c-cef7-4ea0-ac59-a9e1216687e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.366 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 68edde7c-cef7-4ea0-ac59-a9e1216687e9 in datapath 8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 bound to our chassis#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.368 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cc9bd86-85c9-4a22-9c32-ca632fa7aae3#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.384 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ec13f5e9-d903-468b-afca-4438dfe8ba19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.385 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8cc9bd86-81 in ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
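Editor's note: in this log the metadata agent provisions a namespace named `ovnmeta-<network UUID>` and a veth pair `tap8cc9bd86-80` / `tap8cc9bd86-81`, i.e. `tap` plus the first hex group of the network UUID with `-80` (root namespace side, plugged into br-int below) and `-81` (namespace side). A sketch of that naming, inferred from these entries rather than from Neutron's source (helper name hypothetical):

```python
def metadata_names(network_uuid: str) -> dict:
    """Naming convention as observed in this log capture.

    The '-80' end stays in the root namespace and is added to br-int;
    the '-81' end is moved into the ovnmeta-<net> namespace, where the
    per-network haproxy listens on 169.254.169.254.
    """
    prefix = network_uuid.split("-")[0]   # first hex group, e.g. '8cc9bd86'
    return {
        "namespace": f"ovnmeta-{network_uuid}",
        "veth_outer": f"tap{prefix}-80",
        "veth_inner": f"tap{prefix}-81",
    }

names = metadata_names("8cc9bd86-85c9-4a22-9c32-ca632fa7aae3")
```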
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.388 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8cc9bd86-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.388 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5d7fde19-1081-4e5d-9414-d4b710f3ae41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 systemd-udevd[275186]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.389 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6983f66f-c9c4-409f-962f-5bf81c789a3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:57Z|00203|binding|INFO|Setting lport 68edde7c-cef7-4ea0-ac59-a9e1216687e9 ovn-installed in OVS
Oct  8 12:53:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:57Z|00204|binding|INFO|Setting lport 68edde7c-cef7-4ea0-ac59-a9e1216687e9 up in Southbound
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 NetworkManager[51699]: <info>  [1759942437.4047] device (tap68edde7c-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:53:57 np0005476733 NetworkManager[51699]: <info>  [1759942437.4054] device (tap68edde7c-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:53:57 np0005476733 systemd-machined[152624]: New machine qemu-67-instance-0000006d.
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.404 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4c1e25-1942-4b91-95f7-8bd1eae4d57c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 systemd[1]: Started Virtual Machine qemu-67-instance-0000006d.
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.432 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f67bbdf-d0f5-4238-8611-b0a33de9a5a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.467 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c39e17f1-f4c4-4d38-99fb-4e7d5c8a8c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 NetworkManager[51699]: <info>  [1759942437.4735] manager: (tap8cc9bd86-80): new Veth device (/org/freedesktop/NetworkManager/Devices/325)
Oct  8 12:53:57 np0005476733 systemd-udevd[275190]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.473 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[77d809ef-b5f9-414f-9c41-26c2e5c8c1de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.508 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0c26826f-16bd-433c-95bd-33cc8c4f868c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.511 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[526276c6-cc6f-4f27-b480-900800b7f51b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 NetworkManager[51699]: <info>  [1759942437.5339] device (tap8cc9bd86-80): carrier: link connected
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.541 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[7591a262-f423-4d3b-90b4-9cf26aa4bd50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.564 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb5949e-fe27-4758-ae2a-2820e89ca92d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cc9bd86-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 934119, 'reachable_time': 30105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275219, 'error': None, 'target': 'ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.582 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa091a2-56db-4a00-aa18-ff1b52a7154f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:3762'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 934119, 'tstamp': 934119}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275220, 'error': None, 'target': 'ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.605 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac3b708-c92e-4938-87e8-f3bf2a5b1693]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cc9bd86-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:37:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 934119, 'reachable_time': 30105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275223, 'error': None, 'target': 'ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.642 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[60965310-5385-4a8f-8281-58681a8d5b32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.711 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0bef2b19-10e8-412c-99b7-c108e051ca93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.712 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cc9bd86-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.713 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.713 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cc9bd86-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 kernel: tap8cc9bd86-80: entered promiscuous mode
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.725 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cc9bd86-80, col_values=(('external_ids', {'iface-id': '32f3f1d7-aa37-4668-ab7f-1525e039ec9d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:53:57 np0005476733 ovn_controller[263831]: 2025-10-08T16:53:57Z|00205|binding|INFO|Releasing lport 32f3f1d7-aa37-4668-ab7f-1525e039ec9d from this chassis (sb_readonly=0)
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.727 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8cc9bd86-85c9-4a22-9c32-ca632fa7aae3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8cc9bd86-85c9-4a22-9c32-ca632fa7aae3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.728 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6d16f3-5f8f-4c6c-9383-c097cf3ea82b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.729 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/8cc9bd86-85c9-4a22-9c32-ca632fa7aae3.pid.haproxy
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 8cc9bd86-85c9-4a22-9c32-ca632fa7aae3
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.730 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'env', 'PROCESS_TAG=haproxy-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8cc9bd86-85c9-4a22-9c32-ca632fa7aae3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.754 2 DEBUG nova.compute.manager [req-f64aa297-e36a-4c0c-b2fa-4de5b3fbb4bc req-27bb2fa8-5694-4d8f-87b9-bbad08c6e098 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.755 2 DEBUG oslo_concurrency.lockutils [req-f64aa297-e36a-4c0c-b2fa-4de5b3fbb4bc req-27bb2fa8-5694-4d8f-87b9-bbad08c6e098 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.755 2 DEBUG oslo_concurrency.lockutils [req-f64aa297-e36a-4c0c-b2fa-4de5b3fbb4bc req-27bb2fa8-5694-4d8f-87b9-bbad08c6e098 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.756 2 DEBUG oslo_concurrency.lockutils [req-f64aa297-e36a-4c0c-b2fa-4de5b3fbb4bc req-27bb2fa8-5694-4d8f-87b9-bbad08c6e098 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.756 2 DEBUG nova.compute.manager [req-f64aa297-e36a-4c0c-b2fa-4de5b3fbb4bc req-27bb2fa8-5694-4d8f-87b9-bbad08c6e098 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Processing event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:53:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:57.870 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:53:57 np0005476733 nova_compute[192580]: 2025-10-08 16:53:57.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.044 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.046 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942438.044304, 646e343d-1440-4cfb-9002-e70d282e35b2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.046 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] VM Started (Lifecycle Event)#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.049 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.052 2 INFO nova.virt.libvirt.driver [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Instance spawned successfully.#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.053 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.072 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.076 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.081 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.081 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.082 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.082 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.083 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.083 2 DEBUG nova.virt.libvirt.driver [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.110 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.111 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942438.04445, 646e343d-1440-4cfb-9002-e70d282e35b2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.112 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.145 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.150 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942438.0483704, 646e343d-1440-4cfb-9002-e70d282e35b2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.151 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.153 2 INFO nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Took 8.49 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.154 2 DEBUG nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:53:58 np0005476733 podman[275260]: 2025-10-08 16:53:58.075530485 +0000 UTC m=+0.023774300 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:53:58 np0005476733 podman[275260]: 2025-10-08 16:53:58.17023953 +0000 UTC m=+0.118483335 container create a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.182 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.188 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.213 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:53:58 np0005476733 systemd[1]: Started libpod-conmon-a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f.scope.
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.223 2 INFO nova.compute.manager [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Took 9.07 seconds to build instance.#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.240 2 DEBUG oslo_concurrency.lockutils [None req-1d366a1c-7629-4177-9833-050464f879d6 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:58 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:53:58 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d91015fd1595bb636a3c2d26a9c4dbc85292e965df6d424856d783515bff75e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.270 2 DEBUG nova.network.neutron [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updated VIF entry in instance network info cache for port 68edde7c-cef7-4ea0-ac59-a9e1216687e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.270 2 DEBUG nova.network.neutron [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updating instance_info_cache with network_info: [{"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:53:58 np0005476733 podman[275260]: 2025-10-08 16:53:58.279758168 +0000 UTC m=+0.228001973 container init a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:53:58 np0005476733 podman[275260]: 2025-10-08 16:53:58.286171063 +0000 UTC m=+0.234414868 container start a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.303 2 DEBUG oslo_concurrency.lockutils [req-7fd34c40-122d-4100-a2bd-cfd576d605e4 req-7ac5b1ff-4083-4ffb-beba-75a98c2d0286 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:53:58 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [NOTICE]   (275278) : New worker (275280) forked
Oct  8 12:53:58 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [NOTICE]   (275278) : Loading success.
Oct  8 12:53:58 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:53:58.347 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:53:58 np0005476733 nova_compute[192580]: 2025-10-08 16:53:58.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.836 2 DEBUG nova.compute.manager [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.838 2 DEBUG oslo_concurrency.lockutils [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.838 2 DEBUG oslo_concurrency.lockutils [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.839 2 DEBUG oslo_concurrency.lockutils [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.839 2 DEBUG nova.compute.manager [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] No waiting events found dispatching network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:53:59 np0005476733 nova_compute[192580]: 2025-10-08 16:53:59.840 2 WARNING nova.compute.manager [req-42962102-bce6-43be-954b-7671e8011803 req-f24e2087-8fb1-4535-8c78-4288dac33e0a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received unexpected event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 for instance with vm_state active and task_state None.#033[00m
Oct  8 12:54:00 np0005476733 nova_compute[192580]: 2025-10-08 16:54:00.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:00 np0005476733 nova_compute[192580]: 2025-10-08 16:54:00.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:00 np0005476733 nova_compute[192580]: 2025-10-08 16:54:00.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:54:00 np0005476733 nova_compute[192580]: 2025-10-08 16:54:00.907 2 INFO nova.compute.manager [None req-38ae32e1-3a99-4c49-8690-3484bb54d034 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Get console output#033[00m
Oct  8 12:54:00 np0005476733 nova_compute[192580]: 2025-10-08 16:54:00.913 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:54:01 np0005476733 nova_compute[192580]: 2025-10-08 16:54:01.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:02 np0005476733 podman[275289]: 2025-10-08 16:54:02.229151519 +0000 UTC m=+0.055245265 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 12:54:03 np0005476733 podman[275310]: 2025-10-08 16:54:03.264800377 +0000 UTC m=+0.094444008 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:54:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:03.349 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:54:05 np0005476733 nova_compute[192580]: 2025-10-08 16:54:05.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:06 np0005476733 nova_compute[192580]: 2025-10-08 16:54:06.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:06 np0005476733 nova_compute[192580]: 2025-10-08 16:54:06.101 2 INFO nova.compute.manager [None req-33e5f789-31d7-498c-954d-0800bc409e96 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Get console output#033[00m
Oct  8 12:54:06 np0005476733 nova_compute[192580]: 2025-10-08 16:54:06.109 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:54:06 np0005476733 podman[275334]: 2025-10-08 16:54:06.251581282 +0000 UTC m=+0.076101702 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:54:06 np0005476733 nova_compute[192580]: 2025-10-08 16:54:06.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:10 np0005476733 nova_compute[192580]: 2025-10-08 16:54:10.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:10 np0005476733 nova_compute[192580]: 2025-10-08 16:54:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:10 np0005476733 nova_compute[192580]: 2025-10-08 16:54:10.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:54:10 np0005476733 nova_compute[192580]: 2025-10-08 16:54:10.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.290 2 INFO nova.compute.manager [None req-a47c4668-dc79-4f77-a0c9-0c51ab39ee19 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Get console output#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.295 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.537 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.538 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.538 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:54:11 np0005476733 nova_compute[192580]: 2025-10-08 16:54:11.538 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 646e343d-1440-4cfb-9002-e70d282e35b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:54:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:11Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:b7:23 10.100.0.5
Oct  8 12:54:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:11Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:b7:23 10.100.0.5
Oct  8 12:54:13 np0005476733 nova_compute[192580]: 2025-10-08 16:54:13.681 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updating instance_info_cache with network_info: [{"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:54:13 np0005476733 nova_compute[192580]: 2025-10-08 16:54:13.712 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-646e343d-1440-4cfb-9002-e70d282e35b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:54:13 np0005476733 nova_compute[192580]: 2025-10-08 16:54:13.712 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:54:13 np0005476733 nova_compute[192580]: 2025-10-08 16:54:13.713 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:15 np0005476733 nova_compute[192580]: 2025-10-08 16:54:15.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:16 np0005476733 podman[275375]: 2025-10-08 16:54:16.22358316 +0000 UTC m=+0.056169095 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:54:16 np0005476733 podman[275376]: 2025-10-08 16:54:16.224426537 +0000 UTC m=+0.052942722 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:54:16 np0005476733 podman[275377]: 2025-10-08 16:54:16.229946743 +0000 UTC m=+0.057033463 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.463 2 INFO nova.compute.manager [None req-b327fca5-ef5a-4c9f-8457-ad8dffc9ecd5 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Get console output#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.469 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.888 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.889 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.889 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.889 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.890 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.891 2 INFO nova.compute.manager [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Terminating instance#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.891 2 DEBUG nova.compute.manager [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 12:54:16 np0005476733 kernel: tap68edde7c-ce (unregistering): left promiscuous mode
Oct  8 12:54:16 np0005476733 NetworkManager[51699]: <info>  [1759942456.9202] device (tap68edde7c-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:16 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:16Z|00206|binding|INFO|Releasing lport 68edde7c-cef7-4ea0-ac59-a9e1216687e9 from this chassis (sb_readonly=0)
Oct  8 12:54:16 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:16Z|00207|binding|INFO|Setting lport 68edde7c-cef7-4ea0-ac59-a9e1216687e9 down in Southbound
Oct  8 12:54:16 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:16Z|00208|binding|INFO|Removing iface tap68edde7c-ce ovn-installed in OVS
Oct  8 12:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:16.935 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:b7:23 10.100.0.5'], port_security=['fa:16:3e:ce:b7:23 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '646e343d-1440-4cfb-9002-e70d282e35b2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'afbfae2d6c3c47e5b3d7fb0bd7b2af50', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c287c00c-b558-486a-9a7c-b556845fbdb7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=321fcb12-d7c4-459b-9a34-b02dd95d9940, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=68edde7c-cef7-4ea0-ac59-a9e1216687e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:16.937 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 68edde7c-cef7-4ea0-ac59-a9e1216687e9 in datapath 8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 unbound from our chassis#033[00m
Oct  8 12:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:16.938 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 12:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:16.939 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[72e4ee23-5e89-41dc-8f33-e80fb0ef2056]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:16.939 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 namespace which is not needed anymore#033[00m
Oct  8 12:54:16 np0005476733 nova_compute[192580]: 2025-10-08 16:54:16.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:16 np0005476733 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Oct  8 12:54:16 np0005476733 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000006d.scope: Consumed 13.543s CPU time.
Oct  8 12:54:16 np0005476733 systemd-machined[152624]: Machine qemu-67-instance-0000006d terminated.
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.165 2 INFO nova.virt.libvirt.driver [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Instance destroyed successfully.#033[00m
Oct  8 12:54:17 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [NOTICE]   (275278) : haproxy version is 2.8.14-c23fe91
Oct  8 12:54:17 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [NOTICE]   (275278) : path to executable is /usr/sbin/haproxy
Oct  8 12:54:17 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [WARNING]  (275278) : Exiting Master process...
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.166 2 DEBUG nova.objects.instance [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lazy-loading 'resources' on Instance uuid 646e343d-1440-4cfb-9002-e70d282e35b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:54:17 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [ALERT]    (275278) : Current worker (275280) exited with code 143 (Terminated)
Oct  8 12:54:17 np0005476733 neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3[275274]: [WARNING]  (275278) : All workers exited. Exiting... (0)
Oct  8 12:54:17 np0005476733 systemd[1]: libpod-a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f.scope: Deactivated successfully.
Oct  8 12:54:17 np0005476733 podman[275464]: 2025-10-08 16:54:17.177872249 +0000 UTC m=+0.158287966 container died a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.187 2 DEBUG nova.virt.libvirt.vif [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-test-1631303517',display_name='tempest-server-test-1631303517',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-test-1631303517',id=109,image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxZgmPYaI2Fw1XuvZstgNUsJwJMWfN6OBu90NIW+WLqlfo7lw4389fYbhLgftOBK/DJYU1BCRVs1g9GSwkyQpzgGL65LPyGM8k2BXpbc2nz7GgUjkKNUY6ZAUn6neqwqA==',key_name='tempest-keypair-test-970761234',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:53:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='afbfae2d6c3c47e5b3d7fb0bd7b2af50',ramdisk_id='',reservation_id='r-2a0y4w3x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec29a055-bb5f-49c2-94be-8574c5ea97ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-OvnDbsMonitoringTest-1742113068',owner_user_name='tempest-OvnDbsMonitoringTest-1742113068-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:53:58Z,user_data=None,user_id='ce07ebb87c0b46b793d12f35ecf533a5',uuid=646e343d-1440-4cfb-9002-e70d282e35b2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.187 2 DEBUG nova.network.os_vif_util [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converting VIF {"id": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "address": "fa:16:3e:ce:b7:23", "network": {"id": "8cc9bd86-85c9-4a22-9c32-ca632fa7aae3", "bridge": "br-int", "label": "tempest-test-network--809217892", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "afbfae2d6c3c47e5b3d7fb0bd7b2af50", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68edde7c-ce", "ovs_interfaceid": "68edde7c-cef7-4ea0-ac59-a9e1216687e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.188 2 DEBUG nova.network.os_vif_util [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.189 2 DEBUG os_vif [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.191 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68edde7c-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.195 2 INFO os_vif [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:b7:23,bridge_name='br-int',has_traffic_filtering=True,id=68edde7c-cef7-4ea0-ac59-a9e1216687e9,network=Network(8cc9bd86-85c9-4a22-9c32-ca632fa7aae3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68edde7c-ce')#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.196 2 INFO nova.virt.libvirt.driver [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Deleting instance files /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2_del#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.197 2 INFO nova.virt.libvirt.driver [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Deletion of /var/lib/nova/instances/646e343d-1440-4cfb-9002-e70d282e35b2_del complete#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.202 2 DEBUG nova.compute.manager [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-unplugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.203 2 DEBUG oslo_concurrency.lockutils [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.203 2 DEBUG oslo_concurrency.lockutils [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.203 2 DEBUG oslo_concurrency.lockutils [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.204 2 DEBUG nova.compute.manager [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] No waiting events found dispatching network-vif-unplugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.204 2 DEBUG nova.compute.manager [req-025b8b06-b0c0-4996-b1d4-2f7e49d1004a req-2c9360be-2bbf-465e-9eb8-a23b15df71ab 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-unplugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.246 2 INFO nova.compute.manager [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.247 2 DEBUG oslo.service.loopingcall [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.247 2 DEBUG nova.compute.manager [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.247 2 DEBUG nova.network.neutron [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 12:54:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f-userdata-shm.mount: Deactivated successfully.
Oct  8 12:54:17 np0005476733 systemd[1]: var-lib-containers-storage-overlay-8d91015fd1595bb636a3c2d26a9c4dbc85292e965df6d424856d783515bff75e-merged.mount: Deactivated successfully.
Oct  8 12:54:17 np0005476733 podman[275464]: 2025-10-08 16:54:17.455992762 +0000 UTC m=+0.436408479 container cleanup a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:54:17 np0005476733 systemd[1]: libpod-conmon-a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f.scope: Deactivated successfully.
Oct  8 12:54:17 np0005476733 podman[275512]: 2025-10-08 16:54:17.630339051 +0000 UTC m=+0.152763521 container remove a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.636 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[301fccbd-a329-43e7-ae2f-da9168586efa]: (4, ('Wed Oct  8 04:54:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 (a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f)\na7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f\nWed Oct  8 04:54:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 (a7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f)\na7d6b290de44dc3bfb6a3de042a62897991bc628b6fdcf39a409b833f165f03f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.637 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[02c6e159-4aeb-44c7-aac3-8c24945333ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.638 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cc9bd86-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 kernel: tap8cc9bd86-80: left promiscuous mode
Oct  8 12:54:17 np0005476733 nova_compute[192580]: 2025-10-08 16:54:17.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.655 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e78a0dc2-8240-4381-b70a-50d52f7630d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.687 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a4efcc8b-0668-497f-8752-4c587a2eb7bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.688 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[905cdbf3-3598-4de1-a238-c98ffc59dde3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.705 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9d0f10-3f19-44b5-b1b3-03658033da69]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 934112, 'reachable_time': 24079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275527, 'error': None, 'target': 'ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.708 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8cc9bd86-85c9-4a22-9c32-ca632fa7aae3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 12:54:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:17.708 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[539d7624-2894-4c64-b6b6-861e893485e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:54:17 np0005476733 systemd[1]: run-netns-ovnmeta\x2d8cc9bd86\x2d85c9\x2d4a22\x2d9c32\x2dca632fa7aae3.mount: Deactivated successfully.
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.574 2 DEBUG nova.network.neutron [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.595 2 INFO nova.compute.manager [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Took 1.35 seconds to deallocate network for instance.#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.626 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.627 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.627 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.651 2 DEBUG nova.compute.manager [req-7e78642a-a7a0-4126-95f6-e58bcc7f102a req-c0fe0336-61db-4ef2-b81a-5b0eeb1b9299 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-deleted-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.658 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.658 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.716 2 DEBUG nova.compute.provider_tree [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.732 2 DEBUG nova.scheduler.client.report [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.753 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.779 2 INFO nova.scheduler.client.report [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Deleted allocations for instance 646e343d-1440-4cfb-9002-e70d282e35b2#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.799 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.799 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13619MB free_disk=111.30184936523438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.800 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.800 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.854 2 DEBUG oslo_concurrency.lockutils [None req-4120493a-01a4-4138-a1dc-f46b0599ed88 ce07ebb87c0b46b793d12f35ecf533a5 afbfae2d6c3c47e5b3d7fb0bd7b2af50 - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.873 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.873 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.897 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.913 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.936 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:54:18 np0005476733 nova_compute[192580]: 2025-10-08 16:54:18.937 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.536 2 DEBUG nova.compute.manager [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.536 2 DEBUG oslo_concurrency.lockutils [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.536 2 DEBUG oslo_concurrency.lockutils [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.537 2 DEBUG oslo_concurrency.lockutils [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "646e343d-1440-4cfb-9002-e70d282e35b2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.537 2 DEBUG nova.compute.manager [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] No waiting events found dispatching network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.537 2 WARNING nova.compute.manager [req-21a08716-ccce-4db7-95e9-22fa1c992095 req-59eb284c-7b71-44d2-ab5c-25eaf0438501 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Received unexpected event network-vif-plugged-68edde7c-cef7-4ea0-ac59-a9e1216687e9 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 12:54:19 np0005476733 nova_compute[192580]: 2025-10-08 16:54:19.937 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:20 np0005476733 nova_compute[192580]: 2025-10-08 16:54:20.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:21 np0005476733 nova_compute[192580]: 2025-10-08 16:54:21.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:22 np0005476733 podman[275530]: 2025-10-08 16:54:22.004613471 +0000 UTC m=+0.054710628 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:54:22 np0005476733 podman[275529]: 2025-10-08 16:54:22.00489244 +0000 UTC m=+0.056698602 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:54:22 np0005476733 nova_compute[192580]: 2025-10-08 16:54:22.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:25 np0005476733 nova_compute[192580]: 2025-10-08 16:54:25.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:25 np0005476733 nova_compute[192580]: 2025-10-08 16:54:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:26.428 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:26.428 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:54:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:54:26.428 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:54:27 np0005476733 nova_compute[192580]: 2025-10-08 16:54:27.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:30 np0005476733 nova_compute[192580]: 2025-10-08 16:54:30.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:32 np0005476733 nova_compute[192580]: 2025-10-08 16:54:32.165 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759942457.1636975, 646e343d-1440-4cfb-9002-e70d282e35b2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:54:32 np0005476733 nova_compute[192580]: 2025-10-08 16:54:32.166 2 INFO nova.compute.manager [-] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] VM Stopped (Lifecycle Event)#033[00m
Oct  8 12:54:32 np0005476733 nova_compute[192580]: 2025-10-08 16:54:32.188 2 DEBUG nova.compute.manager [None req-88ec7fb0-fcf5-4bbf-ac81-baca32808784 - - - - - -] [instance: 646e343d-1440-4cfb-9002-e70d282e35b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:54:32 np0005476733 nova_compute[192580]: 2025-10-08 16:54:32.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:33 np0005476733 podman[275570]: 2025-10-08 16:54:33.219861488 +0000 UTC m=+0.047515938 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 12:54:34 np0005476733 podman[275592]: 2025-10-08 16:54:34.240897659 +0000 UTC m=+0.075587955 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:54:35 np0005476733 nova_compute[192580]: 2025-10-08 16:54:35.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:54:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:54:37 np0005476733 nova_compute[192580]: 2025-10-08 16:54:37.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:37 np0005476733 podman[275618]: 2025-10-08 16:54:37.228869753 +0000 UTC m=+0.060881486 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Oct  8 12:54:40 np0005476733 systemd-logind[827]: New session 168 of user zuul.
Oct  8 12:54:40 np0005476733 systemd[1]: Started Session 168 of User zuul.
Oct  8 12:54:40 np0005476733 nova_compute[192580]: 2025-10-08 16:54:40.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:40 np0005476733 systemd[1]: session-168.scope: Deactivated successfully.
Oct  8 12:54:40 np0005476733 systemd-logind[827]: Session 168 logged out. Waiting for processes to exit.
Oct  8 12:54:40 np0005476733 systemd-logind[827]: Removed session 168.
Oct  8 12:54:42 np0005476733 nova_compute[192580]: 2025-10-08 16:54:42.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:45 np0005476733 nova_compute[192580]: 2025-10-08 16:54:45.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:47 np0005476733 nova_compute[192580]: 2025-10-08 16:54:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:47 np0005476733 podman[275666]: 2025-10-08 16:54:47.231785038 +0000 UTC m=+0.057982022 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:54:47 np0005476733 podman[275667]: 2025-10-08 16:54:47.238032838 +0000 UTC m=+0.062074024 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 12:54:47 np0005476733 podman[275665]: 2025-10-08 16:54:47.25283201 +0000 UTC m=+0.070527393 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:54:50 np0005476733 nova_compute[192580]: 2025-10-08 16:54:50.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:51Z|00209|pinctrl|WARN|Dropped 299 log messages in last 59 seconds (most recently, 18 seconds ago) due to excessive rate
Oct  8 12:54:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:51Z|00210|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:54:51 np0005476733 ovn_controller[263831]: 2025-10-08T16:54:51Z|00211|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Oct  8 12:54:52 np0005476733 nova_compute[192580]: 2025-10-08 16:54:52.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:52 np0005476733 podman[275729]: 2025-10-08 16:54:52.240809703 +0000 UTC m=+0.059105389 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:54:52 np0005476733 podman[275728]: 2025-10-08 16:54:52.241035941 +0000 UTC m=+0.067784537 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:54:55 np0005476733 nova_compute[192580]: 2025-10-08 16:54:55.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:57 np0005476733 nova_compute[192580]: 2025-10-08 16:54:57.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:54:57 np0005476733 nova_compute[192580]: 2025-10-08 16:54:57.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:54:59 np0005476733 nova_compute[192580]: 2025-10-08 16:54:59.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:00 np0005476733 nova_compute[192580]: 2025-10-08 16:55:00.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:01 np0005476733 nova_compute[192580]: 2025-10-08 16:55:01.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:01 np0005476733 nova_compute[192580]: 2025-10-08 16:55:01.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:55:02 np0005476733 nova_compute[192580]: 2025-10-08 16:55:02.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:04 np0005476733 podman[275772]: 2025-10-08 16:55:04.222931473 +0000 UTC m=+0.051890950 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  8 12:55:05 np0005476733 nova_compute[192580]: 2025-10-08 16:55:05.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:05 np0005476733 podman[275791]: 2025-10-08 16:55:05.248014633 +0000 UTC m=+0.079766660 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  8 12:55:07 np0005476733 nova_compute[192580]: 2025-10-08 16:55:07.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:07 np0005476733 nova_compute[192580]: 2025-10-08 16:55:07.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:08 np0005476733 podman[275818]: 2025-10-08 16:55:08.249483388 +0000 UTC m=+0.073654414 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 12:55:10 np0005476733 nova_compute[192580]: 2025-10-08 16:55:10.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:10 np0005476733 nova_compute[192580]: 2025-10-08 16:55:10.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:11 np0005476733 nova_compute[192580]: 2025-10-08 16:55:11.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:11 np0005476733 nova_compute[192580]: 2025-10-08 16:55:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:55:11 np0005476733 nova_compute[192580]: 2025-10-08 16:55:11.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:55:11 np0005476733 nova_compute[192580]: 2025-10-08 16:55:11.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:55:12 np0005476733 nova_compute[192580]: 2025-10-08 16:55:12.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:15 np0005476733 nova_compute[192580]: 2025-10-08 16:55:15.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:17 np0005476733 nova_compute[192580]: 2025-10-08 16:55:17.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:18 np0005476733 podman[275841]: 2025-10-08 16:55:18.241453044 +0000 UTC m=+0.065344768 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:55:18 np0005476733 podman[275842]: 2025-10-08 16:55:18.267975281 +0000 UTC m=+0.080037567 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350)
Oct  8 12:55:18 np0005476733 podman[275840]: 2025-10-08 16:55:18.281963868 +0000 UTC m=+0.109765907 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.630 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.797 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.800 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13641MB free_disk=111.30184936523438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.800 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.801 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.910 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.930 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.932 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:55:19 np0005476733 nova_compute[192580]: 2025-10-08 16:55:19.932 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:55:20 np0005476733 nova_compute[192580]: 2025-10-08 16:55:20.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:21 np0005476733 nova_compute[192580]: 2025-10-08 16:55:21.931 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:22 np0005476733 nova_compute[192580]: 2025-10-08 16:55:22.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:23 np0005476733 podman[275904]: 2025-10-08 16:55:23.240899663 +0000 UTC m=+0.063214461 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:55:23 np0005476733 podman[275905]: 2025-10-08 16:55:23.271577162 +0000 UTC m=+0.078996664 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:55:25 np0005476733 nova_compute[192580]: 2025-10-08 16:55:25.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:26.429 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:26.430 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:55:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:26.430 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:55:26 np0005476733 nova_compute[192580]: 2025-10-08 16:55:26.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:27 np0005476733 nova_compute[192580]: 2025-10-08 16:55:27.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:30 np0005476733 nova_compute[192580]: 2025-10-08 16:55:30.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:32 np0005476733 nova_compute[192580]: 2025-10-08 16:55:32.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:33 np0005476733 nova_compute[192580]: 2025-10-08 16:55:33.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:33.012 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:55:33 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:33.014 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:55:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:55:35.016 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:55:35 np0005476733 nova_compute[192580]: 2025-10-08 16:55:35.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:35 np0005476733 podman[275944]: 2025-10-08 16:55:35.244980573 +0000 UTC m=+0.077067074 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  8 12:55:36 np0005476733 podman[275963]: 2025-10-08 16:55:36.247778832 +0000 UTC m=+0.081138473 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:55:36 np0005476733 nova_compute[192580]: 2025-10-08 16:55:36.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:55:37 np0005476733 nova_compute[192580]: 2025-10-08 16:55:37.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:39 np0005476733 podman[275988]: 2025-10-08 16:55:39.214821827 +0000 UTC m=+0.048946165 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:55:40 np0005476733 nova_compute[192580]: 2025-10-08 16:55:40.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:42 np0005476733 nova_compute[192580]: 2025-10-08 16:55:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:45 np0005476733 nova_compute[192580]: 2025-10-08 16:55:45.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:47 np0005476733 nova_compute[192580]: 2025-10-08 16:55:47.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:49 np0005476733 podman[276009]: 2025-10-08 16:55:49.225053586 +0000 UTC m=+0.052774226 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:55:49 np0005476733 podman[276008]: 2025-10-08 16:55:49.225065607 +0000 UTC m=+0.057947752 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Oct  8 12:55:49 np0005476733 podman[276010]: 2025-10-08 16:55:49.230372126 +0000 UTC m=+0.056766933 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:55:50 np0005476733 nova_compute[192580]: 2025-10-08 16:55:50.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:52 np0005476733 nova_compute[192580]: 2025-10-08 16:55:52.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:54 np0005476733 podman[276074]: 2025-10-08 16:55:54.23030417 +0000 UTC m=+0.051786335 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 12:55:54 np0005476733 podman[276073]: 2025-10-08 16:55:54.254912207 +0000 UTC m=+0.078239901 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 12:55:55 np0005476733 nova_compute[192580]: 2025-10-08 16:55:55.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:57 np0005476733 nova_compute[192580]: 2025-10-08 16:55:57.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:55:59 np0005476733 nova_compute[192580]: 2025-10-08 16:55:59.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:00 np0005476733 nova_compute[192580]: 2025-10-08 16:56:00.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:00 np0005476733 nova_compute[192580]: 2025-10-08 16:56:00.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:01 np0005476733 nova_compute[192580]: 2025-10-08 16:56:01.596 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:01 np0005476733 nova_compute[192580]: 2025-10-08 16:56:01.596 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:56:02 np0005476733 nova_compute[192580]: 2025-10-08 16:56:02.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:03 np0005476733 ovn_controller[263831]: 2025-10-08T16:56:03Z|00212|pinctrl|WARN|Dropped 127 log messages in last 71 seconds (most recently, 16 seconds ago) due to excessive rate
Oct  8 12:56:03 np0005476733 ovn_controller[263831]: 2025-10-08T16:56:03Z|00213|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:56:03 np0005476733 ovn_controller[263831]: 2025-10-08T16:56:03Z|00214|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Oct  8 12:56:05 np0005476733 nova_compute[192580]: 2025-10-08 16:56:05.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:06 np0005476733 podman[276118]: 2025-10-08 16:56:06.227824532 +0000 UTC m=+0.054059508 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:56:07 np0005476733 podman[276137]: 2025-10-08 16:56:07.239908687 +0000 UTC m=+0.074238642 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:56:07 np0005476733 nova_compute[192580]: 2025-10-08 16:56:07.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:08 np0005476733 nova_compute[192580]: 2025-10-08 16:56:08.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:10 np0005476733 nova_compute[192580]: 2025-10-08 16:56:10.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:10 np0005476733 podman[276163]: 2025-10-08 16:56:10.221145735 +0000 UTC m=+0.053467158 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct  8 12:56:11 np0005476733 nova_compute[192580]: 2025-10-08 16:56:11.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:11 np0005476733 nova_compute[192580]: 2025-10-08 16:56:11.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:56:11 np0005476733 nova_compute[192580]: 2025-10-08 16:56:11.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:56:11 np0005476733 nova_compute[192580]: 2025-10-08 16:56:11.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:56:12 np0005476733 nova_compute[192580]: 2025-10-08 16:56:12.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:12 np0005476733 nova_compute[192580]: 2025-10-08 16:56:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:15 np0005476733 nova_compute[192580]: 2025-10-08 16:56:15.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:17 np0005476733 nova_compute[192580]: 2025-10-08 16:56:17.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:20 np0005476733 podman[276186]: 2025-10-08 16:56:20.228053708 +0000 UTC m=+0.055801284 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Oct  8 12:56:20 np0005476733 podman[276185]: 2025-10-08 16:56:20.240630879 +0000 UTC m=+0.070996338 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:56:20 np0005476733 podman[276184]: 2025-10-08 16:56:20.245990661 +0000 UTC m=+0.080739511 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.614 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.615 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.826 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.827 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13652MB free_disk=111.30184936523438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.827 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.828 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.906 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.906 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.930 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.947 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.948 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:56:20 np0005476733 nova_compute[192580]: 2025-10-08 16:56:20.949 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:56:21 np0005476733 nova_compute[192580]: 2025-10-08 16:56:21.949 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:22 np0005476733 nova_compute[192580]: 2025-10-08 16:56:22.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:22 np0005476733 nova_compute[192580]: 2025-10-08 16:56:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:22 np0005476733 nova_compute[192580]: 2025-10-08 16:56:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 12:56:25 np0005476733 nova_compute[192580]: 2025-10-08 16:56:25.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:25 np0005476733 podman[276248]: 2025-10-08 16:56:25.236561516 +0000 UTC m=+0.052723185 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:56:25 np0005476733 podman[276247]: 2025-10-08 16:56:25.23729711 +0000 UTC m=+0.069535842 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:26.430 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:26.431 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:56:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:26.431 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:56:27 np0005476733 nova_compute[192580]: 2025-10-08 16:56:27.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:28 np0005476733 nova_compute[192580]: 2025-10-08 16:56:28.720 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:56:30 np0005476733 nova_compute[192580]: 2025-10-08 16:56:30.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:32 np0005476733 nova_compute[192580]: 2025-10-08 16:56:32.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:35 np0005476733 nova_compute[192580]: 2025-10-08 16:56:35.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:35.607 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:56:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:35.608 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:56:35 np0005476733 nova_compute[192580]: 2025-10-08 16:56:35.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:56:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 12:56:37 np0005476733 podman[276292]: 2025-10-08 16:56:37.217734555 +0000 UTC m=+0.048316984 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 12:56:37 np0005476733 nova_compute[192580]: 2025-10-08 16:56:37.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:38 np0005476733 podman[276312]: 2025-10-08 16:56:38.247824924 +0000 UTC m=+0.076270197 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 12:56:40 np0005476733 nova_compute[192580]: 2025-10-08 16:56:40.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:41 np0005476733 podman[276340]: 2025-10-08 16:56:41.219920552 +0000 UTC m=+0.054783222 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  8 12:56:42 np0005476733 nova_compute[192580]: 2025-10-08 16:56:42.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:45 np0005476733 nova_compute[192580]: 2025-10-08 16:56:45.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:56:45.611 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:56:47 np0005476733 nova_compute[192580]: 2025-10-08 16:56:47.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:50 np0005476733 nova_compute[192580]: 2025-10-08 16:56:50.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:51 np0005476733 podman[276361]: 2025-10-08 16:56:51.277508924 +0000 UTC m=+0.090444891 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:56:51 np0005476733 podman[276360]: 2025-10-08 16:56:51.277567556 +0000 UTC m=+0.087596000 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd)
Oct  8 12:56:51 np0005476733 podman[276362]: 2025-10-08 16:56:51.283145994 +0000 UTC m=+0.086253127 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 12:56:52 np0005476733 nova_compute[192580]: 2025-10-08 16:56:52.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:55 np0005476733 nova_compute[192580]: 2025-10-08 16:56:55.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:56:56 np0005476733 podman[276422]: 2025-10-08 16:56:56.253945128 +0000 UTC m=+0.081589737 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 12:56:56 np0005476733 podman[276423]: 2025-10-08 16:56:56.259287808 +0000 UTC m=+0.085468001 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:56:57 np0005476733 nova_compute[192580]: 2025-10-08 16:56:57.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:00 np0005476733 nova_compute[192580]: 2025-10-08 16:57:00.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:00 np0005476733 systemd-logind[827]: New session 169 of user zuul.
Oct  8 12:57:00 np0005476733 nova_compute[192580]: 2025-10-08 16:57:00.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:00 np0005476733 systemd[1]: Started Session 169 of User zuul.
Oct  8 12:57:00 np0005476733 nova_compute[192580]: 2025-10-08 16:57:00.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:00 np0005476733 systemd[1]: session-169.scope: Deactivated successfully.
Oct  8 12:57:00 np0005476733 systemd-logind[827]: Session 169 logged out. Waiting for processes to exit.
Oct  8 12:57:00 np0005476733 systemd-logind[827]: Removed session 169.
Oct  8 12:57:02 np0005476733 nova_compute[192580]: 2025-10-08 16:57:02.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:02 np0005476733 nova_compute[192580]: 2025-10-08 16:57:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:02 np0005476733 nova_compute[192580]: 2025-10-08 16:57:02.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:57:04 np0005476733 nova_compute[192580]: 2025-10-08 16:57:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:05 np0005476733 nova_compute[192580]: 2025-10-08 16:57:05.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:07 np0005476733 nova_compute[192580]: 2025-10-08 16:57:07.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:08 np0005476733 podman[276495]: 2025-10-08 16:57:08.220108787 +0000 UTC m=+0.051525777 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 12:57:08 np0005476733 nova_compute[192580]: 2025-10-08 16:57:08.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:09Z|00215|pinctrl|WARN|Dropped 285 log messages in last 66 seconds (most recently, 20 seconds ago) due to excessive rate
Oct  8 12:57:09 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:09Z|00216|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:57:09 np0005476733 podman[276514]: 2025-10-08 16:57:09.250104404 +0000 UTC m=+0.080951557 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 12:57:10 np0005476733 nova_compute[192580]: 2025-10-08 16:57:10.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:12 np0005476733 podman[276540]: 2025-10-08 16:57:12.219528865 +0000 UTC m=+0.050038000 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:57:12 np0005476733 nova_compute[192580]: 2025-10-08 16:57:12.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:12 np0005476733 nova_compute[192580]: 2025-10-08 16:57:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:13 np0005476733 nova_compute[192580]: 2025-10-08 16:57:13.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:13 np0005476733 nova_compute[192580]: 2025-10-08 16:57:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:57:13 np0005476733 nova_compute[192580]: 2025-10-08 16:57:13.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:57:13 np0005476733 nova_compute[192580]: 2025-10-08 16:57:13.608 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 12:57:15 np0005476733 nova_compute[192580]: 2025-10-08 16:57:15.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:17 np0005476733 nova_compute[192580]: 2025-10-08 16:57:17.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:20 np0005476733 nova_compute[192580]: 2025-10-08 16:57:20.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.625 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.743 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.744 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13663MB free_disk=111.30181884765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.744 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.744 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.953 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.954 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.975 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 12:57:21 np0005476733 podman[276561]: 2025-10-08 16:57:21.984582803 +0000 UTC m=+0.044176243 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 12:57:21 np0005476733 podman[276560]: 2025-10-08 16:57:21.989858321 +0000 UTC m=+0.054211463 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.992 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 12:57:21 np0005476733 nova_compute[192580]: 2025-10-08 16:57:21.992 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 12:57:22 np0005476733 podman[276562]: 2025-10-08 16:57:22.003798417 +0000 UTC m=+0.057343954 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64)
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.010 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.034 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.055 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.072 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.074 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.075 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:22 np0005476733 nova_compute[192580]: 2025-10-08 16:57:22.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:25 np0005476733 nova_compute[192580]: 2025-10-08 16:57:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:26.432 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:26.432 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:26.433 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:27 np0005476733 podman[276628]: 2025-10-08 16:57:27.216048482 +0000 UTC m=+0.044703939 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:57:27 np0005476733 podman[276627]: 2025-10-08 16:57:27.220190465 +0000 UTC m=+0.051362303 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid)
Oct  8 12:57:27 np0005476733 nova_compute[192580]: 2025-10-08 16:57:27.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:30 np0005476733 nova_compute[192580]: 2025-10-08 16:57:30.075 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:30 np0005476733 nova_compute[192580]: 2025-10-08 16:57:30.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:31 np0005476733 nova_compute[192580]: 2025-10-08 16:57:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:31 np0005476733 nova_compute[192580]: 2025-10-08 16:57:31.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 12:57:31 np0005476733 nova_compute[192580]: 2025-10-08 16:57:31.631 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 12:57:32 np0005476733 nova_compute[192580]: 2025-10-08 16:57:32.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.536 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.537 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.554 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.638 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.639 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.647 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.647 2 INFO nova.compute.claims [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.755 2 DEBUG nova.compute.provider_tree [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.786 2 DEBUG nova.scheduler.client.report [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.828 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.829 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.908 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.908 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.928 2 INFO nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 12:57:33 np0005476733 nova_compute[192580]: 2025-10-08 16:57:33.948 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.074 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.076 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.076 2 INFO nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Creating image(s)#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.077 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.078 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.079 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.096 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.153 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.155 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.155 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.168 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.223 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.224 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.258 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk 10737418240" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.259 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.260 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.313 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.314 2 DEBUG nova.objects.instance [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.332 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.333 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Ensure instance console log exists: /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.333 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.334 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.334 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:34 np0005476733 nova_compute[192580]: 2025-10-08 16:57:34.783 2 DEBUG nova.policy [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 12:57:35 np0005476733 nova_compute[192580]: 2025-10-08 16:57:35.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:35.811 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=92, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=91) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:57:35 np0005476733 nova_compute[192580]: 2025-10-08 16:57:35.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:35.812 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:57:35 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:35.813 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '92'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:35 np0005476733 nova_compute[192580]: 2025-10-08 16:57:35.923 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Successfully created port: aacfecb1-9e82-4905-8b61-d1e0e486b87e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 12:57:37 np0005476733 nova_compute[192580]: 2025-10-08 16:57:37.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.183 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Successfully updated port: aacfecb1-9e82-4905-8b61-d1e0e486b87e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.205 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.205 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.206 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.342 2 DEBUG nova.compute.manager [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.343 2 DEBUG nova.compute.manager [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing instance network info cache due to event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.343 2 DEBUG oslo_concurrency.lockutils [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:57:38 np0005476733 nova_compute[192580]: 2025-10-08 16:57:38.754 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 12:57:39 np0005476733 podman[276678]: 2025-10-08 16:57:39.220468134 +0000 UTC m=+0.054174182 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 12:57:39 np0005476733 nova_compute[192580]: 2025-10-08 16:57:39.982 2 DEBUG nova.network.neutron [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.183 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.183 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Instance network_info: |[{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.183 2 DEBUG oslo_concurrency.lockutils [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.184 2 DEBUG nova.network.neutron [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.186 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Start _get_guest_xml network_info=[{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.191 2 WARNING nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.197 2 DEBUG nova.virt.libvirt.host [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.197 2 DEBUG nova.virt.libvirt.host [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.201 2 DEBUG nova.virt.libvirt.host [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.202 2 DEBUG nova.virt.libvirt.host [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.202 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.202 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.203 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.203 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.203 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.204 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.204 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.204 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.204 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.205 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.205 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.205 2 DEBUG nova.virt.hardware [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.211 2 DEBUG nova.virt.libvirt.vif [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_external_network-960705218',display_name='tempest-test_dscp_marking_external_network-960705218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-external-network-960705218',id=111,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-6rqnqjre',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:57:33Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=0d0100d4-d5e3-4c53-b37f-275a02847eb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.212 2 DEBUG nova.network.os_vif_util [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.213 2 DEBUG nova.network.os_vif_util [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.214 2 DEBUG nova.objects.instance [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.302 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] End _get_guest_xml xml=<domain type="kvm">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <uuid>0d0100d4-d5e3-4c53-b37f-275a02847eb0</uuid>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <name>instance-0000006f</name>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dscp_marking_external_network-960705218</nova:name>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 16:57:40</nova:creationTime>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:user uuid="c7e76ef370424761ad76d4119e9bf895">tempest-QosTestCommon-951807005-project-member</nova:user>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:project uuid="1bfc3ffe3bf745dfbb59f63a07f1e1a9">tempest-QosTestCommon-951807005</nova:project>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        <nova:port uuid="aacfecb1-9e82-4905-8b61-d1e0e486b87e">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.193" ipVersion="4"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <system>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="serial">0d0100d4-d5e3-4c53-b37f-275a02847eb0</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="uuid">0d0100d4-d5e3-4c53-b37f-275a02847eb0</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </system>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <os>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </os>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <features>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </features>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </clock>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  <devices>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.config"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </disk>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:6c:dd:b7"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <target dev="tapaacfecb1-9e"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </interface>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/console.log" append="off"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </serial>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <video>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </video>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </rng>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 12:57:40 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 12:57:40 np0005476733 nova_compute[192580]:  </devices>
Oct  8 12:57:40 np0005476733 nova_compute[192580]: </domain>
Oct  8 12:57:40 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.303 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Preparing to wait for external event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.304 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.304 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.305 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.306 2 DEBUG nova.virt.libvirt.vif [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T16:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_external_network-960705218',display_name='tempest-test_dscp_marking_external_network-960705218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-external-network-960705218',id=111,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-6rqnqjre',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T16:57:33Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=0d0100d4-d5e3-4c53-b37f-275a02847eb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.306 2 DEBUG nova.network.os_vif_util [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.307 2 DEBUG nova.network.os_vif_util [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.307 2 DEBUG os_vif [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.309 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.309 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.313 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaacfecb1-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.314 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaacfecb1-9e, col_values=(('external_ids', {'iface-id': 'aacfecb1-9e82-4905-8b61-d1e0e486b87e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:dd:b7', 'vm-uuid': '0d0100d4-d5e3-4c53-b37f-275a02847eb0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:40 np0005476733 podman[276699]: 2025-10-08 16:57:40.320398575 +0000 UTC m=+0.146134179 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.326 2 INFO os_vif [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e')#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.479 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.480 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.480 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No VIF found with MAC fa:16:3e:6c:dd:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.480 2 INFO nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Using config drive#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.909 2 INFO nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Creating config drive at /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.config#033[00m
Oct  8 12:57:40 np0005476733 nova_compute[192580]: 2025-10-08 16:57:40.918 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52u2swdr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.050 2 DEBUG oslo_concurrency.processutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52u2swdr" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:57:41 np0005476733 kernel: tapaacfecb1-9e: entered promiscuous mode
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 NetworkManager[51699]: <info>  [1759942661.1196] manager: (tapaacfecb1-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Oct  8 12:57:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:41Z|00217|binding|INFO|Claiming lport aacfecb1-9e82-4905-8b61-d1e0e486b87e for this chassis.
Oct  8 12:57:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:41Z|00218|binding|INFO|aacfecb1-9e82-4905-8b61-d1e0e486b87e: Claiming fa:16:3e:6c:dd:b7 192.168.122.193
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 systemd-udevd[276743]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:57:41 np0005476733 systemd-machined[152624]: New machine qemu-68-instance-0000006f.
Oct  8 12:57:41 np0005476733 NetworkManager[51699]: <info>  [1759942661.1647] device (tapaacfecb1-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:57:41 np0005476733 NetworkManager[51699]: <info>  [1759942661.1655] device (tapaacfecb1-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.173 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:dd:b7 192.168.122.193'], port_security=['fa:16:3e:6c:dd:b7 192.168.122.193'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.193/24', 'neutron:device_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '25db264a-2b7b-491b-8b39-dd7c8ac35c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=aacfecb1-9e82-4905-8b61-d1e0e486b87e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.175 103739 INFO neutron.agent.ovn.metadata.agent [-] Port aacfecb1-9e82-4905-8b61-d1e0e486b87e in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.176 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.188 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[31d27b48-7226-43f2-9485-cfbbe5ea886b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.188 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.190 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.190 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2029930b-36dd-4785-965a-a7c394acc4e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.191 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[344da888-03cf-4df3-9c6e-358b7535c578]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.203 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4ed3b3-ce30-4927-bcfc-3007d8d88717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.218 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[38115b8d-5e6c-462f-b143-1f4eed75329a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 systemd[1]: Started Virtual Machine qemu-68-instance-0000006f.
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:41Z|00219|binding|INFO|Setting lport aacfecb1-9e82-4905-8b61-d1e0e486b87e ovn-installed in OVS
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.245 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d457cbda-218f-4c4a-9236-611279062b3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.249 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e9b6fa-2798-473b-a4a8-1c398f2d225d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 NetworkManager[51699]: <info>  [1759942661.2510] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.280 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[8f6fd2ce-198e-4cdd-a312-ec3faa3ce5e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.283 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[0673408b-6736-4a2c-bd62-3c9f8f7ff03a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:41Z|00220|binding|INFO|Setting lport aacfecb1-9e82-4905-8b61-d1e0e486b87e up in Southbound
Oct  8 12:57:41 np0005476733 NetworkManager[51699]: <info>  [1759942661.3109] device (tap81c575b5-a0): carrier: link connected
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.317 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[3001924b-480d-4142-95c3-53fe130514cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.336 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3d9e29-4515-482a-b8e9-d0d7e4fbde4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 956497, 'reachable_time': 28673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276777, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.352 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[455a2025-5ea0-4b07-8491-355bb3f84a37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 956497, 'tstamp': 956497}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276778, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.369 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ada52dcf-ce4d-47f3-9e88-8b01d3a4ca35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 956497, 'reachable_time': 28673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276779, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.398 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[64595cf2-ea68-4839-9901-6747b65010af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.454 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a36fccdd-589d-40f7-9e48-201b41b5c05d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.455 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.455 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.456 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.460 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:57:41 np0005476733 ovn_controller[263831]: 2025-10-08T16:57:41Z|00221|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.474 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.476 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[27e0ff00-22b1-4f3c-9899-754dd5c37ea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.476 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 12:57:41 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:57:41.477 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.622 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.732 2 DEBUG nova.network.neutron [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated VIF entry in instance network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.733 2 DEBUG nova.network.neutron [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.792 2 DEBUG oslo_concurrency.lockutils [req-9f436e75-1a59-49e3-a217-13a2b1c24c16 req-6c93b7b4-f4b8-4e96-9305-6dae0c434b8f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:57:41 np0005476733 podman[276818]: 2025-10-08 16:57:41.821741756 +0000 UTC m=+0.051473525 container create d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 12:57:41 np0005476733 systemd[1]: Started libpod-conmon-d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485.scope.
Oct  8 12:57:41 np0005476733 podman[276818]: 2025-10-08 16:57:41.795657873 +0000 UTC m=+0.025389652 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 12:57:41 np0005476733 systemd[1]: Started libcrun container.
Oct  8 12:57:41 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/044d4aa5147809967fbbb70717e1c7a5d7071ea841533a08963a0554124cb19b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 12:57:41 np0005476733 podman[276818]: 2025-10-08 16:57:41.910557153 +0000 UTC m=+0.140288942 container init d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:57:41 np0005476733 podman[276818]: 2025-10-08 16:57:41.917868017 +0000 UTC m=+0.147599786 container start d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.937 2 DEBUG nova.compute.manager [req-4f53488a-44be-4602-8057-629434407b75 req-64e1745d-77cb-48b8-b529-218df25ba64c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.937 2 DEBUG oslo_concurrency.lockutils [req-4f53488a-44be-4602-8057-629434407b75 req-64e1745d-77cb-48b8-b529-218df25ba64c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.938 2 DEBUG oslo_concurrency.lockutils [req-4f53488a-44be-4602-8057-629434407b75 req-64e1745d-77cb-48b8-b529-218df25ba64c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.938 2 DEBUG oslo_concurrency.lockutils [req-4f53488a-44be-4602-8057-629434407b75 req-64e1745d-77cb-48b8-b529-218df25ba64c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:41 np0005476733 nova_compute[192580]: 2025-10-08 16:57:41.938 2 DEBUG nova.compute.manager [req-4f53488a-44be-4602-8057-629434407b75 req-64e1745d-77cb-48b8-b529-218df25ba64c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Processing event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 12:57:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [NOTICE]   (276838) : New worker (276840) forked
Oct  8 12:57:41 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [NOTICE]   (276838) : Loading success.
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.043 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942662.042704, 0d0100d4-d5e3-4c53-b37f-275a02847eb0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.044 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] VM Started (Lifecycle Event)#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.047 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.051 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.054 2 INFO nova.virt.libvirt.driver [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Instance spawned successfully.#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.055 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.163 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.174 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.175 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.176 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.177 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.177 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.178 2 DEBUG nova.virt.libvirt.driver [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.187 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.252 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.253 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942662.0428195, 0d0100d4-d5e3-4c53-b37f-275a02847eb0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.253 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] VM Paused (Lifecycle Event)#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.332 2 INFO nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Took 8.26 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.333 2 DEBUG nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.372 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.375 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942662.0496514, 0d0100d4-d5e3-4c53-b37f-275a02847eb0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.375 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] VM Resumed (Lifecycle Event)#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.489 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.491 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.622 2 INFO nova.compute.manager [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Took 9.02 seconds to build instance.#033[00m
Oct  8 12:57:42 np0005476733 nova_compute[192580]: 2025-10-08 16:57:42.748 2 DEBUG oslo_concurrency.lockutils [None req-6fca22b2-e172-4a55-b193-e9958119a2cc c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:43 np0005476733 podman[276849]: 2025-10-08 16:57:43.23705633 +0000 UTC m=+0.066065421 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.078 2 DEBUG nova.compute.manager [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.079 2 DEBUG oslo_concurrency.lockutils [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.079 2 DEBUG oslo_concurrency.lockutils [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.079 2 DEBUG oslo_concurrency.lockutils [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.079 2 DEBUG nova.compute.manager [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] No waiting events found dispatching network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 12:57:44 np0005476733 nova_compute[192580]: 2025-10-08 16:57:44.079 2 WARNING nova.compute.manager [req-78802304-088f-4af2-a3d0-41b2abd48f53 req-3128890a-d0a3-479a-a6a4-b72990b30489 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received unexpected event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e for instance with vm_state active and task_state None.#033[00m
Oct  8 12:57:45 np0005476733 nova_compute[192580]: 2025-10-08 16:57:45.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:45 np0005476733 nova_compute[192580]: 2025-10-08 16:57:45.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:45 np0005476733 nova_compute[192580]: 2025-10-08 16:57:45.435 2 INFO nova.compute.manager [None req-51607ece-2455-4bd8-b805-e710f52c365a c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:57:45 np0005476733 nova_compute[192580]: 2025-10-08 16:57:45.441 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:57:50 np0005476733 nova_compute[192580]: 2025-10-08 16:57:50.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:50 np0005476733 nova_compute[192580]: 2025-10-08 16:57:50.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:50 np0005476733 nova_compute[192580]: 2025-10-08 16:57:50.576 2 INFO nova.compute.manager [None req-1d61e9ee-38b2-4bf0-bde4-2a73ef891d55 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:57:52 np0005476733 podman[276872]: 2025-10-08 16:57:52.229339087 +0000 UTC m=+0.056840526 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 12:57:52 np0005476733 podman[276874]: 2025-10-08 16:57:52.240754062 +0000 UTC m=+0.061311720 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41)
Oct  8 12:57:52 np0005476733 podman[276873]: 2025-10-08 16:57:52.244255064 +0000 UTC m=+0.062496338 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:57:55 np0005476733 nova_compute[192580]: 2025-10-08 16:57:55.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:55 np0005476733 nova_compute[192580]: 2025-10-08 16:57:55.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:57:55 np0005476733 nova_compute[192580]: 2025-10-08 16:57:55.798 2 INFO nova.compute.manager [None req-cdb03607-4a58-474b-8b56-144185a24dcf c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:57:55 np0005476733 nova_compute[192580]: 2025-10-08 16:57:55.802 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:57:58 np0005476733 podman[276941]: 2025-10-08 16:57:58.245920492 +0000 UTC m=+0.070158251 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 12:57:58 np0005476733 podman[276942]: 2025-10-08 16:57:58.246037786 +0000 UTC m=+0.063399515 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:58:00 np0005476733 nova_compute[192580]: 2025-10-08 16:58:00.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:00 np0005476733 nova_compute[192580]: 2025-10-08 16:58:00.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:01 np0005476733 nova_compute[192580]: 2025-10-08 16:58:01.027 2 INFO nova.compute.manager [None req-d12aeb95-f347-45c6-b73c-f211445bb0ac c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:58:01 np0005476733 nova_compute[192580]: 2025-10-08 16:58:01.031 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:58:01 np0005476733 nova_compute[192580]: 2025-10-08 16:58:01.618 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:02 np0005476733 nova_compute[192580]: 2025-10-08 16:58:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:02 np0005476733 nova_compute[192580]: 2025-10-08 16:58:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:02 np0005476733 nova_compute[192580]: 2025-10-08 16:58:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:05 np0005476733 nova_compute[192580]: 2025-10-08 16:58:05.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:05 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:05Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:dd:b7 192.168.122.193
Oct  8 12:58:05 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:05Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:dd:b7 192.168.122.193
Oct  8 12:58:06 np0005476733 nova_compute[192580]: 2025-10-08 16:58:06.241 2 INFO nova.compute.manager [None req-1c050447-f6a4-4357-afd2-dd3e2ced77af c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:58:06 np0005476733 nova_compute[192580]: 2025-10-08 16:58:06.244 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:58:08 np0005476733 nova_compute[192580]: 2025-10-08 16:58:08.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:10 np0005476733 podman[276983]: 2025-10-08 16:58:10.248544197 +0000 UTC m=+0.068948583 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:10 np0005476733 nova_compute[192580]: 2025-10-08 16:58:10.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:11Z|00222|pinctrl|WARN|Dropped 327 log messages in last 62 seconds (most recently, 29 seconds ago) due to excessive rate
Oct  8 12:58:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:11Z|00223|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:58:11 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:11Z|00224|memory_trim|INFO|Detected inactivity (last active 30029 ms ago): trimming memory
Oct  8 12:58:11 np0005476733 podman[277002]: 2025-10-08 16:58:11.305224896 +0000 UTC m=+0.120414236 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 12:58:11 np0005476733 nova_compute[192580]: 2025-10-08 16:58:11.562 2 INFO nova.compute.manager [None req-4d9f1a22-e879-4620-9eb1-12454c367d43 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:58:11 np0005476733 nova_compute[192580]: 2025-10-08 16:58:11.567 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:58:11 np0005476733 nova_compute[192580]: 2025-10-08 16:58:11.570 2 INFO nova.virt.libvirt.driver [None req-4d9f1a22-e879-4620-9eb1-12454c367d43 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Truncated console log returned, 3047 bytes ignored#033[00m
Oct  8 12:58:14 np0005476733 podman[277029]: 2025-10-08 16:58:14.275116633 +0000 UTC m=+0.089611803 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 12:58:14 np0005476733 nova_compute[192580]: 2025-10-08 16:58:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:58:15 np0005476733 nova_compute[192580]: 2025-10-08 16:58:15.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.705 2 INFO nova.compute.manager [None req-06e9215f-afe8-4b1e-8ba5-2f9706b58158 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Get console output#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.713 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.715 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.715 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.715 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.716 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:58:16 np0005476733 nova_compute[192580]: 2025-10-08 16:58:16.719 2 INFO nova.virt.libvirt.driver [None req-06e9215f-afe8-4b1e-8ba5-2f9706b58158 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Truncated console log returned, 3335 bytes ignored#033[00m
Oct  8 12:58:19 np0005476733 nova_compute[192580]: 2025-10-08 16:58:19.475 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 12:58:19 np0005476733 nova_compute[192580]: 2025-10-08 16:58:19.767 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 12:58:19 np0005476733 nova_compute[192580]: 2025-10-08 16:58:19.768 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 12:58:20 np0005476733 nova_compute[192580]: 2025-10-08 16:58:20.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.652 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.652 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.653 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 12:58:21 np0005476733 nova_compute[192580]: 2025-10-08 16:58:21.963 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.023 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.024 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.078 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.254 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.256 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12939MB free_disk=111.22111129760742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.257 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.257 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.417 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0d0100d4-d5e3-4c53-b37f-275a02847eb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.417 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.417 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.469 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.488 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.515 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 12:58:22 np0005476733 nova_compute[192580]: 2025-10-08 16:58:22.515 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:58:23 np0005476733 podman[277076]: 2025-10-08 16:58:23.22295201 +0000 UTC m=+0.051073212 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 12:58:23 np0005476733 podman[277075]: 2025-10-08 16:58:23.227149724 +0000 UTC m=+0.058126068 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 12:58:23 np0005476733 podman[277077]: 2025-10-08 16:58:23.228795136 +0000 UTC m=+0.054506772 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal 
rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git)
Oct  8 12:58:25 np0005476733 nova_compute[192580]: 2025-10-08 16:58:25.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:58:26.433 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 12:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:58:26.434 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 12:58:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:58:26.434 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 12:58:29 np0005476733 podman[277141]: 2025-10-08 16:58:29.236943662 +0000 UTC m=+0.060577466 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:58:29 np0005476733 podman[277142]: 2025-10-08 16:58:29.266105434 +0000 UTC m=+0.073388925 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:58:30 np0005476733 nova_compute[192580]: 2025-10-08 16:58:30.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:31 np0005476733 nova_compute[192580]: 2025-10-08 16:58:31.515 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:58:35 np0005476733 nova_compute[192580]: 2025-10-08 16:58:35.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.079 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'name': 'tempest-test_dscp_marking_external_network-960705218', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'hostId': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.096 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/cpu volume: 39230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69e4a0d6-e592-4711-bb4b-769e6ed516e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39230000000, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'timestamp': '2025-10-08T16:58:36.080839', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '07af480c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.819141402, 'message_signature': 'e122c4a936f4e38a2a5e5bc05557f4a0ad7316635052fd16a1c81a79d738b58b'}]}, 'timestamp': '2025-10-08 16:58:36.097005', '_unique_id': '30e974dd3df845208e9650e7272fb842'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.101 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.101 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>]
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.120 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.requests volume: 11525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.121 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b8c1d5a-1eb4-41d9-bd89-45db485d1f9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11525, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.102378', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07b2f704-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': 'bbb366f48338f10d4093767ab09edb24c2ffd46d98701d6a25bbe1b4e15e4863'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.102378', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07b310a4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '664bf9b7aacb43899586e733a70762bb4fb41b3d7437a4edcb53f6d29323e54e'}]}, 'timestamp': '2025-10-08 16:58:36.121744', '_unique_id': '3da7284254c1489bbc77f24fe87901c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.125 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.latency volume: 8922530828 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.125 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8d4c4de-e2f0-4893-8d1b-031f23250221', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8922530828, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.125365', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07b3b50e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '6e259b140cfe2926d8f0377c6b4bc0198fafdf8f1a29f89b818a38eb0c001f6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.125365', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07b3cdb4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '646f515502e3f954270b5a34de87f4e1295e31162e82d8596beb9450e6232477'}]}, 'timestamp': '2025-10-08 16:58:36.126624', '_unique_id': '6ddbaf5b79244247b40227843e954892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.127 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.133 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0d0100d4-d5e3-4c53-b37f-275a02847eb0 / tapaacfecb1-9e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.133 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bd7a774-178b-4401-b20b-7f79fa004a8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.130432', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b4ecc6-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': '17ab6690c4121659c0d4df318e62371a3b6f0fc51fd90cd84be2b5ba23b862c2'}]}, 'timestamp': '2025-10-08 16:58:36.133887', '_unique_id': '0ca84845196049c8a3bd721a6bb58a45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.134 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.136 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e18e17c3-0c8c-490a-b524-3e14114aa181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.136881', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b57ac4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'aee2f3be53d10a42d3234001dc68a962952c31823b76fb13de33d9aee0be1730'}]}, 'timestamp': '2025-10-08 16:58:36.137643', '_unique_id': 'e9511bcdffcf4c9b90b921ec7275990b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.139 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.141 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd45e7390-4152-4546-8a9a-7113deacd371', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.141170', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b61d58-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'abf4bdac368ca76d220944967e0acd5d7918f84c2ff412d301d02b1fb547d0e5'}]}, 'timestamp': '2025-10-08 16:58:36.141692', '_unique_id': '6cfe2b4a12a04a2297381e95943a275e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.144 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fc29360-398e-47df-a220-a2f2d005af09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.144277', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b693dc-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'f67f01bd3b5dbcf226999e928d5fc7a74e753e6f2591058decf9dcf16bf46b7d'}]}, 'timestamp': '2025-10-08 16:58:36.144702', '_unique_id': 'b671a342bedb46bead2bbedd30420c6c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.146 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.bytes volume: 2390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '440c77dd-2cc1-4148-b7ee-9ff0f39a49b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2390, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.146948', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b6fc50-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'b531a24aa6721ab56631a0b5e27938e335be9523166265a208754e5d30b3f08b'}]}, 'timestamp': '2025-10-08 16:58:36.147407', '_unique_id': '520605bce22c48f89a61dbf2a616707c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.149 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '419c5766-82a0-4f12-b04b-0b7621021427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 34, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.149791', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07b76c4e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': '2e6e82cf9ee82f8f7c4b0644d92dd673b5b338a88e6058d367dd803049ad31d3'}]}, 'timestamp': '2025-10-08 16:58:36.150258', '_unique_id': 'af86bc7b07384daab87e8a64741774c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.152 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.latency volume: 6463819950 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.153 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.latency volume: 47315887 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a840ded-86cc-4751-9090-fb62af8d9ade', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6463819950, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.152568', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07b7da30-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '4ec257a31a84bb8734ce95b47c27d578dd7970df92043f599347158289bd3f72'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47315887, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.152568', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07b7ed0e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': 'fceea1f6f7f173330734313798348be1fdeff0e7c0cb043f531a3a16a83d38aa'}]}, 'timestamp': '2025-10-08 16:58:36.153575', '_unique_id': 'af3b4d0493d14272a895ce1823caf5e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.168 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.usage volume: 152567808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.168 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2766eb8b-71cf-46d9-b550-de9eb86cd815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152567808, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.156333', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07ba43ba-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': '97b684946510e6856fdd6e3983d5206b4c7dba606fdae4279631ba5201009205'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.156333', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07ba5c56-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': 'f7a5780fba96788a2856306bc0ec5c5c3f3f39ddc03aecba5f6f8cb0738c692f'}]}, 'timestamp': '2025-10-08 16:58:36.169570', '_unique_id': '455e88154154456c8d2f05a8510ff78e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.173 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50777b0f-fcb3-44b7-98cc-b3138ff1b2e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.173920', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07bb1f1a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'fcda2ddf3bd4abf04fb364785d3f629f8d83fbb82741541261db15b3e3360dfe'}]}, 'timestamp': '2025-10-08 16:58:36.174580', '_unique_id': '3f47055337f946269ec6bdaf2d26b20e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.175 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.177 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '930369ba-8cf9-4fb5-a5ad-90ad16cf732c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.177684', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07bbae44-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': '0e0a51d803ecd6d7ac5293cc2d9852eec1cce039b4db98918e5b8197c066f4d9'}]}, 'timestamp': '2025-10-08 16:58:36.178229', '_unique_id': '794cb3e20b014e69b4570dece6d38115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.179 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/memory.usage volume: 226.7109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac7963e-be10-40d2-9084-db935c47819e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 226.7109375, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'timestamp': '2025-10-08T16:58:36.180038', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '07bc0448-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.819141402, 'message_signature': '38b00fd44f99b0f84ef0655ac1152d71071d4833818a35a3131267b3e0e7ad59'}]}, 'timestamp': '2025-10-08 16:58:36.180282', '_unique_id': '1e0ebc3b6d994f84afce3880ea875eef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.180 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>]
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>]
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.182 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c06bae3a-1a36-42f8-ba2f-9b0eadcac042', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.182236', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07bc5aba-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': '02316468d9e8e8ec1077e11fd7a02cbd6be381d73f08186cc07337d1d36a33eb'}]}, 'timestamp': '2025-10-08 16:58:36.182571', '_unique_id': 'be152f2d245540c6b848d46ec44a3dc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.183 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.184 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3098c147-d57e-4d2c-83da-2660dda7f072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.183915', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07bc9b74-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': '9b424cb2b1fe87b4e5195e3bd3759728b913782f1fc3aed1dacf94fb0c576064'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.183915', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07bca63c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': '2ce4c7b37be6a1d8918e2d250ca0ffc165d212da70a7345028f0e19abfab3c77'}]}, 'timestamp': '2025-10-08 16:58:36.184603', '_unique_id': 'c4fb9f194cc54078bd375f009e1df015'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.185 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.bytes volume: 3612 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eb783da-7891-4c6f-bbd3-363a5db94c40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3612, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T16:58:36.185938', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '07bcea7a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.853474419, 'message_signature': 'd1e5239b0f668c5709c0c00c82d93e115b59cfb6b61b1794e18122cb59f3010d'}]}, 'timestamp': '2025-10-08 16:58:36.186200', '_unique_id': 'af8381d6e6214dc582fa5ec588d39b7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.186 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.187 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.requests volume: 703 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.187 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7076d916-5aa7-4adb-a897-c75595003364', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 703, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.187302', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07bd1efa-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '009c38e92cc0a9e411212c75887494a7411689f2dac5e6e148f50c161a424032'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.187302', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07bd2710-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '3e1c29551b8a001572fd41e1601bcbff2c6e6f3a9067e064ce33968f9a0ca4d1'}]}, 'timestamp': '2025-10-08 16:58:36.187836', '_unique_id': 'b7efc50135c545e7931d99f54d6cbc20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.189 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.189 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_external_network-960705218>]
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.189 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.189 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07251324-2504-42e0-9537-c08156988a74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.189322', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07bd6e1e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': 'edb26fa77150950ff01e4f784ffe450348bd35d644e2e387c5457fb7575b9861'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.189322', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07bd77ba-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.879351096, 'message_signature': '18b336fc83a120bbdeaac006fdf71dd60daf3865aead246618fd34cbe24646ea'}]}, 'timestamp': '2025-10-08 16:58:36.189784', '_unique_id': '455dd48d6fd146b5acde3f264e69e1f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.190 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.bytes volume: 135441920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ccb8600-63e9-4609-ba17-5cdef4d9e19a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 135441920, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.190918', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07bdac58-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': 'e93047ac4af09d2291f85104cda09865cca5c4a9ff5a12aaacec30658feb4cf9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.190918', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07bdb522-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '7b73f2dce94eace62cdf75e8477403596d5954b34ed58407e9ac1a48426e3ed4'}]}, 'timestamp': '2025-10-08 16:58:36.191353', '_unique_id': '13b2217a3c8942c1a4f2db41a9dee31b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.192 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.bytes volume: 326608384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.192 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78eb4ad3-145b-4d89-aac1-73f74448de44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 326608384, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T16:58:36.192659', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '07bdf046-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': '4558b3f4dc8f5c68df739588e3bfc787d0288afc55319a453a25ef4259edd4c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T16:58:36.192659', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '07bdf834-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9619.825382141, 'message_signature': 'd74e959885c4c303e728b47ed6a3e360e9a627b19592fa698b1231b310fd2278'}]}, 'timestamp': '2025-10-08 16:58:36.193069', '_unique_id': '9d3bd375b8184e5dae3147e078fdca5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 12:58:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 16:58:36.193 12 ERROR oslo_messaging.notify.messaging 
Oct  8 12:58:40 np0005476733 nova_compute[192580]: 2025-10-08 16:58:40.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:41 np0005476733 podman[277186]: 2025-10-08 16:58:41.257131648 +0000 UTC m=+0.073912492 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 12:58:42 np0005476733 podman[277203]: 2025-10-08 16:58:42.260261357 +0000 UTC m=+0.090044547 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:58:45 np0005476733 podman[277229]: 2025-10-08 16:58:45.23317159 +0000 UTC m=+0.068439637 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:45 np0005476733 nova_compute[192580]: 2025-10-08 16:58:45.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 12:58:50 np0005476733 nova_compute[192580]: 2025-10-08 16:58:50.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:58:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:54Z|00225|pinctrl|WARN|Dropped 107 log messages in last 43 seconds (most recently, 12 seconds ago) due to excessive rate
Oct  8 12:58:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:54Z|00226|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 12:58:54 np0005476733 ovn_controller[263831]: 2025-10-08T16:58:54Z|00227|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 12:58:54 np0005476733 podman[277250]: 2025-10-08 16:58:54.226481449 +0000 UTC m=+0.052890941 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:58:54 np0005476733 podman[277252]: 2025-10-08 16:58:54.23686189 +0000 UTC m=+0.061655370 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 12:58:54 np0005476733 podman[277251]: 2025-10-08 16:58:54.238830013 +0000 UTC m=+0.062130055 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 12:58:55 np0005476733 nova_compute[192580]: 2025-10-08 16:58:55.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:00 np0005476733 podman[277318]: 2025-10-08 16:59:00.220025128 +0000 UTC m=+0.047845640 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 12:59:00 np0005476733 podman[277317]: 2025-10-08 16:59:00.224963515 +0000 UTC m=+0.057650272 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 12:59:00 np0005476733 nova_compute[192580]: 2025-10-08 16:59:00.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:01 np0005476733 nova_compute[192580]: 2025-10-08 16:59:01.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:59:02 np0005476733 nova_compute[192580]: 2025-10-08 16:59:02.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:59:02 np0005476733 nova_compute[192580]: 2025-10-08 16:59:02.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 12:59:04 np0005476733 nova_compute[192580]: 2025-10-08 16:59:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:59:05 np0005476733 nova_compute[192580]: 2025-10-08 16:59:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:05 np0005476733 nova_compute[192580]: 2025-10-08 16:59:05.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:10.021 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=93, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=92) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 12:59:10 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:10.022 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 12:59:10 np0005476733 nova_compute[192580]: 2025-10-08 16:59:10.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:10 np0005476733 nova_compute[192580]: 2025-10-08 16:59:10.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 12:59:10 np0005476733 nova_compute[192580]: 2025-10-08 16:59:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:59:12 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:12.024 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '93'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 12:59:12 np0005476733 podman[277364]: 2025-10-08 16:59:12.216375212 +0000 UTC m=+0.045597087 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 12:59:13 np0005476733 podman[277383]: 2025-10-08 16:59:13.23510419 +0000 UTC m=+0.065550995 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.808 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.808 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.808 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 12:59:15 np0005476733 nova_compute[192580]: 2025-10-08 16:59:15.808 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 12:59:16 np0005476733 podman[277411]: 2025-10-08 16:59:16.235312175 +0000 UTC m=+0.068323394 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:59:17 np0005476733 nova_compute[192580]: 2025-10-08 16:59:17.150 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 12:59:17 np0005476733 nova_compute[192580]: 2025-10-08 16:59:17.194 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 12:59:17 np0005476733 nova_compute[192580]: 2025-10-08 16:59:17.194 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  8 12:59:17 np0005476733 nova_compute[192580]: 2025-10-08 16:59:17.194 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:59:20 np0005476733 nova_compute[192580]: 2025-10-08 16:59:20.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:20 np0005476733 nova_compute[192580]: 2025-10-08 16:59:20.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.631 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.632 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.632 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.700 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.791 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.792 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 12:59:21 np0005476733 nova_compute[192580]: 2025-10-08 16:59:21.868 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.010 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.011 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12851MB free_disk=111.1585693359375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.011 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.012 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.201 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0d0100d4-d5e3-4c53-b37f-275a02847eb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.201 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.201 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.261 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.291 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.292 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 12:59:22 np0005476733 nova_compute[192580]: 2025-10-08 16:59:22.292 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:59:23 np0005476733 nova_compute[192580]: 2025-10-08 16:59:23.292 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:59:25 np0005476733 podman[277439]: 2025-10-08 16:59:25.225327428 +0000 UTC m=+0.049099089 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 12:59:25 np0005476733 podman[277440]: 2025-10-08 16:59:25.246824735 +0000 UTC m=+0.065596036 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  8 12:59:25 np0005476733 podman[277438]: 2025-10-08 16:59:25.257199917 +0000 UTC m=+0.084709077 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd)
Oct  8 12:59:25 np0005476733 nova_compute[192580]: 2025-10-08 16:59:25.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:26.434 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 12:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:26.435 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 12:59:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 16:59:26.435 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 12:59:30 np0005476733 nova_compute[192580]: 2025-10-08 16:59:30.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:30 np0005476733 nova_compute[192580]: 2025-10-08 16:59:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:59:31 np0005476733 podman[277502]: 2025-10-08 16:59:31.220151578 +0000 UTC m=+0.049280324 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 12:59:31 np0005476733 podman[277503]: 2025-10-08 16:59:31.236746368 +0000 UTC m=+0.062127215 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 12:59:35 np0005476733 nova_compute[192580]: 2025-10-08 16:59:35.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:40 np0005476733 nova_compute[192580]: 2025-10-08 16:59:40.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:43 np0005476733 podman[277553]: 2025-10-08 16:59:43.222860025 +0000 UTC m=+0.056395183 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 12:59:44 np0005476733 podman[277573]: 2025-10-08 16:59:44.254543567 +0000 UTC m=+0.084616034 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 12:59:44 np0005476733 nova_compute[192580]: 2025-10-08 16:59:44.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 12:59:45 np0005476733 nova_compute[192580]: 2025-10-08 16:59:45.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:47 np0005476733 podman[277600]: 2025-10-08 16:59:47.225477466 +0000 UTC m=+0.057858319 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 12:59:50 np0005476733 nova_compute[192580]: 2025-10-08 16:59:50.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:50 np0005476733 nova_compute[192580]: 2025-10-08 16:59:50.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 12:59:55 np0005476733 nova_compute[192580]: 2025-10-08 16:59:55.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  8 12:59:56 np0005476733 podman[277624]: 2025-10-08 16:59:56.235988737 +0000 UTC m=+0.059859774 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct  8 12:59:56 np0005476733 podman[277623]: 2025-10-08 16:59:56.247356899 +0000 UTC m=+0.075297176 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 12:59:56 np0005476733 podman[277622]: 2025-10-08 16:59:56.255961604 +0000 UTC m=+0.086000758 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 13:00:00 np0005476733 nova_compute[192580]: 2025-10-08 17:00:00.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:00 np0005476733 nova_compute[192580]: 2025-10-08 17:00:00.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:02 np0005476733 podman[277684]: 2025-10-08 17:00:02.263255973 +0000 UTC m=+0.091520665 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 13:00:02 np0005476733 podman[277685]: 2025-10-08 17:00:02.28884828 +0000 UTC m=+0.106008578 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:00:03 np0005476733 nova_compute[192580]: 2025-10-08 17:00:03.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:03 np0005476733 nova_compute[192580]: 2025-10-08 17:00:03.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:03 np0005476733 nova_compute[192580]: 2025-10-08 17:00:03.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:00:05 np0005476733 nova_compute[192580]: 2025-10-08 17:00:05.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:06 np0005476733 nova_compute[192580]: 2025-10-08 17:00:06.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:09 np0005476733 ovn_controller[263831]: 2025-10-08T17:00:09Z|00228|pinctrl|WARN|Dropped 113 log messages in last 76 seconds (most recently, 20 seconds ago) due to excessive rate
Oct  8 13:00:09 np0005476733 ovn_controller[263831]: 2025-10-08T17:00:09Z|00229|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:00:09 np0005476733 nova_compute[192580]: 2025-10-08 17:00:09.656 2 DEBUG nova.compute.manager [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:00:09 np0005476733 nova_compute[192580]: 2025-10-08 17:00:09.656 2 DEBUG nova.compute.manager [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing instance network info cache due to event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:00:09 np0005476733 nova_compute[192580]: 2025-10-08 17:00:09.656 2 DEBUG oslo_concurrency.lockutils [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:00:09 np0005476733 nova_compute[192580]: 2025-10-08 17:00:09.656 2 DEBUG oslo_concurrency.lockutils [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:00:09 np0005476733 nova_compute[192580]: 2025-10-08 17:00:09.656 2 DEBUG nova.network.neutron [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:00:10 np0005476733 nova_compute[192580]: 2025-10-08 17:00:10.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:11 np0005476733 nova_compute[192580]: 2025-10-08 17:00:11.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:12 np0005476733 nova_compute[192580]: 2025-10-08 17:00:12.836 2 DEBUG nova.network.neutron [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated VIF entry in instance network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:00:12 np0005476733 nova_compute[192580]: 2025-10-08 17:00:12.837 2 DEBUG nova.network.neutron [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:00:12 np0005476733 nova_compute[192580]: 2025-10-08 17:00:12.865 2 DEBUG oslo_concurrency.lockutils [req-eb9683c9-8e0d-4107-82fd-76076deb62a8 req-f3940e4a-6819-41e5-8e15-07846d9cd154 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:00:14 np0005476733 podman[277747]: 2025-10-08 17:00:14.22653719 +0000 UTC m=+0.055906486 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  8 13:00:15 np0005476733 podman[277767]: 2025-10-08 17:00:15.268949655 +0000 UTC m=+0.095707968 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 13:00:15 np0005476733 nova_compute[192580]: 2025-10-08 17:00:15.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.824 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.824 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.824 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:00:16 np0005476733 nova_compute[192580]: 2025-10-08 17:00:16.824 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:00:17 np0005476733 nova_compute[192580]: 2025-10-08 17:00:17.968 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:00:17 np0005476733 nova_compute[192580]: 2025-10-08 17:00:17.982 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:00:17 np0005476733 nova_compute[192580]: 2025-10-08 17:00:17.983 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:00:17 np0005476733 nova_compute[192580]: 2025-10-08 17:00:17.983 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:18 np0005476733 podman[277796]: 2025-10-08 17:00:18.226064343 +0000 UTC m=+0.060021029 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 13:00:20 np0005476733 nova_compute[192580]: 2025-10-08 17:00:20.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.640 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.641 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.641 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.641 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.724 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.787 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.788 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.843 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.995 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.996 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12842MB free_disk=111.15901565551758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.996 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:00:23 np0005476733 nova_compute[192580]: 2025-10-08 17:00:23.996 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.116 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance 0d0100d4-d5e3-4c53-b37f-275a02847eb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.117 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.117 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.159 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.180 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.181 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:00:24 np0005476733 nova_compute[192580]: 2025-10-08 17:00:24.182 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:00:25 np0005476733 nova_compute[192580]: 2025-10-08 17:00:25.183 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:25 np0005476733 nova_compute[192580]: 2025-10-08 17:00:25.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:26.434 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:26.435 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:00:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:26.436 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:00:27 np0005476733 podman[277825]: 2025-10-08 17:00:27.230034703 +0000 UTC m=+0.055356000 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:00:27 np0005476733 podman[277824]: 2025-10-08 17:00:27.233946317 +0000 UTC m=+0.061582327 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 13:00:27 np0005476733 podman[277826]: 2025-10-08 17:00:27.261982583 +0000 UTC m=+0.086843135 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct  8 13:00:30 np0005476733 nova_compute[192580]: 2025-10-08 17:00:30.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:32 np0005476733 nova_compute[192580]: 2025-10-08 17:00:32.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:00:33 np0005476733 podman[277885]: 2025-10-08 17:00:33.216137304 +0000 UTC m=+0.045406710 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:00:33 np0005476733 podman[277884]: 2025-10-08 17:00:33.228002573 +0000 UTC m=+0.058539990 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 13:00:35 np0005476733 nova_compute[192580]: 2025-10-08 17:00:35.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.078 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'name': 'tempest-test_dscp_marking_external_network-960705218', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'hostId': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.089 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.089 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9e8b188-101f-4355-a95d-6e3ee4ebb7cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.079448', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f34bc48-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': '0b32fcfcdc16b44a12eb9473756ed24f2d9bbc8d0e9f27bfd99df4d20967f4f0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 
'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.079448', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f34ca4e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': '98e20be897c5e0507c536b5583388a4827124e69f29ee454778a0ff15e8d0567'}]}, 'timestamp': '2025-10-08 17:00:36.090064', '_unique_id': 'fd59c06bcd494a6dace3ee1d4c8cfe19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.091 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.108 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.bytes volume: 328566272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.108 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '646ac8ea-9577-4bc8-aaf3-b4ca3f73829e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328566272, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.092494', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f379e90-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'cd4bd2cb8c7da1462ef8537f89dfbee5d5b84c9b44ef88a6a9202111c7b09d52'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.092494', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f37a908-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'b553ffae3ef2f95dad206a425f766592c7699fe405dc6f985ca98376d86d6887'}]}, 'timestamp': '2025-10-08 17:00:36.108832', '_unique_id': '8e1fb5855a5348b3af12eb48c5899381'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.113 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.bytes volume: 24096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd8f106f-3ea3-4817-b319-cba28234230e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 24096, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.110700', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f3866ea-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': '965e27e3dba4c3ac955671b8fb3ff5a28d3d13370a2ed9c282c1af8f02ba2cd9'}]}, 'timestamp': '2025-10-08 17:00:36.113717', '_unique_id': '6329036f0cd0477f9bca3bc4c3cb7d89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bd55897-8627-4005-b775-fc6e24cc54c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.115021', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f38a664-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': 'ea884d76b450bc23ad1e29181a9f48fd0bddca76a214b13c54fc3380e52e4c4e'}]}, 'timestamp': '2025-10-08 17:00:36.115320', '_unique_id': '52075674638b48cc947f1863fb12a053'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.115 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.116 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets volume: 146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'babf8b45-4409-4c36-bf6b-778ecdb8750e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 146, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.116473', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f38dc6a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': 'bad2a7ce15c5508b88dc88dc254453f3385c90ff86e3aed83910dd35b7d6a614'}]}, 'timestamp': '2025-10-08 17:00:36.116699', '_unique_id': 'b71bfed59212482cad815a0549b6e1a9'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.117 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets volume: 107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeed4dcd-15a5-4871-bc15-e86a2e8d1e65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 107, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.117941', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f3915c2-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': '521d31de4903d2db11dda621f3cddcf770fedede36bd74dcab02648edf6afb59'}]}, 'timestamp': '2025-10-08 17:00:36.118189', '_unique_id': 'bad211ff0b8a4ad9b9d6c5329068f574'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.118 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.119 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a8448cf-11fc-4b65-a92e-fa6ab89ea56a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.119376', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f394da8-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': '3aa8d86cd3fe105c16bcb46183e5dc3125ada9e50954c89137e9ae4be9376ddb'}]}, 'timestamp': '2025-10-08 17:00:36.119597', '_unique_id': '2943d7677f174657a15e3c49e1926143'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.120 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb74700d-dab8-4db7-9097-04d96ed7aaf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.120642', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f397f1c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': 'd326ac9fce283d458bb3abcd328738802dfe51b7fda4003e1a6b322e661d141e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.120642', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3986ec-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': '3407a844896f428b1a42c885bf5b7dd478f8145ebb603746e93658c8b001b98c'}]}, 'timestamp': '2025-10-08 17:00:36.121048', '_unique_id': '358b6a3612a44c70bc024961033776b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.121 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.bytes.delta volume: 13720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab17d6c-c78b-4c85-9113-e8863e08fc1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 13720, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.122161', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f39bba8-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': 'a78578c5e14c645c03f297f9b6165fa125da260646dbccca2975bad050d23213'}]}, 'timestamp': '2025-10-08 17:00:36.122455', '_unique_id': 'e2892df1e90e493592ef44e42dcb4485'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.122 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.123 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.bytes.delta volume: 20484 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5e7a90a-fa9b-4efb-98d6-d7564c1cde7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 20484, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.123648', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f39f4b0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': '58358af200c659dd4a725d919ab5756e85b48bc6d72e602240673bc1f597fb49'}]}, 'timestamp': '2025-10-08 17:00:36.123875', '_unique_id': 'd899125ed5a441799eb1e2052d7627b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.124 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.125 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.bytes volume: 136989184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.125 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cd9b082-4f68-4faf-a127-8b9587732db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136989184, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.125253', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3a35b0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '101f924cbf4cdf2d83840dba99a720390a5695efede3f735b337a5b1fcf38a3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.125253', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3a3fa6-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'bc66dc19b432ca714413f23bd59756e7bcfe11f9a72b2765522ecdd349e361b3'}]}, 'timestamp': '2025-10-08 17:00:36.125805', '_unique_id': '31af957a7caf4d879d519611bb923d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.126 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.127 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.139 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/cpu volume: 41410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e4159c0-9297-4553-84ba-d530a4bbf2e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41410000000, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'timestamp': '2025-10-08T17:00:36.127383', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '4f3c6c7c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.86258884, 'message_signature': '023a2add803165a9d6987bc5f019d8cfc38b6b8ca9dac8b404a408de20f401ac'}]}, 'timestamp': '2025-10-08 17:00:36.140110', '_unique_id': '55872c466f1047989c41fac50b0ffb94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.141 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef12892e-2288-4751-9063-016e63de8026', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.141546', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f3cb01a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': '98a9ef006fc964a1fd13a66ca14bb41c131f0ad2f1d220610d8146b1c087680b'}]}, 'timestamp': '2025-10-08 17:00:36.141783', '_unique_id': 'b9ce80ac31484bd98f6ece05ddaa8292'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.142 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.incoming.bytes volume: 16110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bb3c949-883e-4059-b410-e0ba2ab6a16b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16110, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.142847', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f3ce27e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': 'd0b7ef2757297b0085db458d54d0df87d6e7d4c66436269e0ff11e8a3774c476'}]}, 'timestamp': '2025-10-08 17:00:36.143068', '_unique_id': '8fb4ffce3888496e8c4f5fcc77e100cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.144 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.requests volume: 806 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.144 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92c18ec7-31d4-4be2-8370-5721521cc83a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 806, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.144460', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3d219e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'c3ef39de0c03560cf825817f67dc79f82a25e9e093f097bbf49a54b45f9f8724'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.144460', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3d2996-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '69271400279c9ade52076c33b8b44f958a5ea5a08a8ac3c8bfbb645c74fca805'}]}, 'timestamp': '2025-10-08 17:00:36.144878', '_unique_id': 'fdead64a77ce4ccd8484474592f02932'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.145 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/memory.usage volume: 236.1796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cce8e134-9cdd-4d26-8378-875a6b3390c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 236.1796875, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'timestamp': '2025-10-08T17:00:36.145949', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '4f3d5baa-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.86258884, 'message_signature': '69373c788b9464d61404b0d49fcb3700b1a40dae75d2b2358733c57f6bc3cecd'}]}, 'timestamp': '2025-10-08 17:00:36.146189', '_unique_id': '69517ab61b1c4970896225f7645136b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.147 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.latency volume: 9129842170 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.147 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a85fc174-c419-4d59-99e1-95138b9cbc84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9129842170, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.147287', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3d9106-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'f0f00b69aea87b042f7817a7a2f209aca1d6add2d6f2719ea6b6db6995a9480f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.147287', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3d98cc-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '0fd1d3e6676b4aa29bf5e9019bbaa806f707533ffeffd3009a64e9b48e54c008'}]}, 'timestamp': '2025-10-08 17:00:36.147732', '_unique_id': '7bdc36a0a7a44047b6c202e318e43e18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.148 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.usage volume: 152764416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62eae23f-5fd9-40ef-b6fa-b6e65b22203e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152764416, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.148876', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3dce46-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': '5ffd4383513946f3d0de49b2feee408a417c04c8e0bc7ee511a5e36dce4147fd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 
'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.148876', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3dd760-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.802419199, 'message_signature': 'f90b45c4512f9ebe3fd85a0a34728925db68b8e61fd6ae6e2a9ea2dbcae7e401'}]}, 'timestamp': '2025-10-08 17:00:36.149327', '_unique_id': '13ecba54c07247b99c3f086c44f238d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.149 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.150 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3222668a-cb7c-406f-8bf8-42d381575e8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-0000006f-0d0100d4-d5e3-4c53-b37f-275a02847eb0-tapaacfecb1-9e', 'timestamp': '2025-10-08T17:00:36.150423', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'tapaacfecb1-9e', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:6c:dd:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaacfecb1-9e'}, 'message_id': '4f3e0a8c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.833679616, 'message_signature': 'a464c7eef60981d851606c943c60f716f205d8c6ac23eee82295cf118ba5cf48'}]}, 'timestamp': '2025-10-08 17:00:36.150645', '_unique_id': '10b04fba9e64417882b9af26e42a9c98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.requests volume: 11653 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13158e98-c097-406f-b057-94623a2f7d4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11653, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.151980', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3e48a8-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '5863b66da4bd048b4343d8282efa77b0743ac9dc795f3e4abe6e92673b914a04'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.151980', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3e5258-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '28f241dbe77a122f3b4260c6d0c61885da3d9a4c57a2f397ceca5d1541726c53'}]}, 'timestamp': '2025-10-08 17:00:36.152478', '_unique_id': 'c0f0b2ccfa3a4c6cb08e70e78bc387c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.153 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.latency volume: 6521555132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.153 12 DEBUG ceilometer.compute.pollsters [-] 0d0100d4-d5e3-4c53-b37f-275a02847eb0/disk.device.read.latency volume: 47315887 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92485332-d512-4f9d-8fef-eda59339891a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6521555132, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-vda', 'timestamp': '2025-10-08T17:00:36.153691', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '4f3e89e4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': '38d1aa80a32b15b7c0e49e6215c95efe3bc4469ab0c1b92359212f01bfcf60cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47315887, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0-sda', 'timestamp': '2025-10-08T17:00:36.153691', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_external_network-960705218', 'name': 'instance-0000006f', 'instance_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '4f3e91b4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9739.815487336, 'message_signature': 'aa1f93502a30f4e0b574980761ffd2fe353e7831cc3d1cc0f68c211bcb7821c8'}]}, 'timestamp': '2025-10-08 17:00:36.154111', '_unique_id': '0346649a7bac44a697a9d0c5965531da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:00:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:00:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:00:40 np0005476733 nova_compute[192580]: 2025-10-08 17:00:40.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:00:45 np0005476733 podman[277927]: 2025-10-08 17:00:45.207726285 +0000 UTC m=+0.041917000 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 13:00:45 np0005476733 nova_compute[192580]: 2025-10-08 17:00:45.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:46 np0005476733 podman[277948]: 2025-10-08 17:00:46.254203059 +0000 UTC m=+0.084199441 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:00:49 np0005476733 podman[277975]: 2025-10-08 17:00:49.22646303 +0000 UTC m=+0.059249243 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 13:00:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:50.226 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=94, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=93) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:50 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:50.227 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.491 2 DEBUG nova.compute.manager [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.491 2 DEBUG nova.compute.manager [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing instance network info cache due to event network-changed-aacfecb1-9e82-4905-8b61-d1e0e486b87e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.491 2 DEBUG oslo_concurrency.lockutils [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.492 2 DEBUG oslo_concurrency.lockutils [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.492 2 DEBUG nova.network.neutron [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Refreshing network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:00:50 np0005476733 nova_compute[192580]: 2025-10-08 17:00:50.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:51 np0005476733 nova_compute[192580]: 2025-10-08 17:00:51.695 2 DEBUG nova.network.neutron [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updated VIF entry in instance network info cache for port aacfecb1-9e82-4905-8b61-d1e0e486b87e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:00:51 np0005476733 nova_compute[192580]: 2025-10-08 17:00:51.695 2 DEBUG nova.network.neutron [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [{"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:00:51 np0005476733 nova_compute[192580]: 2025-10-08 17:00:51.769 2 DEBUG oslo_concurrency.lockutils [req-d19ae0ed-aca1-4bce-8a8e-f9992cd7e667 req-95b9c92f-9f83-4e5c-aa19-17efb07e282d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-0d0100d4-d5e3-4c53-b37f-275a02847eb0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:00:55 np0005476733 nova_compute[192580]: 2025-10-08 17:00:55.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:00:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:00:55Z|00230|pinctrl|WARN|Dropped 101 log messages in last 46 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 13:00:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:00:55Z|00231|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:00:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:00:56.229 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '94'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:00:58 np0005476733 podman[277996]: 2025-10-08 17:00:58.255621016 +0000 UTC m=+0.070066699 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  8 13:00:58 np0005476733 podman[277997]: 2025-10-08 17:00:58.260965486 +0000 UTC m=+0.080861334 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:00:58 np0005476733 podman[277998]: 2025-10-08 17:00:58.261791893 +0000 UTC m=+0.077977262 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 13:01:00 np0005476733 nova_compute[192580]: 2025-10-08 17:01:00.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.176 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.177 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.177 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.177 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.177 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.179 2 INFO nova.compute.manager [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Terminating instance#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.179 2 DEBUG nova.compute.manager [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 13:01:02 np0005476733 kernel: tapaacfecb1-9e (unregistering): left promiscuous mode
Oct  8 13:01:02 np0005476733 NetworkManager[51699]: <info>  [1759942862.2096] device (tapaacfecb1-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:02Z|00232|binding|INFO|Releasing lport aacfecb1-9e82-4905-8b61-d1e0e486b87e from this chassis (sb_readonly=0)
Oct  8 13:01:02 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:02Z|00233|binding|INFO|Setting lport aacfecb1-9e82-4905-8b61-d1e0e486b87e down in Southbound
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:02Z|00234|binding|INFO|Removing iface tapaacfecb1-9e ovn-installed in OVS
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.227 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:dd:b7 192.168.122.193'], port_security=['fa:16:3e:6c:dd:b7 192.168.122.193'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.193/24', 'neutron:device_id': '0d0100d4-d5e3-4c53-b37f-275a02847eb0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'neutron:revision_number': '6', 'neutron:security_group_ids': '25db264a-2b7b-491b-8b39-dd7c8ac35c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=aacfecb1-9e82-4905-8b61-d1e0e486b87e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.228 103739 INFO neutron.agent.ovn.metadata.agent [-] Port aacfecb1-9e82-4905-8b61-d1e0e486b87e in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.230 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.232 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6384e95c-4074-47a2-85ca-4ff74b967bcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.232 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Oct  8 13:01:02 np0005476733 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000006f.scope: Consumed 45.456s CPU time.
Oct  8 13:01:02 np0005476733 systemd-machined[152624]: Machine qemu-68-instance-0000006f terminated.
Oct  8 13:01:02 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [NOTICE]   (276838) : haproxy version is 2.8.14-c23fe91
Oct  8 13:01:02 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [NOTICE]   (276838) : path to executable is /usr/sbin/haproxy
Oct  8 13:01:02 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [WARNING]  (276838) : Exiting Master process...
Oct  8 13:01:02 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [ALERT]    (276838) : Current worker (276840) exited with code 143 (Terminated)
Oct  8 13:01:02 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[276834]: [WARNING]  (276838) : All workers exited. Exiting... (0)
Oct  8 13:01:02 np0005476733 systemd[1]: libpod-d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485.scope: Deactivated successfully.
Oct  8 13:01:02 np0005476733 podman[278097]: 2025-10-08 17:01:02.363608692 +0000 UTC m=+0.044854913 container died d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:01:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485-userdata-shm.mount: Deactivated successfully.
Oct  8 13:01:02 np0005476733 systemd[1]: var-lib-containers-storage-overlay-044d4aa5147809967fbbb70717e1c7a5d7071ea841533a08963a0554124cb19b-merged.mount: Deactivated successfully.
Oct  8 13:01:02 np0005476733 podman[278097]: 2025-10-08 17:01:02.397419652 +0000 UTC m=+0.078665863 container cleanup d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:01:02 np0005476733 systemd[1]: libpod-conmon-d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485.scope: Deactivated successfully.
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.442 2 INFO nova.virt.libvirt.driver [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Instance destroyed successfully.#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.442 2 DEBUG nova.objects.instance [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'resources' on Instance uuid 0d0100d4-d5e3-4c53-b37f-275a02847eb0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.466 2 DEBUG nova.virt.libvirt.vif [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T16:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_marking_external_network-960705218',display_name='tempest-test_dscp_marking_external_network-960705218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-external-network-960705218',id=111,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=<?>,launch_index=0,launched_at=2025-10-08T16:57:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-6rqnqjre',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T16:57:42Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=0d0100d4-d5e3-4c53-b37f-275a02847eb0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", 
"dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.466 2 DEBUG nova.network.os_vif_util [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "address": "fa:16:3e:6c:dd:b7", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaacfecb1-9e", "ovs_interfaceid": "aacfecb1-9e82-4905-8b61-d1e0e486b87e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.467 2 DEBUG nova.network.os_vif_util [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.467 2 DEBUG os_vif [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.469 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaacfecb1-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.473 2 INFO os_vif [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:dd:b7,bridge_name='br-int',has_traffic_filtering=True,id=aacfecb1-9e82-4905-8b61-d1e0e486b87e,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaacfecb1-9e')#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.474 2 INFO nova.virt.libvirt.driver [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Deleting instance files /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0_del#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.475 2 INFO nova.virt.libvirt.driver [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Deletion of /var/lib/nova/instances/0d0100d4-d5e3-4c53-b37f-275a02847eb0_del complete#033[00m
Oct  8 13:01:02 np0005476733 podman[278131]: 2025-10-08 17:01:02.513378005 +0000 UTC m=+0.093267529 container remove d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.518 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1c2d7d-ec3b-430e-bc08-837caeeaed75]: (4, ('Wed Oct  8 05:01:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485)\nd37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485\nWed Oct  8 05:01:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (d37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485)\nd37201e2061e125b999bfdc9cc5b9846b86844e71462905ec13336b97286c485\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.519 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e83beed1-f677-4674-b6d1-0011c5ae1aee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.520 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.536 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ef960b88-5daa-479b-a97d-8a3cd95a4907]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.542 2 INFO nova.compute.manager [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.542 2 DEBUG oslo.service.loopingcall [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.542 2 DEBUG nova.compute.manager [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 13:01:02 np0005476733 nova_compute[192580]: 2025-10-08 17:01:02.543 2 DEBUG nova.network.neutron [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.563 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8b652e23-cd2d-4507-a290-5f5ac2454eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.564 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5b88b38f-9262-49e6-8e3c-5539b71d9677]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.580 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e8da96fe-ed5f-4151-8556-78ac1f942d21]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 956490, 'reachable_time': 44362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278158, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.582 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 13:01:02 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:02.583 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcab521-6f99-4759-bbb4-f3656baa249b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:01:02 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.129 2 DEBUG nova.compute.manager [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-unplugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.130 2 DEBUG oslo_concurrency.lockutils [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.130 2 DEBUG oslo_concurrency.lockutils [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.130 2 DEBUG oslo_concurrency.lockutils [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.130 2 DEBUG nova.compute.manager [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] No waiting events found dispatching network-vif-unplugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.131 2 DEBUG nova.compute.manager [req-cddee623-6c98-40d1-957b-2c1dafc21548 req-ad8a34b5-44db-423e-af9e-b44f0e51818c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-unplugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.717 2 DEBUG nova.network.neutron [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.735 2 INFO nova.compute.manager [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Took 1.19 seconds to deallocate network for instance.#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.793 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.793 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.859 2 DEBUG nova.compute.provider_tree [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.880 2 DEBUG nova.scheduler.client.report [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.903 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:03 np0005476733 nova_compute[192580]: 2025-10-08 17:01:03.931 2 INFO nova.scheduler.client.report [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Deleted allocations for instance 0d0100d4-d5e3-4c53-b37f-275a02847eb0#033[00m
Oct  8 13:01:04 np0005476733 nova_compute[192580]: 2025-10-08 17:01:04.012 2 DEBUG oslo_concurrency.lockutils [None req-8f7a377b-3214-4f7c-9fad-045ec838d31e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:04 np0005476733 podman[278159]: 2025-10-08 17:01:04.24400778 +0000 UTC m=+0.065170563 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 13:01:04 np0005476733 podman[278160]: 2025-10-08 17:01:04.265185508 +0000 UTC m=+0.085706340 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:01:04 np0005476733 nova_compute[192580]: 2025-10-08 17:01:04.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.275 2 DEBUG nova.compute.manager [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.276 2 DEBUG oslo_concurrency.lockutils [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.276 2 DEBUG oslo_concurrency.lockutils [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.276 2 DEBUG oslo_concurrency.lockutils [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "0d0100d4-d5e3-4c53-b37f-275a02847eb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.276 2 DEBUG nova.compute.manager [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] No waiting events found dispatching network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.276 2 WARNING nova.compute.manager [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received unexpected event network-vif-plugged-aacfecb1-9e82-4905-8b61-d1e0e486b87e for instance with vm_state deleted and task_state None.#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.277 2 DEBUG nova.compute.manager [req-219511f3-1c74-4669-9c35-d016e1f0369e req-a32e7d63-391d-4a3a-a4cf-17fb0f3a77d2 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Received event network-vif-deleted-aacfecb1-9e82-4905-8b61-d1e0e486b87e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:01:05 np0005476733 nova_compute[192580]: 2025-10-08 17:01:05.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:07 np0005476733 nova_compute[192580]: 2025-10-08 17:01:07.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:08 np0005476733 nova_compute[192580]: 2025-10-08 17:01:08.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:10 np0005476733 nova_compute[192580]: 2025-10-08 17:01:10.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:12 np0005476733 nova_compute[192580]: 2025-10-08 17:01:12.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:13 np0005476733 nova_compute[192580]: 2025-10-08 17:01:13.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:15 np0005476733 nova_compute[192580]: 2025-10-08 17:01:15.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:16 np0005476733 podman[278207]: 2025-10-08 17:01:16.215959256 +0000 UTC m=+0.045185725 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:01:16 np0005476733 nova_compute[192580]: 2025-10-08 17:01:16.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:16 np0005476733 nova_compute[192580]: 2025-10-08 17:01:16.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:01:16 np0005476733 nova_compute[192580]: 2025-10-08 17:01:16.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:01:16 np0005476733 nova_compute[192580]: 2025-10-08 17:01:16.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:01:17 np0005476733 podman[278227]: 2025-10-08 17:01:17.255786127 +0000 UTC m=+0.088839389 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:01:17 np0005476733 nova_compute[192580]: 2025-10-08 17:01:17.439 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759942862.4383001, 0d0100d4-d5e3-4c53-b37f-275a02847eb0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:01:17 np0005476733 nova_compute[192580]: 2025-10-08 17:01:17.440 2 INFO nova.compute.manager [-] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] VM Stopped (Lifecycle Event)#033[00m
Oct  8 13:01:17 np0005476733 nova_compute[192580]: 2025-10-08 17:01:17.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:17 np0005476733 nova_compute[192580]: 2025-10-08 17:01:17.493 2 DEBUG nova.compute.manager [None req-0ee823f7-cb2b-465b-adb7-b153f53e314b - - - - - -] [instance: 0d0100d4-d5e3-4c53-b37f-275a02847eb0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:01:17 np0005476733 nova_compute[192580]: 2025-10-08 17:01:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:20 np0005476733 podman[278253]: 2025-10-08 17:01:20.253801701 +0000 UTC m=+0.080944308 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct  8 13:01:20 np0005476733 nova_compute[192580]: 2025-10-08 17:01:20.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:22 np0005476733 nova_compute[192580]: 2025-10-08 17:01:22.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.619 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.772 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.773 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13656MB free_disk=111.30171203613281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.774 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.774 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.858 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.859 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.880 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.928 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.982 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:01:25 np0005476733 nova_compute[192580]: 2025-10-08 17:01:25.982 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:26.438 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:26.438 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:01:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:26.438 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:01:26 np0005476733 nova_compute[192580]: 2025-10-08 17:01:26.984 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:27 np0005476733 nova_compute[192580]: 2025-10-08 17:01:27.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:29 np0005476733 podman[278277]: 2025-10-08 17:01:29.21920149 +0000 UTC m=+0.043685406 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:01:29 np0005476733 podman[278276]: 2025-10-08 17:01:29.235058756 +0000 UTC m=+0.062882909 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 13:01:29 np0005476733 podman[278278]: 2025-10-08 17:01:29.238992492 +0000 UTC m=+0.056915849 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 13:01:30 np0005476733 nova_compute[192580]: 2025-10-08 17:01:30.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:32 np0005476733 nova_compute[192580]: 2025-10-08 17:01:32.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:32 np0005476733 nova_compute[192580]: 2025-10-08 17:01:32.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:34 np0005476733 nova_compute[192580]: 2025-10-08 17:01:34.592 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:34 np0005476733 nova_compute[192580]: 2025-10-08 17:01:34.593 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 13:01:35 np0005476733 podman[278340]: 2025-10-08 17:01:35.219495335 +0000 UTC m=+0.043905153 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:01:35 np0005476733 podman[278339]: 2025-10-08 17:01:35.229015269 +0000 UTC m=+0.056305389 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 13:01:35 np0005476733 nova_compute[192580]: 2025-10-08 17:01:35.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:37 np0005476733 nova_compute[192580]: 2025-10-08 17:01:37.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:40 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:40Z|00235|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Oct  8 13:01:40 np0005476733 nova_compute[192580]: 2025-10-08 17:01:40.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:42 np0005476733 nova_compute[192580]: 2025-10-08 17:01:42.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:45 np0005476733 nova_compute[192580]: 2025-10-08 17:01:45.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:45 np0005476733 nova_compute[192580]: 2025-10-08 17:01:45.599 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:01:47 np0005476733 podman[278381]: 2025-10-08 17:01:47.206708418 +0000 UTC m=+0.040905117 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:01:47 np0005476733 nova_compute[192580]: 2025-10-08 17:01:47.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:48 np0005476733 podman[278400]: 2025-10-08 17:01:48.240765155 +0000 UTC m=+0.074138549 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 13:01:50 np0005476733 nova_compute[192580]: 2025-10-08 17:01:50.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:51 np0005476733 podman[278426]: 2025-10-08 17:01:51.210951228 +0000 UTC m=+0.044709699 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:01:52 np0005476733 nova_compute[192580]: 2025-10-08 17:01:52.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:55 np0005476733 nova_compute[192580]: 2025-10-08 17:01:55.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:57 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:57Z|00236|pinctrl|WARN|Dropped 233 log messages in last 62 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 13:01:57 np0005476733 ovn_controller[263831]: 2025-10-08T17:01:57Z|00237|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:01:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:57.440 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=95, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=94) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:01:57 np0005476733 nova_compute[192580]: 2025-10-08 17:01:57.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:01:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:01:57.441 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:01:57 np0005476733 nova_compute[192580]: 2025-10-08 17:01:57.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:00 np0005476733 podman[278447]: 2025-10-08 17:02:00.229143825 +0000 UTC m=+0.057044293 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:02:00 np0005476733 podman[278448]: 2025-10-08 17:02:00.249939929 +0000 UTC m=+0.074184759 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:02:00 np0005476733 podman[278449]: 2025-10-08 17:02:00.25683995 +0000 UTC m=+0.079955965 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Oct  8 13:02:00 np0005476733 nova_compute[192580]: 2025-10-08 17:02:00.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:02 np0005476733 nova_compute[192580]: 2025-10-08 17:02:02.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:04 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:04.442 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '95'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:04 np0005476733 nova_compute[192580]: 2025-10-08 17:02:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:04 np0005476733 nova_compute[192580]: 2025-10-08 17:02:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:04 np0005476733 nova_compute[192580]: 2025-10-08 17:02:04.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:02:05 np0005476733 nova_compute[192580]: 2025-10-08 17:02:05.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:05 np0005476733 nova_compute[192580]: 2025-10-08 17:02:05.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:06 np0005476733 podman[278510]: 2025-10-08 17:02:06.246905239 +0000 UTC m=+0.058199330 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:02:06 np0005476733 podman[278509]: 2025-10-08 17:02:06.25789373 +0000 UTC m=+0.083183358 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:02:07 np0005476733 nova_compute[192580]: 2025-10-08 17:02:07.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.201 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.201 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.216 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.304 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.304 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.313 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.313 2 INFO nova.compute.claims [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.460 2 DEBUG nova.compute.provider_tree [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.483 2 DEBUG nova.scheduler.client.report [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.517 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.517 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.572 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.574 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.596 2 INFO nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.612 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.720 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.721 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.721 2 INFO nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Creating image(s)#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.722 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.722 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.723 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.735 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.759 2 DEBUG nova.policy [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.801 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.802 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.802 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.815 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.872 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.873 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.909 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk 10737418240" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.910 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.911 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.962 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.963 2 DEBUG nova.objects.instance [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.978 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.979 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Ensure instance console log exists: /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.979 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.980 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:09 np0005476733 nova_compute[192580]: 2025-10-08 17:02:09.980 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:10 np0005476733 nova_compute[192580]: 2025-10-08 17:02:10.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:10 np0005476733 nova_compute[192580]: 2025-10-08 17:02:10.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:12 np0005476733 nova_compute[192580]: 2025-10-08 17:02:12.624 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Successfully created port: 515844fc-ac1c-4a41-afdb-496edaf8d2ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 13:02:12 np0005476733 nova_compute[192580]: 2025-10-08 17:02:12.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.312 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Successfully updated port: 515844fc-ac1c-4a41-afdb-496edaf8d2ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.342 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.343 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.343 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.466 2 DEBUG nova.compute.manager [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.467 2 DEBUG nova.compute.manager [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.467 2 DEBUG oslo_concurrency.lockutils [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:14 np0005476733 nova_compute[192580]: 2025-10-08 17:02:14.903 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 13:02:15 np0005476733 nova_compute[192580]: 2025-10-08 17:02:15.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.126 2 DEBUG nova.network.neutron [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.148 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.148 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Instance network_info: |[{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.148 2 DEBUG oslo_concurrency.lockutils [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.148 2 DEBUG nova.network.neutron [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.150 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Start _get_guest_xml network_info=[{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.154 2 WARNING nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.164 2 DEBUG nova.virt.libvirt.host [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.164 2 DEBUG nova.virt.libvirt.host [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.166 2 DEBUG nova.virt.libvirt.host [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.167 2 DEBUG nova.virt.libvirt.host [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.167 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.168 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.168 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.168 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.168 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.169 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.169 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.169 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.169 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.169 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.170 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.170 2 DEBUG nova.virt.hardware [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.173 2 DEBUG nova.virt.libvirt.vif [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T17:02:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_south_north-1351956767',display_name='tempest-test_dscp_marking_south_north-1351956767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-south-north-1351956767',id=114,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-okufdn5f',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T17:02:09Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=b62bae97-ef9e-4f29-8e84-9731b7b2dc32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.173 2 DEBUG nova.network.os_vif_util [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.174 2 DEBUG nova.network.os_vif_util [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.175 2 DEBUG nova.objects.instance [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.195 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] End _get_guest_xml xml=<domain type="kvm">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <uuid>b62bae97-ef9e-4f29-8e84-9731b7b2dc32</uuid>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <name>instance-00000072</name>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dscp_marking_south_north-1351956767</nova:name>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 17:02:16</nova:creationTime>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:user uuid="c7e76ef370424761ad76d4119e9bf895">tempest-QosTestCommon-951807005-project-member</nova:user>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:project uuid="1bfc3ffe3bf745dfbb59f63a07f1e1a9">tempest-QosTestCommon-951807005</nova:project>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        <nova:port uuid="515844fc-ac1c-4a41-afdb-496edaf8d2ad">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.1.210" ipVersion="4"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <system>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="serial">b62bae97-ef9e-4f29-8e84-9731b7b2dc32</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="uuid">b62bae97-ef9e-4f29-8e84-9731b7b2dc32</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </system>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <os>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </os>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <features>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </features>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </clock>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  <devices>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </disk>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.config"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </disk>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:8c:c7:90"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <mtu size="1342"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <target dev="tap515844fc-ac"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </interface>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/console.log" append="off"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </serial>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <video>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </video>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </rng>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 13:02:16 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 13:02:16 np0005476733 nova_compute[192580]:  </devices>
Oct  8 13:02:16 np0005476733 nova_compute[192580]: </domain>
Oct  8 13:02:16 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.196 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Preparing to wait for external event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.196 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.197 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.197 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.197 2 DEBUG nova.virt.libvirt.vif [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T17:02:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_marking_south_north-1351956767',display_name='tempest-test_dscp_marking_south_north-1351956767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-south-north-1351956767',id=114,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-okufdn5f',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T17:02:09Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=b62bae97-ef9e-4f29-8e84-9731b7b2dc32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.198 2 DEBUG nova.network.os_vif_util [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.198 2 DEBUG nova.network.os_vif_util [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.199 2 DEBUG os_vif [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.199 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap515844fc-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.202 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap515844fc-ac, col_values=(('external_ids', {'iface-id': '515844fc-ac1c-4a41-afdb-496edaf8d2ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:c7:90', 'vm-uuid': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.212 2 INFO os_vif [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac')#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.265 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.266 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.266 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] No VIF found with MAC fa:16:3e:8c:c7:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.267 2 INFO nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Using config drive#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.613 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.614 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.614 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.979 2 INFO nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Creating config drive at /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.config#033[00m
Oct  8 13:02:16 np0005476733 nova_compute[192580]: 2025-10-08 17:02:16.984 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_yaukr9s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.110 2 DEBUG oslo_concurrency.processutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_yaukr9s" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:02:17 np0005476733 kernel: tap515844fc-ac: entered promiscuous mode
Oct  8 13:02:17 np0005476733 NetworkManager[51699]: <info>  [1759942937.1659] manager: (tap515844fc-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/328)
Oct  8 13:02:17 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:17Z|00238|binding|INFO|Claiming lport 515844fc-ac1c-4a41-afdb-496edaf8d2ad for this chassis.
Oct  8 13:02:17 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:17Z|00239|binding|INFO|515844fc-ac1c-4a41-afdb-496edaf8d2ad: Claiming fa:16:3e:8c:c7:90 192.168.1.210
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.175 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:c7:90 192.168.1.210'], port_security=['fa:16:3e:8c:c7:90 192.168.1.210'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.210/24', 'neutron:device_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90f23013-7525-484c-9fdb-1b08a96d01b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '25db264a-2b7b-491b-8b39-dd7c8ac35c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dc09949-30f3-4269-9199-dde0aaedfe99, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=515844fc-ac1c-4a41-afdb-496edaf8d2ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.176 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 515844fc-ac1c-4a41-afdb-496edaf8d2ad in datapath 90f23013-7525-484c-9fdb-1b08a96d01b7 bound to our chassis#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.178 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90f23013-7525-484c-9fdb-1b08a96d01b7#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:17Z|00240|binding|INFO|Setting lport 515844fc-ac1c-4a41-afdb-496edaf8d2ad ovn-installed in OVS
Oct  8 13:02:17 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:17Z|00241|binding|INFO|Setting lport 515844fc-ac1c-4a41-afdb-496edaf8d2ad up in Southbound
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.189 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c28e76d9-96c4-4659-9564-330cf323976f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.190 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90f23013-71 in ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.193 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90f23013-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.193 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[efb0c695-f492-4641-a874-75b2890d5c28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.194 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e7cf621f-f925-49a6-9331-6a88dc7687bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 systemd-udevd[278585]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.205 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[940bf159-bbcd-4653-9bc8-ef22413e6cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 systemd-machined[152624]: New machine qemu-69-instance-00000072.
Oct  8 13:02:17 np0005476733 NetworkManager[51699]: <info>  [1759942937.2127] device (tap515844fc-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 13:02:17 np0005476733 NetworkManager[51699]: <info>  [1759942937.2137] device (tap515844fc-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 13:02:17 np0005476733 systemd[1]: Started Virtual Machine qemu-69-instance-00000072.
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.228 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8f384db8-c71d-42bb-8138-e7836113fd17]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.259 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[d19d25d2-de88-453b-9f0a-a202254afabd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.264 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddded71-baf6-4d4c-a49d-1920e4d0867c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 NetworkManager[51699]: <info>  [1759942937.2672] manager: (tap90f23013-70): new Veth device (/org/freedesktop/NetworkManager/Devices/329)
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.301 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[a86850aa-d8d6-4610-8e89-5ad76ff2ce02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.304 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c97def-7295-4d1d-8c19-b79166e0afc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 podman[278591]: 2025-10-08 17:02:17.305160921 +0000 UTC m=+0.062175977 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 13:02:17 np0005476733 NetworkManager[51699]: <info>  [1759942937.3295] device (tap90f23013-70): carrier: link connected
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.335 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[bc695f2d-a00c-4522-8e7c-3c01cf10effe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.354 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8d4edd-fa00-43b1-ad18-a13d79f40dbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90f23013-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:6b:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 984099, 'reachable_time': 26102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278634, 'error': None, 'target': 'ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.369 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6ca01c-5920-4f40-8cf0-1f9a0febe54a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:6ba0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 984099, 'tstamp': 984099}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278635, 'error': None, 'target': 'ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.389 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[62c63213-1f1e-464e-90f3-fa503ac15054]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90f23013-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:6b:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 984099, 'reachable_time': 26102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278636, 'error': None, 'target': 'ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.420 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b176825-42bc-4db9-a3e7-f07ee0d052b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.473 2 DEBUG nova.compute.manager [req-e12aef91-9fb7-4080-81b1-47891da82ea0 req-270ec8f5-b68a-4480-b0b5-b1f5b42c50e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.474 2 DEBUG oslo_concurrency.lockutils [req-e12aef91-9fb7-4080-81b1-47891da82ea0 req-270ec8f5-b68a-4480-b0b5-b1f5b42c50e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.474 2 DEBUG oslo_concurrency.lockutils [req-e12aef91-9fb7-4080-81b1-47891da82ea0 req-270ec8f5-b68a-4480-b0b5-b1f5b42c50e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.475 2 DEBUG oslo_concurrency.lockutils [req-e12aef91-9fb7-4080-81b1-47891da82ea0 req-270ec8f5-b68a-4480-b0b5-b1f5b42c50e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.475 2 DEBUG nova.compute.manager [req-e12aef91-9fb7-4080-81b1-47891da82ea0 req-270ec8f5-b68a-4480-b0b5-b1f5b42c50e9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Processing event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.492 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[544762ca-8662-42b9-a401-d00c3e855a07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.495 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90f23013-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.496 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.496 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90f23013-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 kernel: tap90f23013-70: entered promiscuous mode
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.504 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90f23013-70, col_values=(('external_ids', {'iface-id': '2481e32f-1b38-4415-a6aa-73dcb50ed723'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:02:17 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:17Z|00242|binding|INFO|Releasing lport 2481e32f-1b38-4415-a6aa-73dcb50ed723 from this chassis (sb_readonly=0)
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.509 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90f23013-7525-484c-9fdb-1b08a96d01b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90f23013-7525-484c-9fdb-1b08a96d01b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.512 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a09e41d3-5d98-41ee-b120-efa451651f71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.514 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-90f23013-7525-484c-9fdb-1b08a96d01b7
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/90f23013-7525-484c-9fdb-1b08a96d01b7.pid.haproxy
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 90f23013-7525-484c-9fdb-1b08a96d01b7
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 13:02:17 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:17.516 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7', 'env', 'PROCESS_TAG=haproxy-90f23013-7525-484c-9fdb-1b08a96d01b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90f23013-7525-484c-9fdb-1b08a96d01b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 13:02:17 np0005476733 nova_compute[192580]: 2025-10-08 17:02:17.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:17 np0005476733 podman[278668]: 2025-10-08 17:02:17.838703802 +0000 UTC m=+0.043478069 container create f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:02:17 np0005476733 systemd[1]: Started libpod-conmon-f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0.scope.
Oct  8 13:02:17 np0005476733 systemd[1]: Started libcrun container.
Oct  8 13:02:17 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21d3fd576f0e5452f473b945c869ddf820f5f40dda16f5160e76590c14c467fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 13:02:17 np0005476733 podman[278668]: 2025-10-08 17:02:17.815787161 +0000 UTC m=+0.020561408 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 13:02:17 np0005476733 podman[278668]: 2025-10-08 17:02:17.920889317 +0000 UTC m=+0.125663594 container init f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:02:17 np0005476733 podman[278668]: 2025-10-08 17:02:17.925916018 +0000 UTC m=+0.130690265 container start f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 13:02:17 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [NOTICE]   (278687) : New worker (278689) forked
Oct  8 13:02:17 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [NOTICE]   (278687) : Loading success.
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.197 2 DEBUG nova.network.neutron [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.198 2 DEBUG nova.network.neutron [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.215 2 DEBUG oslo_concurrency.lockutils [req-ad152035-b782-49d9-82f2-d8448177db97 req-e65c3fe9-314a-40df-8ca9-ca421fc91075 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.616 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.617 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942938.6169744, b62bae97-ef9e-4f29-8e84-9731b7b2dc32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.618 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] VM Started (Lifecycle Event)#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.622 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.626 2 INFO nova.virt.libvirt.driver [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Instance spawned successfully.#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.626 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.650 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.656 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.659 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.659 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.660 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.660 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.661 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.661 2 DEBUG nova.virt.libvirt.driver [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.700 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.700 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942938.6174898, b62bae97-ef9e-4f29-8e84-9731b7b2dc32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.700 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] VM Paused (Lifecycle Event)#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.736 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.740 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759942938.6203198, b62bae97-ef9e-4f29-8e84-9731b7b2dc32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.740 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] VM Resumed (Lifecycle Event)#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.753 2 INFO nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Took 9.03 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.754 2 DEBUG nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.767 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.769 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.790 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.817 2 INFO nova.compute.manager [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Took 9.55 seconds to build instance.#033[00m
Oct  8 13:02:18 np0005476733 nova_compute[192580]: 2025-10-08 17:02:18.839 2 DEBUG oslo_concurrency.lockutils [None req-8bae79fc-c2ae-4014-b84b-9077d4be65ea c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:19 np0005476733 podman[278705]: 2025-10-08 17:02:19.255991609 +0000 UTC m=+0.083877710 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.580 2 DEBUG nova.compute.manager [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.580 2 DEBUG oslo_concurrency.lockutils [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.581 2 DEBUG oslo_concurrency.lockutils [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.581 2 DEBUG oslo_concurrency.lockutils [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.581 2 DEBUG nova.compute.manager [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] No waiting events found dispatching network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.582 2 WARNING nova.compute.manager [req-1666f0d6-45c2-45fe-839a-48ad5a3deda2 req-ca52d7ff-2a13-4e93-b3ed-dc05c98165c9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received unexpected event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad for instance with vm_state active and task_state None.#033[00m
Oct  8 13:02:19 np0005476733 nova_compute[192580]: 2025-10-08 17:02:19.604 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:20 np0005476733 nova_compute[192580]: 2025-10-08 17:02:20.519 2 INFO nova.compute.manager [None req-f52f39da-f4ac-4cc8-94ae-112c2fe0a246 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output#033[00m
Oct  8 13:02:20 np0005476733 nova_compute[192580]: 2025-10-08 17:02:20.524 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:02:20 np0005476733 nova_compute[192580]: 2025-10-08 17:02:20.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:21 np0005476733 nova_compute[192580]: 2025-10-08 17:02:21.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:22 np0005476733 podman[278731]: 2025-10-08 17:02:22.231352 +0000 UTC m=+0.058903062 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 13:02:25 np0005476733 nova_compute[192580]: 2025-10-08 17:02:25.701 2 INFO nova.compute.manager [None req-e1c5e3f4-500d-4bff-a8a7-d6db51805e6e c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output
Oct  8 13:02:25 np0005476733 nova_compute[192580]: 2025-10-08 17:02:25.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:26.439 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:26.440 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:02:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:02:26.441 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.619 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.689 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.750 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.752 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.811 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.961 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.962 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13361MB free_disk=111.29995346069336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.963 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:02:26 np0005476733 nova_compute[192580]: 2025-10-08 17:02:26.963 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.142 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.143 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.143 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.161 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.180 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.181 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.195 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.214 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.256 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.274 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.314 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 13:02:27 np0005476733 nova_compute[192580]: 2025-10-08 17:02:27.315 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:02:28 np0005476733 nova_compute[192580]: 2025-10-08 17:02:28.316 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:02:30 np0005476733 nova_compute[192580]: 2025-10-08 17:02:30.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:02:30 np0005476733 nova_compute[192580]: 2025-10-08 17:02:30.850 2 INFO nova.compute.manager [None req-742049bb-e508-4740-a210-873eca75ff10 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output
Oct  8 13:02:30 np0005476733 nova_compute[192580]: 2025-10-08 17:02:30.855 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 13:02:31 np0005476733 nova_compute[192580]: 2025-10-08 17:02:31.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:02:31 np0005476733 podman[278758]: 2025-10-08 17:02:31.220519657 +0000 UTC m=+0.053328264 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  8 13:02:31 np0005476733 podman[278759]: 2025-10-08 17:02:31.221040264 +0000 UTC m=+0.048327304 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:02:31 np0005476733 podman[278760]: 2025-10-08 17:02:31.23502267 +0000 UTC m=+0.060614086 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 13:02:34 np0005476733 nova_compute[192580]: 2025-10-08 17:02:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:02:35 np0005476733 nova_compute[192580]: 2025-10-08 17:02:35.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:02:36 np0005476733 nova_compute[192580]: 2025-10-08 17:02:36.029 2 INFO nova.compute.manager [None req-e082201a-67e2-4f04-b15f-1486ae83c4b6 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.079 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'name': 'tempest-test_dscp_marking_south_north-1351956767', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'hostId': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.096 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/cpu volume: 16810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '221c6b4f-7458-458f-b171-3a45cc712726', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16810000000, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'timestamp': '2025-10-08T17:02:36.080871', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '96bc62a0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.81932054, 'message_signature': '8a6224684e68df84d63b79e6ce73c8afe58595fac3da2bd48c23e676d571de57'}]}, 'timestamp': '2025-10-08 17:02:36.096974', '_unique_id': '45d0155ca87b4136984ce8cc44e413f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.110 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.110 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '950dd113-1757-4ede-a994-0e9a07a2f04f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.099124', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96be7c66-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': 'b63dc03be7c6d2542b7e6e2d89bd96f4c69e74683f4283ee2a7642b707b3fa7b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.099124', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96be88fa-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': '143bee53a2a5699469d86fe09ccb7b68b895a125c4d4508d469454254331dbb9'}]}, 'timestamp': '2025-10-08 17:02:36.111023', '_unique_id': 'e0c57faec91d4c998aa6d071432badee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.111 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.115 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b62bae97-ef9e-4f29-8e84-9731b7b2dc32 / tap515844fc-ac inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.116 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9d174fa-873e-4cf2-aaed-5335bd08cda8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.112650', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96bf602c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': 'a3c4c4841eca40665c40aeab6acf29ccc63f2f4a3a425a2581daa1adcbfc4239'}]}, 'timestamp': '2025-10-08 17:02:36.116544', '_unique_id': 'fd2b7f1074f1470d8f53e0086779c06b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.136 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 51604448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.137 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e4c22e4-5e1a-4ae6-a797-e1b61845d038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51604448, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.117828', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c29026-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': 'de39366f9bbc47fce2b4831c740cf1fb22715572d6342eb553579d8aed1821fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.117828', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c29b5c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '3362e289393794fcca096474a8d64129117b9d348b55613ff0f4d3b120381892'}]}, 'timestamp': '2025-10-08 17:02:36.137686', '_unique_id': '69ae919097424ed0a33e5f5410061f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.138 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.139 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.139 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>]
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.139 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2de792a5-c28f-4b8d-9f22-037d11dc8c4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.139813', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c2f9da-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '71a1b9903155d40202262aa676a0fccbe2875b5cd1876a8f3e8b51f17efe2a2b'}]}, 'timestamp': '2025-10-08 17:02:36.140110', '_unique_id': '7f558f72f7c340558b2abf2a9afc12d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.140 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>]
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>]
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 2752512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1f57c4b-576e-4af1-8fa4-b7ca5b4fcd27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 2752512, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.141975', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c34eee-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': '17766bd91c4dab22556eac59f0b8d35092d4d014d0a75276b1e54bfc3f703a4e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 
'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.141975', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c35812-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': '6ac1a30be62a02e56a5744589a32e71f8b8a2299c90ff96d68d0f7bff9c4b69a'}]}, 'timestamp': '2025-10-08 17:02:36.142490', '_unique_id': '5425885de6284b11a3e02911e66e7f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.142 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.143 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 6327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.143 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fdf3322-b3d8-4d55-b646-c1b1d84127f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6327, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.143649', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c38efe-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '6d5e8f99749a4157d9aefb4e4fa0497df826bbde5566774ca7a3bf2a4fd7f428'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 22, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.143649', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c39930-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '3561822fc77165af38d136e1d80ee7b4af09dc3716b5798e7020d2d55d553c59'}]}, 'timestamp': '2025-10-08 17:02:36.144174', '_unique_id': '78a7a256ba9b496e9f5b68cf16c8a7b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.144 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.145 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8b4c63a-d44c-4801-a753-eb4d89b871b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.145283', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c3cf04-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '1e55c4645d7fd0a12efa8fc38124b8133251e268fd2d54211db8956213f63615'}]}, 'timestamp': '2025-10-08 17:02:36.145579', '_unique_id': '1d8a41ff868f44a7aedd7c31a0bb5803'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.146 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb6b6078-567b-4a2c-914a-63d882deda7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.146665', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c405be-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '2be25cfbd5d968f0673d39c71423eb464bafbbb42cd774a4ef8c471bb76baf20'}]}, 'timestamp': '2025-10-08 17:02:36.146980', '_unique_id': 'db1f523ce0f74b4c840683ee33b04ef8'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc2628cd-164d-43af-8077-56a5eb59a23b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.148064', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c43c96-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '03dbb54c1191c616a0eec7936497fd87a71e7afc3b485556faa8c84bd692df7b'}]}, 'timestamp': '2025-10-08 17:02:36.148383', '_unique_id': '7e7b2c2886ef474ea5c15206df38c445'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.149 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 3350528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.149 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef881c88-9ab5-4a61-b04d-05dd3966acb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 3350528, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.149581', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c476ac-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': '726990ff0882a0f3746f645113517f7d13e21df1b0282eb0ccc4583baf944c50'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.149581', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c480b6-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.822105849, 'message_signature': '85cab175cbc30c353060fa0a55890be701a89d15433019b5ac678d86017e238f'}]}, 'timestamp': '2025-10-08 17:02:36.150087', '_unique_id': '16686884d085472eb047b90108aa9d28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aac18e4d-5218-46a0-b5bd-183a71fc4695', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.151224', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c4b6f8-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': 'a9aa47c6791a081f5321ead0b89dd86a22691d6db7fe3fbfc77c7df0c952c6b2'}]}, 'timestamp': '2025-10-08 17:02:36.151516', '_unique_id': 'b4fe6940e381445d95cfe29da25ce33f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.151 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.152 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc8d627e-9bb2-4ee9-8bf7-d0a04307cc19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.152581', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c4ebfa-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '451a25b87f6cd3c0a85276d2dde0f5d73184060fae070753c0753b119e631a67'}]}, 'timestamp': '2025-10-08 17:02:36.152872', '_unique_id': 'ca13c6fc341b4f9b9f2860ad9bdcbae2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.153 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 123513344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 55476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fe51fdc-afc6-4ddd-87e9-c5395cbf1cb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 123513344, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.153946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c521d8-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': 'fc059abb10e91adafcbee594ec57cacaf6788db383c40464cf4c3efafa1d7475'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55476, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.153946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c52c5a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '1026273b9ce072d4fac2dea5917e03b44c0bba3a456c93bc07f40aab249fb463'}]}, 'timestamp': '2025-10-08 17:02:36.154478', '_unique_id': 'c1593278f37346b2b97c292f6ed289cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.155 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 3142443450 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 17049410 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1f36102-119e-4baf-bc5d-6b1feb1a0b1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3142443450, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.155775', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c568b4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '5b69f53560523418a8c9143f6ec3f34caa9b79cb49bdbbfb51a9d1529f75086b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17049410, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.155775', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c57372-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '45905d649b3e0f086c8d4c861e064d7aafd0afbd083e5ed281ac08464741b0cb'}]}, 'timestamp': '2025-10-08 17:02:36.156299', '_unique_id': '9af1162bb14c4e75a0e461fbbb80f07a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32: ceilometer.compute.pollsters.NoVolumeException
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.157 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75563960-13dc-479b-9b81-4497f816f28b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.157675', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c5b1ac-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '002fa6577ecd3ba8c956d7bb24b0d4bc2e346df3e1087e1d06a927ffe694a141'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.157675', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c5b968-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '348c7efc485010f2137a8351096d0bc93c6f3ee103948fe40e20075cf244f491'}]}, 'timestamp': '2025-10-08 17:02:36.158081', '_unique_id': '7fa030fb71494952a539b88c090701dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_marking_south_north-1351956767>]
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 2100736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.159 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0005920-59d2-476c-b39a-994fd78d5644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2100736, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:02:36.159597', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '96c5fcde-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': 'a5c9ffcefa4826e5a3007cdc37f17a0e835f5f7317f27f5dba74a78b5ae3f785'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:02:36.159597', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '96c60486-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.840801115, 'message_signature': '593ca951c835a6a919ee3948e765a5872d1a9619100ba2cad6b703bde9377776'}]}, 'timestamp': '2025-10-08 17:02:36.160002', '_unique_id': 'c30a4751dd544e9c88602b741d500555'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81bd145-f0f0-4e61-9481-6e9fc49c1b65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.161197', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c63c62-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '7acfc81dd46ee864310158b15b1348135a364e5df561c3d07bb1617e90c07a12'}]}, 'timestamp': '2025-10-08 17:02:36.161458', '_unique_id': 'b6b534547ce0444ead27b421399998ef'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.161 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.162 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de86c7f6-809c-4708-b261-fcea610852be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.162539', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c66fde-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '774e7ea17c8666544fae600b35739b13968e41aff202b3c49a00627c6cdc0108'}]}, 'timestamp': '2025-10-08 17:02:36.162776', '_unique_id': 'fdee632096fc4372a6649aa88f128c54'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0996532e-14b5-4880-8c75-f3cb6877ccf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:02:36.164067', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '96c6af12-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9859.835642561, 'message_signature': '65660e7d219c10081c8d5783e973b9a30f3d15d56b75d19aebd0cc8ace028c1c'}]}, 'timestamp': '2025-10-08 17:02:36.164385', '_unique_id': 'e1e689cc3e724cef9c3fd13f4b8f0d77'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:02:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:02:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:02:36 np0005476733 nova_compute[192580]: 2025-10-08 17:02:36.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:36 np0005476733 nova_compute[192580]: 2025-10-08 17:02:36.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:36 np0005476733 nova_compute[192580]: 2025-10-08 17:02:36.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 13:02:36 np0005476733 nova_compute[192580]: 2025-10-08 17:02:36.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 13:02:37 np0005476733 podman[278827]: 2025-10-08 17:02:37.222024091 +0000 UTC m=+0.050179184 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:02:37 np0005476733 podman[278826]: 2025-10-08 17:02:37.258028441 +0000 UTC m=+0.089480689 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:02:39 np0005476733 nova_compute[192580]: 2025-10-08 17:02:39.047 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:02:39 np0005476733 nova_compute[192580]: 2025-10-08 17:02:39.071 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Triggering sync for uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 13:02:39 np0005476733 nova_compute[192580]: 2025-10-08 17:02:39.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:02:39 np0005476733 nova_compute[192580]: 2025-10-08 17:02:39.072 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:02:39 np0005476733 nova_compute[192580]: 2025-10-08 17:02:39.098 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:02:40 np0005476733 nova_compute[192580]: 2025-10-08 17:02:40.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:41 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:41Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:c7:90 192.168.1.210
Oct  8 13:02:41 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:41Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:c7:90 192.168.1.210
Oct  8 13:02:41 np0005476733 nova_compute[192580]: 2025-10-08 17:02:41.192 2 INFO nova.compute.manager [None req-3000a455-5b90-4cac-b60f-69729954bf14 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output#033[00m
Oct  8 13:02:41 np0005476733 nova_compute[192580]: 2025-10-08 17:02:41.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:45 np0005476733 nova_compute[192580]: 2025-10-08 17:02:45.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:46 np0005476733 nova_compute[192580]: 2025-10-08 17:02:46.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:46 np0005476733 nova_compute[192580]: 2025-10-08 17:02:46.321 2 INFO nova.compute.manager [None req-a4647e0f-2adb-47c9-96c1-5cad633d5038 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output#033[00m
Oct  8 13:02:46 np0005476733 nova_compute[192580]: 2025-10-08 17:02:46.326 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:02:46 np0005476733 nova_compute[192580]: 2025-10-08 17:02:46.329 2 INFO nova.virt.libvirt.driver [None req-a4647e0f-2adb-47c9-96c1-5cad633d5038 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Truncated console log returned, 1366 bytes ignored#033[00m
Oct  8 13:02:47 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:47Z|00243|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 13:02:48 np0005476733 podman[278868]: 2025-10-08 17:02:48.227891319 +0000 UTC m=+0.057438865 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct  8 13:02:50 np0005476733 podman[278887]: 2025-10-08 17:02:50.259332832 +0000 UTC m=+0.089642034 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:02:50 np0005476733 nova_compute[192580]: 2025-10-08 17:02:50.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:51 np0005476733 nova_compute[192580]: 2025-10-08 17:02:51.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:51 np0005476733 nova_compute[192580]: 2025-10-08 17:02:51.494 2 INFO nova.compute.manager [None req-38a1c1ce-f7fc-433b-927d-5222ecc3e775 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Get console output#033[00m
Oct  8 13:02:51 np0005476733 nova_compute[192580]: 2025-10-08 17:02:51.500 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:02:51 np0005476733 nova_compute[192580]: 2025-10-08 17:02:51.503 2 INFO nova.virt.libvirt.driver [None req-38a1c1ce-f7fc-433b-927d-5222ecc3e775 c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Truncated console log returned, 3266 bytes ignored#033[00m
Oct  8 13:02:53 np0005476733 podman[278935]: 2025-10-08 17:02:53.235149177 +0000 UTC m=+0.067385383 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 13:02:53 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:53Z|00244|pinctrl|WARN|Dropped 359 log messages in last 56 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 13:02:53 np0005476733 ovn_controller[263831]: 2025-10-08T17:02:53Z|00245|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:02:53 np0005476733 nova_compute[192580]: 2025-10-08 17:02:53.940 2 DEBUG nova.compute.manager [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:02:53 np0005476733 nova_compute[192580]: 2025-10-08 17:02:53.940 2 DEBUG nova.compute.manager [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:02:53 np0005476733 nova_compute[192580]: 2025-10-08 17:02:53.940 2 DEBUG oslo_concurrency.lockutils [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:02:53 np0005476733 nova_compute[192580]: 2025-10-08 17:02:53.940 2 DEBUG oslo_concurrency.lockutils [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:02:53 np0005476733 nova_compute[192580]: 2025-10-08 17:02:53.941 2 DEBUG nova.network.neutron [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:02:55 np0005476733 nova_compute[192580]: 2025-10-08 17:02:55.086 2 DEBUG nova.network.neutron [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:02:55 np0005476733 nova_compute[192580]: 2025-10-08 17:02:55.087 2 DEBUG nova.network.neutron [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:02:55 np0005476733 nova_compute[192580]: 2025-10-08 17:02:55.110 2 DEBUG oslo_concurrency.lockutils [req-55f1c868-361f-4cd0-bdfb-7cb8976f791f req-0941e92f-dc6d-4906-8db2-a61a19bec459 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:02:55 np0005476733 nova_compute[192580]: 2025-10-08 17:02:55.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:02:56 np0005476733 nova_compute[192580]: 2025-10-08 17:02:56.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:00 np0005476733 nova_compute[192580]: 2025-10-08 17:03:00.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:01 np0005476733 nova_compute[192580]: 2025-10-08 17:03:01.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:02 np0005476733 podman[278957]: 2025-10-08 17:03:02.247105903 +0000 UTC m=+0.058060025 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:03:02 np0005476733 podman[278956]: 2025-10-08 17:03:02.262350259 +0000 UTC m=+0.072302730 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:03:02 np0005476733 podman[278958]: 2025-10-08 17:03:02.28426001 +0000 UTC m=+0.084461149 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Oct  8 13:03:04 np0005476733 nova_compute[192580]: 2025-10-08 17:03:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:04 np0005476733 nova_compute[192580]: 2025-10-08 17:03:04.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:03:05 np0005476733 nova_compute[192580]: 2025-10-08 17:03:05.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:05 np0005476733 nova_compute[192580]: 2025-10-08 17:03:05.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:06 np0005476733 nova_compute[192580]: 2025-10-08 17:03:06.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:08 np0005476733 podman[279022]: 2025-10-08 17:03:08.223197906 +0000 UTC m=+0.050051200 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:03:08 np0005476733 podman[279021]: 2025-10-08 17:03:08.231539152 +0000 UTC m=+0.060776092 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Oct  8 13:03:10 np0005476733 nova_compute[192580]: 2025-10-08 17:03:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:10 np0005476733 nova_compute[192580]: 2025-10-08 17:03:10.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:11 np0005476733 nova_compute[192580]: 2025-10-08 17:03:11.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:14.061 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=96, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=95) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:03:14 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:14.062 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:03:14 np0005476733 nova_compute[192580]: 2025-10-08 17:03:14.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:15 np0005476733 nova_compute[192580]: 2025-10-08 17:03:15.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:16.064 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '96'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:03:16 np0005476733 nova_compute[192580]: 2025-10-08 17:03:16.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:16 np0005476733 nova_compute[192580]: 2025-10-08 17:03:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.912 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.912 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.913 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:03:17 np0005476733 nova_compute[192580]: 2025-10-08 17:03:17.913 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:03:19 np0005476733 nova_compute[192580]: 2025-10-08 17:03:19.112 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:03:19 np0005476733 nova_compute[192580]: 2025-10-08 17:03:19.130 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:03:19 np0005476733 nova_compute[192580]: 2025-10-08 17:03:19.130 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:03:19 np0005476733 podman[279062]: 2025-10-08 17:03:19.231805702 +0000 UTC m=+0.054520192 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 13:03:20 np0005476733 nova_compute[192580]: 2025-10-08 17:03:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:20 np0005476733 nova_compute[192580]: 2025-10-08 17:03:20.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:21 np0005476733 podman[279082]: 2025-10-08 17:03:21.295783845 +0000 UTC m=+0.116525544 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Oct  8 13:03:21 np0005476733 nova_compute[192580]: 2025-10-08 17:03:21.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:24 np0005476733 podman[279109]: 2025-10-08 17:03:24.309311744 +0000 UTC m=+0.128829046 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:03:25 np0005476733 nova_compute[192580]: 2025-10-08 17:03:25.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:26 np0005476733 nova_compute[192580]: 2025-10-08 17:03:26.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:26.440 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:26.441 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:03:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:03:26.441 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:03:27 np0005476733 nova_compute[192580]: 2025-10-08 17:03:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.678 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.753 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.754 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.808 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.957 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.958 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12846MB free_disk=111.15824508666992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.958 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:03:28 np0005476733 nova_compute[192580]: 2025-10-08 17:03:28.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.024 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.025 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.025 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.132 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.155 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.156 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:03:29 np0005476733 nova_compute[192580]: 2025-10-08 17:03:29.157 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:03:30 np0005476733 nova_compute[192580]: 2025-10-08 17:03:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:31 np0005476733 nova_compute[192580]: 2025-10-08 17:03:31.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:33 np0005476733 podman[279137]: 2025-10-08 17:03:33.229808848 +0000 UTC m=+0.059040837 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:03:33 np0005476733 podman[279136]: 2025-10-08 17:03:33.229835159 +0000 UTC m=+0.062851279 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:03:33 np0005476733 podman[279138]: 2025-10-08 17:03:33.245862351 +0000 UTC m=+0.070482203 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 13:03:35 np0005476733 nova_compute[192580]: 2025-10-08 17:03:35.157 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:35 np0005476733 nova_compute[192580]: 2025-10-08 17:03:35.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:36 np0005476733 nova_compute[192580]: 2025-10-08 17:03:36.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:39 np0005476733 podman[279202]: 2025-10-08 17:03:39.258525172 +0000 UTC m=+0.064322296 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:03:39 np0005476733 podman[279201]: 2025-10-08 17:03:39.261477806 +0000 UTC m=+0.073018163 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 13:03:40 np0005476733 nova_compute[192580]: 2025-10-08 17:03:40.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:41 np0005476733 nova_compute[192580]: 2025-10-08 17:03:41.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:45 np0005476733 nova_compute[192580]: 2025-10-08 17:03:45.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:46 np0005476733 nova_compute[192580]: 2025-10-08 17:03:46.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:49 np0005476733 nova_compute[192580]: 2025-10-08 17:03:49.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:03:50 np0005476733 nova_compute[192580]: 2025-10-08 17:03:50.088 2 DEBUG nova.compute.manager [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:03:50 np0005476733 nova_compute[192580]: 2025-10-08 17:03:50.088 2 DEBUG nova.compute.manager [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:03:50 np0005476733 nova_compute[192580]: 2025-10-08 17:03:50.089 2 DEBUG oslo_concurrency.lockutils [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:03:50 np0005476733 nova_compute[192580]: 2025-10-08 17:03:50.089 2 DEBUG oslo_concurrency.lockutils [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:03:50 np0005476733 nova_compute[192580]: 2025-10-08 17:03:50.089 2 DEBUG nova.network.neutron [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:03:50 np0005476733 podman[279242]: 2025-10-08 17:03:50.243359809 +0000 UTC m=+0.066100053 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:03:51 np0005476733 nova_compute[192580]: 2025-10-08 17:03:51.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:51 np0005476733 nova_compute[192580]: 2025-10-08 17:03:51.338 2 DEBUG nova.network.neutron [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:03:51 np0005476733 nova_compute[192580]: 2025-10-08 17:03:51.339 2 DEBUG nova.network.neutron [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:03:51 np0005476733 nova_compute[192580]: 2025-10-08 17:03:51.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:51 np0005476733 nova_compute[192580]: 2025-10-08 17:03:51.377 2 DEBUG oslo_concurrency.lockutils [req-6869c108-1ea8-4c01-bf3c-d8f4b3542956 req-35e38675-a64b-46d4-a8ad-00eeb928c62b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:03:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:03:52Z|00246|pinctrl|WARN|Dropped 167 log messages in last 58 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 13:03:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:03:52Z|00247|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:03:52 np0005476733 podman[279262]: 2025-10-08 17:03:52.297104772 +0000 UTC m=+0.122283157 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:03:55 np0005476733 podman[279287]: 2025-10-08 17:03:55.23636948 +0000 UTC m=+0.066874637 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:03:56 np0005476733 nova_compute[192580]: 2025-10-08 17:03:56.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:03:56 np0005476733 nova_compute[192580]: 2025-10-08 17:03:56.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:01 np0005476733 nova_compute[192580]: 2025-10-08 17:04:01.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:01 np0005476733 nova_compute[192580]: 2025-10-08 17:04:01.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:04 np0005476733 podman[279308]: 2025-10-08 17:04:04.243034758 +0000 UTC m=+0.073917363 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct  8 13:04:04 np0005476733 podman[279309]: 2025-10-08 17:04:04.25313202 +0000 UTC m=+0.070194442 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:04:04 np0005476733 podman[279310]: 2025-10-08 17:04:04.268899124 +0000 UTC m=+0.081060620 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal)
Oct  8 13:04:04 np0005476733 nova_compute[192580]: 2025-10-08 17:04:04.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:04 np0005476733 nova_compute[192580]: 2025-10-08 17:04:04.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:04:06 np0005476733 nova_compute[192580]: 2025-10-08 17:04:06.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:06 np0005476733 nova_compute[192580]: 2025-10-08 17:04:06.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:07 np0005476733 nova_compute[192580]: 2025-10-08 17:04:07.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:10 np0005476733 podman[279371]: 2025-10-08 17:04:10.229900614 +0000 UTC m=+0.052523509 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:04:10 np0005476733 podman[279372]: 2025-10-08 17:04:10.232074993 +0000 UTC m=+0.052443386 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:04:11 np0005476733 nova_compute[192580]: 2025-10-08 17:04:11.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:11 np0005476733 nova_compute[192580]: 2025-10-08 17:04:11.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:12 np0005476733 nova_compute[192580]: 2025-10-08 17:04:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:16 np0005476733 nova_compute[192580]: 2025-10-08 17:04:16.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:16 np0005476733 nova_compute[192580]: 2025-10-08 17:04:16.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:17 np0005476733 nova_compute[192580]: 2025-10-08 17:04:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.937 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.938 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.938 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:04:19 np0005476733 nova_compute[192580]: 2025-10-08 17:04:19.938 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:21 np0005476733 podman[279414]: 2025-10-08 17:04:21.247334642 +0000 UTC m=+0.073804587 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.467 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.490 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.491 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:04:21 np0005476733 nova_compute[192580]: 2025-10-08 17:04:21.491 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:23 np0005476733 podman[279436]: 2025-10-08 17:04:23.242475355 +0000 UTC m=+0.077903230 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:04:26 np0005476733 nova_compute[192580]: 2025-10-08 17:04:26.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:26 np0005476733 podman[279463]: 2025-10-08 17:04:26.220018306 +0000 UTC m=+0.054478131 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 13:04:26 np0005476733 nova_compute[192580]: 2025-10-08 17:04:26.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:04:26.442 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:04:26.442 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:04:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:04:26.442 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.613 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.613 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.681 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.735 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.736 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.792 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.921 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.922 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12852MB free_disk=111.15838623046875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.922 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.922 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.995 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.995 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:04:28 np0005476733 nova_compute[192580]: 2025-10-08 17:04:28.995 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:04:29 np0005476733 nova_compute[192580]: 2025-10-08 17:04:29.097 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:04:29 np0005476733 nova_compute[192580]: 2025-10-08 17:04:29.112 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:04:29 np0005476733 nova_compute[192580]: 2025-10-08 17:04:29.113 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:04:29 np0005476733 nova_compute[192580]: 2025-10-08 17:04:29.114 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:04:31 np0005476733 nova_compute[192580]: 2025-10-08 17:04:31.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:31 np0005476733 nova_compute[192580]: 2025-10-08 17:04:31.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:35 np0005476733 podman[279490]: 2025-10-08 17:04:35.222931382 +0000 UTC m=+0.051768804 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:04:35 np0005476733 podman[279491]: 2025-10-08 17:04:35.234800321 +0000 UTC m=+0.060609456 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7)
Oct  8 13:04:35 np0005476733 podman[279489]: 2025-10-08 17:04:35.246503925 +0000 UTC m=+0.077995212 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 13:04:36 np0005476733 nova_compute[192580]: 2025-10-08 17:04:36.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.079 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'name': 'tempest-test_dscp_marking_south_north-1351956767', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'hostId': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.098 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 7711860143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.098 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '700d3e0f-a0e2-46c8-8cbb-0521d7a92420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7711860143, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.080252', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de433cb6-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '95dbc8af2cabfa85b3f11baf122c5bed0c9d1d2e2e1f3385271e98d902f3331c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.080252', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de434c10-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': 'a4ff864518e473f31adc0882fcea1eda8e5f49b5f5d90397c11163b97f0dd206'}]}, 'timestamp': '2025-10-08 17:04:36.099317', '_unique_id': '12be037a45154bc4a53d87cd2e700b64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.100 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.105 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets volume: 157 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c34e761-3365-4324-b167-58626eedaf26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 157, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.101835', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de44432c-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'ed9401810b2f78d4a09d4300811808e0f09b6d52dc7dcbac2960176e3f1c26c0'}]}, 'timestamp': '2025-10-08 17:04:36.105728', '_unique_id': 'a844c9f239cd48f0b5ee199e6a6d11fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.108 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 136895488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.108 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a92d9a2-bf3e-45a2-84d5-1a508c5f584f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136895488, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.108196', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de44b4f6-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': 'de89042ce8f3cd20103b9e7cdfcd670c14ee7a877e75d40a641e283d28a9e4a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.108196', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de44c14e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '07eacdbcf8e4de535731af08b338a051774dd23e4112167879bee4291e2a1749'}]}, 'timestamp': '2025-10-08 17:04:36.108875', '_unique_id': 'a8c8bb5a9f53403aa3627992bc7c8ba2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.109 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.110 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.126 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/memory.usage volume: 235.0546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7d79d1-f009-4109-b8e4-82ac1e97a034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 235.0546875, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'timestamp': '2025-10-08T17:04:36.110720', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'de477e66-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.84906137, 'message_signature': 'e08b8c63445dd50f0032b706002db88a3ab8412d409c11ce2cee21d6ea4aa943'}]}, 'timestamp': '2025-10-08 17:04:36.126890', '_unique_id': '48cf0a66b76347bea0b431662d088d63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.129 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes volume: 28071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17b9c23b-2601-4a4d-b9bc-fd2d7b01d454', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28071, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.129050', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de47e338-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'af726ca6a6d5a9fa86eeee7c986862ad91eb96c9ffae70f311ab11c3db1ed47f'}]}, 'timestamp': '2025-10-08 17:04:36.129427', '_unique_id': 'db614fa8b9c14da987773d35d187839e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.141 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.142 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50b309ed-3b77-4f9d-b25f-ea4df3fae5f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.131139', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de49e0ca-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': '4d942c4ae9eca15bd0aba9312a0987ee7e3aee4eb45cd21337dd8015b19c1a39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.131139', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de49ee26-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': 'a2337bd1b3c035b2ab783af89d87942cc19df4584ceb5b337a74f375aa07c44a'}]}, 'timestamp': '2025-10-08 17:04:36.142788', '_unique_id': 'd6f658595ead416790d69e305df2a3f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.143 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.144 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a970d52a-7d3f-4d94-b4e2-63805f4843dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.144890', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4a4c9a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'f4d762a304f9887e702c14eb8b3a003f3d88f4b5601ea90cea4229e698c85add'}]}, 'timestamp': '2025-10-08 17:04:36.145246', '_unique_id': '26e0b93324a44494a0700aecd6f418de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.147 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dfeb45d-6920-44ed-b943-ef3b6e3e1653', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.146993', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4a9f6a-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': '3084aa9f14a4dfa1287ee9c45e12bb4f0eab9c46e2740dbc822b429f9e02999c'}]}, 'timestamp': '2025-10-08 17:04:36.147359', '_unique_id': '59f9887172554451afd8f6ec7f814b2b'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.149 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets volume: 109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0063df9-5bd7-4b55-ae55-81b46a518408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 109, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.149019', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4aee3e-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': '872c772005fc293e5a5fe238deee9fd4b6b55ecb4881d097241f71a9af110d6f'}]}, 'timestamp': '2025-10-08 17:04:36.149371', '_unique_id': 'bf67ccdf88b341e09df26066133712bd'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.150 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 6861795936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.151 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 50547014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdf05db2-d4aa-4261-80bc-e34719cd5edb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6861795936, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.150951', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4b3a74-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': 'd22eec6761d094bf19ac64c688310e34436066f02f362d69bdd53712f0ef5a40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50547014, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.150951', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4b45f0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '6d274802d6a25655ee0c069f1d3586ed9ab8054186cb2f2da983b67649022dc4'}]}, 'timestamp': '2025-10-08 17:04:36.151577', '_unique_id': '8157dd1e722847d0b90f89914a274b92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.153 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.153 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8447ec33-9e24-43c3-bb70-ba4059c1d6d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.153265', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4b92b2-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': 'c9c040b42a73c3526fd9d1f5fa5b2641ca7d532fe6ca43165676a9b548a9de05'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 
'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.153265', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4b9d70-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': 'f660dc0fd708089f493de7bf5b2edd0d0aeb22134082e7f3412f075625e12107'}]}, 'timestamp': '2025-10-08 17:04:36.153811', '_unique_id': 'ffabe8c91a974b8da7ceb167173c5283'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.154 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.155 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 328566272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.155 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23d6cb2e-280e-40a0-aded-41c0d9d2a16c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328566272, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.155562', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4bec62-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '552e5ec4ad506d7082440bd319ec92ccc12ebe989612dbdd7aa118b4be40ccc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.155562', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4bf612-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '99969057c9e231b1d77d843d28b2be2fc3f340943433d0597fe61f9a31657f24'}]}, 'timestamp': '2025-10-08 17:04:36.156078', '_unique_id': '64131a34c7af43a1b73fc917db905a0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.156 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.157 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes.delta volume: 18485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f58e19ce-bbfd-445b-808c-c73eb1c48ea4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 18485, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.157682', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4c3f96-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'b18cf1f7581fd088fc153986cc7f5c8222766bb0b66874240f044e1711bc2809'}]}, 'timestamp': '2025-10-08 17:04:36.157979', '_unique_id': '8161f7a67f944052b8522e1ce3b02464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.159 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.159 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81b74fdf-ce00-43b1-ae76-e63a6fa87d00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.159598', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4c89e2-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': '1135e4d72ff434709c7ee192cb2633cac2045bd064a2e2aeb858a67dd6ba82d6'}]}, 'timestamp': '2025-10-08 17:04:36.159884', '_unique_id': 'b3c63462708749d2b037a9a66387b8bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.161 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.161 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes volume: 18595 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0620ef2-8184-4f31-932d-84217c6ece0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18595, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.161382', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4ccfce-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'ab495bd09f4ed0d72be0a18571e832f1e0615bf5a06c4b7c1322500d6746f091'}]}, 'timestamp': '2025-10-08 17:04:36.161675', '_unique_id': '2adb25595fc44befb4d7fba61a1f530f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.163 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 152698880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.163 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3e85ff5-a91f-46b4-ab7f-ebc47abedfda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152698880, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.163049', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4d1240-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': '39d42fa7d8357761453314ecc845b6eeeeb084093b17d2196b18b699b9157cf6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 
'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.163049', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4d1d94-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.854126002, 'message_signature': 'f9035ae077597ae43e61593b8ea7d92aa4156ad592dccb1c5900687e08eaa50d'}]}, 'timestamp': '2025-10-08 17:04:36.163650', '_unique_id': 'd2730dad0500454d924a873e9619b05a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.165 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 11655 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.165 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10bb2f14-ac24-4a56-a84e-018f891970fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11655, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.165030', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4d5ee4-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': 'b0c39f3f3c77e8a377da137797baf78321b9dce4d313a1c8956811f75ff7dc7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.165030', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4d6880-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '61f2b034e68414a64572d370201380aa0c70a490a364e52e0b51fe7a784d0bb6'}]}, 'timestamp': '2025-10-08 17:04:36.165562', '_unique_id': '867da77fd2d34fd6a21865337323d347'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.166 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.167 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cf938c6-f5a0-49c4-bad4-8bd090f04db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 801, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:04:36.166946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'de4da9b2-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '6fc0f9267d9e2d1447aba9e85fea36561d5a058100840bad7970ae4a4c7e16cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:04:36.166946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'de4db722-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.803210866, 'message_signature': '72aab045e3343ff1d34b76acfff387662e9a4b2b3e86e083929db544c3d4e6e3'}]}, 'timestamp': '2025-10-08 17:04:36.167650', '_unique_id': '280d0d10961e454abfc285da1855c456'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b90d96d7-e8fd-4e2a-84c8-45dc3a7881f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.169061', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4dfcf0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': 'd1de31cd866a04a4bf726097cdab2ee5fb608ab44761332f268e4033b274bbd6'}]}, 'timestamp': '2025-10-08 17:04:36.169379', '_unique_id': '2da4623364ef449ea2b391522e6df49f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.169 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.170 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes.delta volume: 28071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '478fe648-6eb0-44c4-8f1e-571f6262e54d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 28071, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:04:36.170724', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': 'de4e3cb0-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.824832036, 'message_signature': '05739e5b0b006d8c9b48ba36a83a88c299002a3f74bd5588bc005085ddd2e5df'}]}, 'timestamp': '2025-10-08 17:04:36.171010', '_unique_id': '6490f6314e704b59aab10e84fbdb69ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.171 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.172 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.172 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/cpu volume: 40470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7ccb0a2-eda0-4d12-a86b-fa98154eae1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40470000000, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'timestamp': '2025-10-08T17:04:36.172452', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'de4e8058-a468-11f0-9274-fa163ef67048', 'monotonic_time': 9979.84906137, 'message_signature': '8a9a53005f4443dac5735f5c142e36e890dfd0dfbf7b90e3e8ae65746e442fc2'}]}, 'timestamp': '2025-10-08 17:04:36.172735', '_unique_id': '8783d17750794fa09616bf13c2614d6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:04:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:04:36.173 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:04:36 np0005476733 nova_compute[192580]: 2025-10-08 17:04:36.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:37 np0005476733 nova_compute[192580]: 2025-10-08 17:04:37.114 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:04:41 np0005476733 nova_compute[192580]: 2025-10-08 17:04:41.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:41 np0005476733 podman[279552]: 2025-10-08 17:04:41.222247197 +0000 UTC m=+0.056010780 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid)
Oct  8 13:04:41 np0005476733 podman[279553]: 2025-10-08 17:04:41.222282208 +0000 UTC m=+0.051937400 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:04:41 np0005476733 nova_compute[192580]: 2025-10-08 17:04:41.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:46 np0005476733 nova_compute[192580]: 2025-10-08 17:04:46.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:46 np0005476733 nova_compute[192580]: 2025-10-08 17:04:46.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:51 np0005476733 nova_compute[192580]: 2025-10-08 17:04:51.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:51 np0005476733 nova_compute[192580]: 2025-10-08 17:04:51.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:51 np0005476733 podman[279596]: 2025-10-08 17:04:51.983816683 +0000 UTC m=+0.056632500 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 13:04:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:04:54Z|00248|pinctrl|WARN|Dropped 127 log messages in last 62 seconds (most recently, 11 seconds ago) due to excessive rate
Oct  8 13:04:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:04:54Z|00249|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:04:54 np0005476733 podman[279615]: 2025-10-08 17:04:54.247838334 +0000 UTC m=+0.077806526 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 13:04:56 np0005476733 nova_compute[192580]: 2025-10-08 17:04:56.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:56 np0005476733 nova_compute[192580]: 2025-10-08 17:04:56.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:04:57 np0005476733 podman[279642]: 2025-10-08 17:04:57.214931711 +0000 UTC m=+0.049661848 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 13:05:01 np0005476733 nova_compute[192580]: 2025-10-08 17:05:01.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:01 np0005476733 nova_compute[192580]: 2025-10-08 17:05:01.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:03 np0005476733 nova_compute[192580]: 2025-10-08 17:05:03.446 2 DEBUG nova.compute.manager [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:05:03 np0005476733 nova_compute[192580]: 2025-10-08 17:05:03.447 2 DEBUG nova.compute.manager [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:05:03 np0005476733 nova_compute[192580]: 2025-10-08 17:05:03.447 2 DEBUG oslo_concurrency.lockutils [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:05:03 np0005476733 nova_compute[192580]: 2025-10-08 17:05:03.448 2 DEBUG oslo_concurrency.lockutils [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:05:03 np0005476733 nova_compute[192580]: 2025-10-08 17:05:03.448 2 DEBUG nova.network.neutron [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.600 2 DEBUG nova.compute.manager [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.601 2 DEBUG nova.compute.manager [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.601 2 DEBUG oslo_concurrency.lockutils [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.657 2 DEBUG nova.network.neutron [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.657 2 DEBUG nova.network.neutron [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.693 2 DEBUG oslo_concurrency.lockutils [req-c473ab79-458d-4ea1-b90a-21cf14e2052c req-378b65a6-7514-4c2d-a4be-26b5ba1a1af9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.694 2 DEBUG oslo_concurrency.lockutils [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:05:05 np0005476733 nova_compute[192580]: 2025-10-08 17:05:05.694 2 DEBUG nova.network.neutron [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:05:06 np0005476733 nova_compute[192580]: 2025-10-08 17:05:06.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:06 np0005476733 podman[279663]: 2025-10-08 17:05:06.24892586 +0000 UTC m=+0.070436021 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:05:06 np0005476733 podman[279664]: 2025-10-08 17:05:06.252933387 +0000 UTC m=+0.071264506 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Oct  8 13:05:06 np0005476733 podman[279662]: 2025-10-08 17:05:06.273452083 +0000 UTC m=+0.101512804 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 13:05:06 np0005476733 nova_compute[192580]: 2025-10-08 17:05:06.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:06 np0005476733 nova_compute[192580]: 2025-10-08 17:05:06.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:06 np0005476733 nova_compute[192580]: 2025-10-08 17:05:06.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:05:07 np0005476733 nova_compute[192580]: 2025-10-08 17:05:07.610 2 DEBUG nova.network.neutron [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:05:07 np0005476733 nova_compute[192580]: 2025-10-08 17:05:07.612 2 DEBUG nova.network.neutron [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:05:07 np0005476733 nova_compute[192580]: 2025-10-08 17:05:07.633 2 DEBUG oslo_concurrency.lockutils [req-32b18c23-1d88-4c9b-82e9-e03eefe66ec3 req-6808c384-ba62-441c-8245-2bf2c2149f14 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:05:08 np0005476733 nova_compute[192580]: 2025-10-08 17:05:08.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:11 np0005476733 nova_compute[192580]: 2025-10-08 17:05:11.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:11 np0005476733 nova_compute[192580]: 2025-10-08 17:05:11.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:12 np0005476733 podman[279729]: 2025-10-08 17:05:12.21088324 +0000 UTC m=+0.042427776 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:05:12 np0005476733 podman[279728]: 2025-10-08 17:05:12.216138128 +0000 UTC m=+0.050038589 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:05:14 np0005476733 nova_compute[192580]: 2025-10-08 17:05:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:16 np0005476733 nova_compute[192580]: 2025-10-08 17:05:16.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:16 np0005476733 nova_compute[192580]: 2025-10-08 17:05:16.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:18 np0005476733 nova_compute[192580]: 2025-10-08 17:05:18.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.780 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.780 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.780 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:05:21 np0005476733 nova_compute[192580]: 2025-10-08 17:05:21.781 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:05:22 np0005476733 podman[279771]: 2025-10-08 17:05:22.218597439 +0000 UTC m=+0.053576001 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:05:25 np0005476733 podman[279790]: 2025-10-08 17:05:25.293568461 +0000 UTC m=+0.121078858 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.296 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.312 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.312 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.313 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:05:26.442 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:05:26.443 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:05:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:05:26.443 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:05:26 np0005476733 nova_compute[192580]: 2025-10-08 17:05:26.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:28 np0005476733 podman[279817]: 2025-10-08 17:05:28.226965081 +0000 UTC m=+0.058660495 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.676 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.733 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.734 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.787 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.935 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.938 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12874MB free_disk=111.15838623046875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.939 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:05:29 np0005476733 nova_compute[192580]: 2025-10-08 17:05:29.939 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.031 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.032 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.032 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.078 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.093 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.094 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:05:30 np0005476733 nova_compute[192580]: 2025-10-08 17:05:30.094 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:05:31 np0005476733 nova_compute[192580]: 2025-10-08 17:05:31.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:31 np0005476733 nova_compute[192580]: 2025-10-08 17:05:31.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:36 np0005476733 nova_compute[192580]: 2025-10-08 17:05:36.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:36 np0005476733 nova_compute[192580]: 2025-10-08 17:05:36.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:37 np0005476733 nova_compute[192580]: 2025-10-08 17:05:37.093 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:37 np0005476733 podman[279844]: 2025-10-08 17:05:37.228885247 +0000 UTC m=+0.055541055 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:05:37 np0005476733 podman[279843]: 2025-10-08 17:05:37.232238164 +0000 UTC m=+0.063891222 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0)
Oct  8 13:05:37 np0005476733 podman[279845]: 2025-10-08 17:05:37.238811624 +0000 UTC m=+0.061525226 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6)
Oct  8 13:05:41 np0005476733 nova_compute[192580]: 2025-10-08 17:05:41.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:41 np0005476733 nova_compute[192580]: 2025-10-08 17:05:41.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:43 np0005476733 podman[279909]: 2025-10-08 17:05:43.225571417 +0000 UTC m=+0.058213440 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:05:43 np0005476733 podman[279908]: 2025-10-08 17:05:43.241865417 +0000 UTC m=+0.076342670 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:05:46 np0005476733 nova_compute[192580]: 2025-10-08 17:05:46.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:46 np0005476733 nova_compute[192580]: 2025-10-08 17:05:46.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:51 np0005476733 nova_compute[192580]: 2025-10-08 17:05:51.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:51 np0005476733 nova_compute[192580]: 2025-10-08 17:05:51.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:53 np0005476733 podman[279952]: 2025-10-08 17:05:53.221992324 +0000 UTC m=+0.047550100 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 13:05:53 np0005476733 nova_compute[192580]: 2025-10-08 17:05:53.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:05:56 np0005476733 nova_compute[192580]: 2025-10-08 17:05:56.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:56 np0005476733 ovn_controller[263831]: 2025-10-08T17:05:56Z|00250|pinctrl|WARN|Dropped 103 log messages in last 62 seconds (most recently, 16 seconds ago) due to excessive rate
Oct  8 13:05:56 np0005476733 ovn_controller[263831]: 2025-10-08T17:05:56Z|00251|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:05:56 np0005476733 podman[279971]: 2025-10-08 17:05:56.238943303 +0000 UTC m=+0.073454086 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 13:05:56 np0005476733 nova_compute[192580]: 2025-10-08 17:05:56.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:05:59 np0005476733 podman[279997]: 2025-10-08 17:05:59.220664976 +0000 UTC m=+0.051612209 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  8 13:06:01 np0005476733 nova_compute[192580]: 2025-10-08 17:06:01.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:01 np0005476733 nova_compute[192580]: 2025-10-08 17:06:01.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:06 np0005476733 nova_compute[192580]: 2025-10-08 17:06:06.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:06 np0005476733 nova_compute[192580]: 2025-10-08 17:06:06.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:06 np0005476733 nova_compute[192580]: 2025-10-08 17:06:06.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:06 np0005476733 nova_compute[192580]: 2025-10-08 17:06:06.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:06:08 np0005476733 podman[280018]: 2025-10-08 17:06:08.221643802 +0000 UTC m=+0.054605984 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:06:08 np0005476733 podman[280020]: 2025-10-08 17:06:08.221648592 +0000 UTC m=+0.048418157 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 13:06:08 np0005476733 podman[280019]: 2025-10-08 17:06:08.226795417 +0000 UTC m=+0.050531295 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:06:08 np0005476733 nova_compute[192580]: 2025-10-08 17:06:08.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:11 np0005476733 nova_compute[192580]: 2025-10-08 17:06:11.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:11 np0005476733 nova_compute[192580]: 2025-10-08 17:06:11.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:14 np0005476733 podman[280079]: 2025-10-08 17:06:14.270465027 +0000 UTC m=+0.089185450 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  8 13:06:14 np0005476733 podman[280080]: 2025-10-08 17:06:14.272035477 +0000 UTC m=+0.083849679 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:06:15 np0005476733 nova_compute[192580]: 2025-10-08 17:06:15.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:16 np0005476733 nova_compute[192580]: 2025-10-08 17:06:16.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:16 np0005476733 nova_compute[192580]: 2025-10-08 17:06:16.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:18 np0005476733 nova_compute[192580]: 2025-10-08 17:06:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:21 np0005476733 nova_compute[192580]: 2025-10-08 17:06:21.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:21 np0005476733 nova_compute[192580]: 2025-10-08 17:06:21.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:22 np0005476733 nova_compute[192580]: 2025-10-08 17:06:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:22 np0005476733 nova_compute[192580]: 2025-10-08 17:06:22.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:06:22 np0005476733 nova_compute[192580]: 2025-10-08 17:06:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:06:23 np0005476733 nova_compute[192580]: 2025-10-08 17:06:23.042 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:06:23 np0005476733 nova_compute[192580]: 2025-10-08 17:06:23.042 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:06:23 np0005476733 nova_compute[192580]: 2025-10-08 17:06:23.042 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:06:23 np0005476733 nova_compute[192580]: 2025-10-08 17:06:23.042 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:06:24 np0005476733 podman[280122]: 2025-10-08 17:06:24.25890185 +0000 UTC m=+0.080362009 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.172 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.197 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.198 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.198 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:26.443 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:26.444 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:26.445 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:26 np0005476733 nova_compute[192580]: 2025-10-08 17:06:26.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:27 np0005476733 podman[280141]: 2025-10-08 17:06:27.293951267 +0000 UTC m=+0.124032463 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:06:29 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:29Z|00252|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 13:06:29 np0005476733 nova_compute[192580]: 2025-10-08 17:06:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:30 np0005476733 podman[280167]: 2025-10-08 17:06:30.254060259 +0000 UTC m=+0.079822840 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.674 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.729 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.730 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.805 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.963 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.964 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12854MB free_disk=111.15838623046875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.965 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:31 np0005476733 nova_compute[192580]: 2025-10-08 17:06:31.965 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.435 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.435 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.436 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.560 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.576 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.578 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:06:32 np0005476733 nova_compute[192580]: 2025-10-08 17:06:32.578 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.080 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'name': 'tempest-test_dscp_marking_south_north-1351956767', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'hostId': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.083 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes.delta volume: 19890 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '851ca129-2598-4384-b999-959ae6739b10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 19890, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.081341', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25c7838a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': 'b9e0049f8318f5d2b17824904a00a57b43b109c28ef7fbae2bad11da26833459'}]}, 'timestamp': '2025-10-08 17:06:36.084186', '_unique_id': 'd05a19a7646c41af87047dff82059e9a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.085 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.086 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3caf301-4f53-4e62-bb5a-59c4fe7cdc64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.086416', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25c7ec8a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': 'e9e0a733aa4cdb747831e854641e6f4d50736e600b8ec55e2169176cf46c7b0b'}]}, 'timestamp': '2025-10-08 17:06:36.086713', '_unique_id': 'a30d815bc00b46b5a3617cdfd7957b0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.087 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.096 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 152764416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.097 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6e1bc84-e740-4845-ae49-459adc752dd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 152764416, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.088254', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25c992ec-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': 'ba26b2fa948cccfa365866cb378856aca29d11077dedd85d9dbf68c30bbc9b5e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.088254', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25c99f12-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': 'b6cbe272b3556c15a33230562f4a628ca6807bd9c26f4f78db22f8d92a3dee70'}]}, 'timestamp': '2025-10-08 17:06:36.097828', '_unique_id': '3b5cc69b85874259a87b65ae761c48a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.098 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.099 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 153096192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.100 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f85041c3-82cf-4bfe-bf56-c71dc46886c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153096192, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.099946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25c9fe3a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': '878d92c5eab16c5fb442163d6170c76f3b25adfc2a4e6d3d6392418d82a29cc6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.099946', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25ca0876-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': '254781eba553d8e1b648c17dbea579f1127c82e5fbdec2f439406c109768ff22'}]}, 'timestamp': '2025-10-08 17:06:36.100515', '_unique_id': '294ea92bfce04a33b734441b41f2ed9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.101 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets volume: 227 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9f0e1ef-9155-4a73-8c74-b9116b8e33fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 227, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.102153', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25ca533a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '98d3e8d502f68674f784b017c0842e0246487eea3eeafd28012759c48ae4c8d5'}]}, 'timestamp': '2025-10-08 17:06:36.102436', '_unique_id': '434396663f8449b593b355aa27cf0ce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.102 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.103 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.bytes volume: 38485 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51564fdf-c8a5-459f-a05c-e647ba9d3b32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 38485, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.103939', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25ca9976-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '69814faffe504ed2b8a274417d1d13fe26a75790a6bec975fec1735bd90df172'}]}, 'timestamp': '2025-10-08 17:06:36.104245', '_unique_id': '91109033997b4a75a31722e1dc064502'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.104 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.105 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes volume: 57229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83e79b25-a153-4681-81aa-47157541335a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 57229, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.105737', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25cadee0-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '5eb4cabadf0223bddf836d7b322ffeb780398679ac5b071b23f51797b4018797'}]}, 'timestamp': '2025-10-08 17:06:36.106014', '_unique_id': '84c9c742a0584e1c87b80a8394af1fb2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.106 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.107 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets volume: 303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b144025-3555-4c02-8c50-7dbf8354f466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 303, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.107757', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25cb2da0-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '4f45b4cb6dfb6748c0c33aaed78f1228e2cd4041d053e1ef442727b9e0a16ac7'}]}, 'timestamp': '2025-10-08 17:06:36.108030', '_unique_id': '4415c760a19b46ae8c397e5eb38c3741'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.108 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.127 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 11655 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.128 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '496a9c4c-cbc3-4247-88cd-af5c0c90a617', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11655, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.109561', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25ce389c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': 'ffb89e413a02f66e126d667e36969c3ea5a7ddb1f91622cd963a9c4536edd8d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.109561', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25ce459e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': 'ba9a3e92e9fa6f578c0f306ad6de9ac519443843d771b43075d4b5d9bf00192a'}]}, 'timestamp': '2025-10-08 17:06:36.128305', '_unique_id': '2f107e7bb78648dc8f45082538b3f9f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.129 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.130 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 328566272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.130 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33d469b7-a7a6-40d1-a076-7c8c2d94cce0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 328566272, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.130515', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25cea69c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '854d6237f741251b0602255a01d20d55ce937c4d6679d098d9c1f4693367695c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.130515', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25ceafe8-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '989aa2890c60d426163afd7d72aed8103458a6314e1a4f2a1b7e5a82d5071447'}]}, 'timestamp': '2025-10-08 17:06:36.131010', '_unique_id': '40ff02f2eae544f3b5bf77ec3f93fc84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.131 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.132 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 6861795936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.132 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.read.latency volume: 50547014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6de9590-06fe-4296-b00e-c8b0fc4dc8db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6861795936, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.132512', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25cef458-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '44fef9312d7ae1a944658ac38db2041bf6f246760386b6e4594ca2227a27a521'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50547014, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.132512', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25cefde0-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '4c95aef76d940354f436d6a7cd112c6fa4f511707b1fdc9bd972a3c1bbfac4f7'}]}, 'timestamp': '2025-10-08 17:06:36.133008', '_unique_id': '34d1963189964058acb945f1dc7266a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.133 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.134 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.bytes.delta volume: 29158 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f22f3c8-a8ea-4830-80d0-f044f9bfdb3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 29158, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.134679', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25cf496c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': 'cf31513cc608f39f13ef006a593aa37c504bdcc7825d47e5e9fc86bc657c2377'}]}, 'timestamp': '2025-10-08 17:06:36.134954', '_unique_id': '5403130c21c84cc3a9ac46ad525d73e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.135 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.136 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.151 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/cpu volume: 43180000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3409dfe-85d7-406d-abf3-e98f592f3010', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43180000000, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'timestamp': '2025-10-08T17:06:36.136765', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': '25d1e226-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.874273195, 'message_signature': '01b04d2ed28a31840729329e1ad5977c80b9bf47b0ed4c02b3fc3b96a78a4639'}]}, 'timestamp': '2025-10-08 17:06:36.152029', '_unique_id': '9c9ba87044ab41899eb4d9abf6855ee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.152 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.154 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 137182208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.154 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67ce4b2c-bd7f-47a1-9145-9de64ab0aa8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 137182208, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.154029', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25d23e7e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': 'e587d894a5866459b65105c95ebde750ff2a2ecc4cf9f3e8880bac3011e9625f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.154029', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25d24838-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': 'c434cbbab2cc2e61c441d067361853f91764d6a264a63b0e2225ddd6b3727be9'}]}, 'timestamp': '2025-10-08 17:06:36.154575', '_unique_id': 'adf60fb5f8e3480293dae1a876694a8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.155 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.156 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a63f64f-8607-4029-85db-b8afda74e3e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.156174', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25d29158-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': 'ca7885026ca826814c1740b0056f0a76339a87f74a6ee81fa567bb12d9f1075b'}]}, 'timestamp': '2025-10-08 17:06:36.156455', '_unique_id': 'ab5536726d6d44668bbd9c555a244bc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.157 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52efec16-cdf7-480a-b6eb-47c9ab8ab305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.158302', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25d2e37e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '7fc252722a4528840a80e46b78380c7624bf065f7d8b4b6796822ca2cfcc030b'}]}, 'timestamp': '2025-10-08 17:06:36.158557', '_unique_id': '542519aa50fa49278e08a19427ea5978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.158 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.159 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.159 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/memory.usage volume: 234.95703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f235438-3d89-4505-8819-3b66f95dfdb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 234.95703125, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'timestamp': '2025-10-08T17:06:36.159967', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': '25d3253c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.874273195, 'message_signature': '733c84d4d1062a510795beaf2b5d2240005f0f1ffa86ec06f426a233dca8e5c7'}]}, 'timestamp': '2025-10-08 17:06:36.160232', '_unique_id': '0facec82822e4b0b9556da9693eab6b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.160 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.161 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.161 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37fa04de-3676-477c-80ef-e7e8f894b76a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 842, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.161606', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25d364ca-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '8b0862cc5d162ab8847dea57f4bda5333f77d67c5af7f850f61404f543239ccd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.161606', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25d36da8-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': 'efdcfdab49a63cbd94abf662268f48572627d16723a67a321a2a9084bc17de93'}]}, 'timestamp': '2025-10-08 17:06:36.162095', '_unique_id': 'e2b99c6e534743e783e9fd6b5c3cdb5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.162 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.163 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65f2bb6b-8d65-43c8-a926-ed4c2c4bd014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'instance-00000072-b62bae97-ef9e-4f29-8e84-9731b7b2dc32-tap515844fc-ac', 'timestamp': '2025-10-08T17:06:36.163571', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'tap515844fc-ac', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:8c:c7:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap515844fc-ac'}, 'message_id': '25d3b1d2-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.80431564, 'message_signature': '998367d62988dbfda5b9bdc71b6ca3fe3bc62169dd5ce58cb905dfe567ab231b'}]}, 'timestamp': '2025-10-08 17:06:36.163837', '_unique_id': 'a31cce1592584d31be3d93ae8f9b571d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.164 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.165 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.165 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '988de374-34db-48ed-bfa1-043c6a2e02d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.165249', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25d3f318-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': 'd8a020cd18f728e7684b36d34526aa095b03eec92e92b9cef11eb067030e0cdd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.165249', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25d3fc32-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.81122983, 'message_signature': '141d11c725819e51f898af6ecf55303fa9e7587c480766cc650d6b500c256934'}]}, 'timestamp': '2025-10-08 17:06:36.165740', '_unique_id': '3f88e454c8af460a97ca10ee65f371a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.166 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.167 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 7750214288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.167 12 DEBUG ceilometer.compute.pollsters [-] b62bae97-ef9e-4f29-8e84-9731b7b2dc32/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4f512a9-f2c2-4bec-b190-016306373ce7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7750214288, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-vda', 'timestamp': '2025-10-08T17:06:36.167233', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': '25d44098-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '0f8881cbf1bce48e49fe04abce12fccee6cbe33df890d4d5bbce7e26f2cb342a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c7e76ef370424761ad76d4119e9bf895', 'user_name': None, 'project_id': 
'1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'project_name': None, 'resource_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32-sda', 'timestamp': '2025-10-08T17:06:36.167233', 'resource_metadata': {'display_name': 'tempest-test_dscp_marking_south_north-1351956767', 'name': 'instance-00000072', 'instance_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'instance_type': 'custom_neutron_guest', 'host': 'aaf757a1db3364fac10fff693cb94dff6c205dd023d98bdd75ff661e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': '25d44b10-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10099.832542512, 'message_signature': '546e9f5447d0921e4bcc321f2626261e8cc2bf5951b99a3b8ffeb5b5e3943015'}]}, 'timestamp': '2025-10-08 17:06:36.167771', '_unique_id': 'bd85681d699d48199b831e56c99a252b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:06:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:06:36.168 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:06:36 np0005476733 nova_compute[192580]: 2025-10-08 17:06:36.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:36 np0005476733 nova_compute[192580]: 2025-10-08 17:06:36.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:37.413 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=97, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=96) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:37.414 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.660 2 DEBUG nova.compute.manager [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.660 2 DEBUG nova.compute.manager [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing instance network info cache due to event network-changed-515844fc-ac1c-4a41-afdb-496edaf8d2ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.660 2 DEBUG oslo_concurrency.lockutils [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.661 2 DEBUG oslo_concurrency.lockutils [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:06:37 np0005476733 nova_compute[192580]: 2025-10-08 17:06:37.661 2 DEBUG nova.network.neutron [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Refreshing network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.014 2 DEBUG nova.network.neutron [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updated VIF entry in instance network info cache for port 515844fc-ac1c-4a41-afdb-496edaf8d2ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.015 2 DEBUG nova.network.neutron [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [{"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.034 2 DEBUG oslo_concurrency.lockutils [req-96bba2fe-4280-4770-b8ef-6aef7c0eb7a6 req-6bc162ea-c1de-410e-ada4-415405a256ea 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-b62bae97-ef9e-4f29-8e84-9731b7b2dc32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:06:39 np0005476733 podman[280197]: 2025-10-08 17:06:39.218863118 +0000 UTC m=+0.052089414 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:06:39 np0005476733 podman[280198]: 2025-10-08 17:06:39.221460841 +0000 UTC m=+0.050634148 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:06:39 np0005476733 podman[280199]: 2025-10-08 17:06:39.231977677 +0000 UTC m=+0.058694066 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.579 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:06:39 np0005476733 nova_compute[192580]: 2025-10-08 17:06:39.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 13:06:41 np0005476733 nova_compute[192580]: 2025-10-08 17:06:41.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:41 np0005476733 nova_compute[192580]: 2025-10-08 17:06:41.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:45 np0005476733 podman[280256]: 2025-10-08 17:06:45.236382475 +0000 UTC m=+0.060674290 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:06:45 np0005476733 podman[280257]: 2025-10-08 17:06:45.255876057 +0000 UTC m=+0.071726262 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:06:45 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:45.416 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '97'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.139 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.139 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.140 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.140 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.140 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.141 2 INFO nova.compute.manager [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Terminating instance#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.142 2 DEBUG nova.compute.manager [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 13:06:46 np0005476733 kernel: tap515844fc-ac (unregistering): left promiscuous mode
Oct  8 13:06:46 np0005476733 NetworkManager[51699]: <info>  [1759943206.1714] device (tap515844fc-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 13:06:46 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:46Z|00253|binding|INFO|Releasing lport 515844fc-ac1c-4a41-afdb-496edaf8d2ad from this chassis (sb_readonly=0)
Oct  8 13:06:46 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:46Z|00254|binding|INFO|Setting lport 515844fc-ac1c-4a41-afdb-496edaf8d2ad down in Southbound
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:46Z|00255|binding|INFO|Removing iface tap515844fc-ac ovn-installed in OVS
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.189 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:c7:90 192.168.1.210'], port_security=['fa:16:3e:8c:c7:90 192.168.1.210'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.1.210/24', 'neutron:device_id': 'b62bae97-ef9e-4f29-8e84-9731b7b2dc32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90f23013-7525-484c-9fdb-1b08a96d01b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bfc3ffe3bf745dfbb59f63a07f1e1a9', 'neutron:revision_number': '6', 'neutron:security_group_ids': '25db264a-2b7b-491b-8b39-dd7c8ac35c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dc09949-30f3-4269-9199-dde0aaedfe99, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=515844fc-ac1c-4a41-afdb-496edaf8d2ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.191 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 515844fc-ac1c-4a41-afdb-496edaf8d2ad in datapath 90f23013-7525-484c-9fdb-1b08a96d01b7 unbound from our chassis#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.192 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90f23013-7525-484c-9fdb-1b08a96d01b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.194 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8bda34d2-eed0-487f-90dd-4961f4d173a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.194 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7 namespace which is not needed anymore#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000072.scope: Deactivated successfully.
Oct  8 13:06:46 np0005476733 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000072.scope: Consumed 47.422s CPU time.
Oct  8 13:06:46 np0005476733 systemd-machined[152624]: Machine qemu-69-instance-00000072 terminated.
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [NOTICE]   (278687) : haproxy version is 2.8.14-c23fe91
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [NOTICE]   (278687) : path to executable is /usr/sbin/haproxy
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [WARNING]  (278687) : Exiting Master process...
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [WARNING]  (278687) : Exiting Master process...
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [ALERT]    (278687) : Current worker (278689) exited with code 143 (Terminated)
Oct  8 13:06:46 np0005476733 neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7[278683]: [WARNING]  (278687) : All workers exited. Exiting... (0)
Oct  8 13:06:46 np0005476733 systemd[1]: libpod-f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0.scope: Deactivated successfully.
Oct  8 13:06:46 np0005476733 podman[280326]: 2025-10-08 17:06:46.341397728 +0000 UTC m=+0.049048978 container died f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 13:06:46 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0-userdata-shm.mount: Deactivated successfully.
Oct  8 13:06:46 np0005476733 systemd[1]: var-lib-containers-storage-overlay-21d3fd576f0e5452f473b945c869ddf820f5f40dda16f5160e76590c14c467fd-merged.mount: Deactivated successfully.
Oct  8 13:06:46 np0005476733 podman[280326]: 2025-10-08 17:06:46.389876766 +0000 UTC m=+0.097528016 container cleanup f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:06:46 np0005476733 systemd[1]: libpod-conmon-f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0.scope: Deactivated successfully.
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.433 2 INFO nova.virt.libvirt.driver [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Instance destroyed successfully.#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.434 2 DEBUG nova.objects.instance [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lazy-loading 'resources' on Instance uuid b62bae97-ef9e-4f29-8e84-9731b7b2dc32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.448 2 DEBUG nova.virt.libvirt.vif [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T17:02:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_marking_south_north-1351956767',display_name='tempest-test_dscp_marking_south_north-1351956767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-marking-south-north-1351956767',id=114,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDMZY+1amHtOfP7A9mNq9elrLdSmJExinx1RhEMIok/jY2NIEZLmbGYzwSU7hW6lTgCdp0ppXG+TES3vu/f6QcskjPgczSOIC1ZSOo2pydcs/ssHi2XSJm93QN1u5564Lw==',key_name='tempest-keypair-test-130020287',keypairs=<?>,launch_index=0,launched_at=2025-10-08T17:02:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1bfc3ffe3bf745dfbb59f63a07f1e1a9',ramdisk_id='',reservation_id='r-okufdn5f',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestCommon-951807005',owner_user_name='tempest-QosTestCommon-951807005-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T17:02:18Z,user_data=None,user_id='c7e76ef370424761ad76d4119e9bf895',uuid=b62bae97-ef9e-4f29-8e84-9731b7b2dc32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": 
[{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.449 2 DEBUG nova.network.os_vif_util [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converting VIF {"id": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "address": "fa:16:3e:8c:c7:90", "network": {"id": "90f23013-7525-484c-9fdb-1b08a96d01b7", "bridge": "br-int", "label": "tempest-test-network--203230236", "subnets": [{"cidr": "192.168.1.0/24", "dns": [], "gateway": {"address": "192.168.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.1.210", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bfc3ffe3bf745dfbb59f63a07f1e1a9", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap515844fc-ac", "ovs_interfaceid": "515844fc-ac1c-4a41-afdb-496edaf8d2ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.450 2 DEBUG nova.network.os_vif_util [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.450 2 DEBUG os_vif [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.452 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap515844fc-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.457 2 INFO os_vif [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:c7:90,bridge_name='br-int',has_traffic_filtering=True,id=515844fc-ac1c-4a41-afdb-496edaf8d2ad,network=Network(90f23013-7525-484c-9fdb-1b08a96d01b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap515844fc-ac')#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.458 2 INFO nova.virt.libvirt.driver [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Deleting instance files /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32_del#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.459 2 INFO nova.virt.libvirt.driver [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Deletion of /var/lib/nova/instances/b62bae97-ef9e-4f29-8e84-9731b7b2dc32_del complete#033[00m
Oct  8 13:06:46 np0005476733 podman[280366]: 2025-10-08 17:06:46.467898618 +0000 UTC m=+0.048324074 container remove f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.475 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a7b5ec-7121-4c58-b228-f35a89876dde]: (4, ('Wed Oct  8 05:06:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7 (f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0)\nf0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0\nWed Oct  8 05:06:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7 (f0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0)\nf0f53389b955193b1c0e904eb46c334b4f641ff0805cbd6d3a976738372828c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.476 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcbf9f6-cd81-46d4-9b53-a15c7ec1f2c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.477 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90f23013-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 kernel: tap90f23013-70: left promiscuous mode
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.493 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[4e81a157-b761-4e42-92c0-192c5e352118]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.525 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d92155-f759-455d-8d05-8a11393c066c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.526 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[921edda0-4d44-4198-a947-60edddf266a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.531 2 INFO nova.compute.manager [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.531 2 DEBUG oslo.service.loopingcall [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.532 2 DEBUG nova.compute.manager [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 13:06:46 np0005476733 nova_compute[192580]: 2025-10-08 17:06:46.532 2 DEBUG nova.network.neutron [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.545 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0490b8-8371-4aa3-8cfe-81e39bb13551]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 984091, 'reachable_time': 28281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280385, 'error': None, 'target': 'ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.549 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90f23013-7525-484c-9fdb-1b08a96d01b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 13:06:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:06:46.549 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd847ff-8faf-4adc-b551-c08a114e7e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:06:46 np0005476733 systemd[1]: run-netns-ovnmeta\x2d90f23013\x2d7525\x2d484c\x2d9fdb\x2d1b08a96d01b7.mount: Deactivated successfully.
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.309 2 DEBUG nova.compute.manager [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-unplugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.310 2 DEBUG oslo_concurrency.lockutils [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.311 2 DEBUG oslo_concurrency.lockutils [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.311 2 DEBUG oslo_concurrency.lockutils [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.312 2 DEBUG nova.compute.manager [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] No waiting events found dispatching network-vif-unplugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:06:47 np0005476733 nova_compute[192580]: 2025-10-08 17:06:47.312 2 DEBUG nova.compute.manager [req-501db12b-09cf-483e-a57a-51136bff6bd8 req-8a81092f-e260-4fbf-b9dc-503d47586ba9 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-unplugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.545 2 DEBUG nova.network.neutron [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.570 2 INFO nova.compute.manager [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Took 2.04 seconds to deallocate network for instance.#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.615 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.616 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.640 2 DEBUG nova.compute.manager [req-4dd30030-1ec7-4b82-be0c-fd2828f71ee4 req-de6b30b3-ce6e-496f-9eb1-d97ca3e40cb5 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-deleted-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.691 2 DEBUG nova.compute.provider_tree [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.710 2 DEBUG nova.scheduler.client.report [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.730 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.764 2 INFO nova.scheduler.client.report [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Deleted allocations for instance b62bae97-ef9e-4f29-8e84-9731b7b2dc32#033[00m
Oct  8 13:06:48 np0005476733 nova_compute[192580]: 2025-10-08 17:06:48.825 2 DEBUG oslo_concurrency.lockutils [None req-f29fc106-9a4a-4e34-828b-4090bf62035c c7e76ef370424761ad76d4119e9bf895 1bfc3ffe3bf745dfbb59f63a07f1e1a9 - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.447 2 DEBUG nova.compute.manager [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.448 2 DEBUG oslo_concurrency.lockutils [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.449 2 DEBUG oslo_concurrency.lockutils [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.450 2 DEBUG oslo_concurrency.lockutils [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "b62bae97-ef9e-4f29-8e84-9731b7b2dc32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.450 2 DEBUG nova.compute.manager [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] No waiting events found dispatching network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:06:49 np0005476733 nova_compute[192580]: 2025-10-08 17:06:49.450 2 WARNING nova.compute.manager [req-9e7dadae-b736-4a75-8fb0-acd8afbe146c req-0919e01b-2f5c-4617-8d3f-5f5b2025e01a 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Received unexpected event network-vif-plugged-515844fc-ac1c-4a41-afdb-496edaf8d2ad for instance with vm_state deleted and task_state None.#033[00m
Oct  8 13:06:51 np0005476733 nova_compute[192580]: 2025-10-08 17:06:51.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:51 np0005476733 nova_compute[192580]: 2025-10-08 17:06:51.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:55Z|00256|pinctrl|WARN|Dropped 327 log messages in last 59 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 13:06:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:06:55Z|00257|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:06:55 np0005476733 podman[280388]: 2025-10-08 17:06:55.234364971 +0000 UTC m=+0.056865016 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 13:06:56 np0005476733 nova_compute[192580]: 2025-10-08 17:06:56.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:56 np0005476733 nova_compute[192580]: 2025-10-08 17:06:56.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:58 np0005476733 podman[280408]: 2025-10-08 17:06:58.257852829 +0000 UTC m=+0.074970916 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 13:06:59 np0005476733 nova_compute[192580]: 2025-10-08 17:06:59.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:06:59 np0005476733 nova_compute[192580]: 2025-10-08 17:06:59.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:01 np0005476733 podman[280436]: 2025-10-08 17:07:01.237143356 +0000 UTC m=+0.066913229 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 13:07:01 np0005476733 nova_compute[192580]: 2025-10-08 17:07:01.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:01 np0005476733 nova_compute[192580]: 2025-10-08 17:07:01.432 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759943206.4303713, b62bae97-ef9e-4f29-8e84-9731b7b2dc32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:07:01 np0005476733 nova_compute[192580]: 2025-10-08 17:07:01.432 2 INFO nova.compute.manager [-] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] VM Stopped (Lifecycle Event)#033[00m
Oct  8 13:07:01 np0005476733 nova_compute[192580]: 2025-10-08 17:07:01.453 2 DEBUG nova.compute.manager [None req-e933e6f6-9047-4b07-95e5-2ccef638e9cf - - - - - -] [instance: b62bae97-ef9e-4f29-8e84-9731b7b2dc32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:07:01 np0005476733 nova_compute[192580]: 2025-10-08 17:07:01.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:06 np0005476733 nova_compute[192580]: 2025-10-08 17:07:06.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:06 np0005476733 nova_compute[192580]: 2025-10-08 17:07:06.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:07 np0005476733 nova_compute[192580]: 2025-10-08 17:07:07.605 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:07 np0005476733 nova_compute[192580]: 2025-10-08 17:07:07.605 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:07:09 np0005476733 nova_compute[192580]: 2025-10-08 17:07:09.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:10 np0005476733 podman[280458]: 2025-10-08 17:07:10.22935045 +0000 UTC m=+0.055765142 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:07:10 np0005476733 podman[280459]: 2025-10-08 17:07:10.236938693 +0000 UTC m=+0.063968965 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 13:07:10 np0005476733 podman[280457]: 2025-10-08 17:07:10.247144258 +0000 UTC m=+0.072031341 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 13:07:11 np0005476733 nova_compute[192580]: 2025-10-08 17:07:11.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:11 np0005476733 nova_compute[192580]: 2025-10-08 17:07:11.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:16 np0005476733 podman[280521]: 2025-10-08 17:07:16.22198592 +0000 UTC m=+0.055749582 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:07:16 np0005476733 podman[280522]: 2025-10-08 17:07:16.226622568 +0000 UTC m=+0.052656363 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:07:16 np0005476733 nova_compute[192580]: 2025-10-08 17:07:16.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:16 np0005476733 nova_compute[192580]: 2025-10-08 17:07:16.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:16 np0005476733 nova_compute[192580]: 2025-10-08 17:07:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:19 np0005476733 nova_compute[192580]: 2025-10-08 17:07:19.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:21 np0005476733 nova_compute[192580]: 2025-10-08 17:07:21.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:21 np0005476733 nova_compute[192580]: 2025-10-08 17:07:21.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:22 np0005476733 nova_compute[192580]: 2025-10-08 17:07:22.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:22 np0005476733 nova_compute[192580]: 2025-10-08 17:07:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:07:22 np0005476733 nova_compute[192580]: 2025-10-08 17:07:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:07:22 np0005476733 nova_compute[192580]: 2025-10-08 17:07:22.610 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:07:24 np0005476733 nova_compute[192580]: 2025-10-08 17:07:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:26 np0005476733 podman[280565]: 2025-10-08 17:07:26.222915061 +0000 UTC m=+0.050914306 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 13:07:26 np0005476733 nova_compute[192580]: 2025-10-08 17:07:26.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:26.444 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:26.445 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:07:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:26.445 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:07:26 np0005476733 nova_compute[192580]: 2025-10-08 17:07:26.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:28 np0005476733 nova_compute[192580]: 2025-10-08 17:07:28.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:28.986 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=98, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=97) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:07:28 np0005476733 nova_compute[192580]: 2025-10-08 17:07:28.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:28 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:28.987 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:07:29 np0005476733 podman[280584]: 2025-10-08 17:07:29.234963884 +0000 UTC m=+0.069321585 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:07:31 np0005476733 nova_compute[192580]: 2025-10-08 17:07:31.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:31 np0005476733 nova_compute[192580]: 2025-10-08 17:07:31.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:31 np0005476733 nova_compute[192580]: 2025-10-08 17:07:31.605 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:32 np0005476733 podman[280611]: 2025-10-08 17:07:32.22706669 +0000 UTC m=+0.051213927 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.763 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.764 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13659MB free_disk=111.30170059204102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.764 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.764 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.848 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.848 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.870 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.891 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.891 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.910 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.952 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.972 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:07:32 np0005476733 nova_compute[192580]: 2025-10-08 17:07:32.986 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:07:33 np0005476733 nova_compute[192580]: 2025-10-08 17:07:33.024 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:07:33 np0005476733 nova_compute[192580]: 2025-10-08 17:07:33.024 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:07:35 np0005476733 systemd-logind[827]: New session 170 of user zuul.
Oct  8 13:07:35 np0005476733 systemd[1]: Started Session 170 of User zuul.
Oct  8 13:07:35 np0005476733 systemd[1]: session-170.scope: Deactivated successfully.
Oct  8 13:07:35 np0005476733 systemd-logind[827]: Session 170 logged out. Waiting for processes to exit.
Oct  8 13:07:35 np0005476733 systemd-logind[827]: Removed session 170.
Oct  8 13:07:36 np0005476733 nova_compute[192580]: 2025-10-08 17:07:36.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:36 np0005476733 nova_compute[192580]: 2025-10-08 17:07:36.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:07:38.989 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '98'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:07:40 np0005476733 nova_compute[192580]: 2025-10-08 17:07:40.026 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:41 np0005476733 ovn_controller[263831]: 2025-10-08T17:07:41Z|00258|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 13:07:41 np0005476733 podman[280660]: 2025-10-08 17:07:41.243232559 +0000 UTC m=+0.067838018 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:07:41 np0005476733 podman[280659]: 2025-10-08 17:07:41.253949811 +0000 UTC m=+0.080976257 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  8 13:07:41 np0005476733 podman[280661]: 2025-10-08 17:07:41.280589973 +0000 UTC m=+0.099470099 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 13:07:41 np0005476733 nova_compute[192580]: 2025-10-08 17:07:41.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:41 np0005476733 nova_compute[192580]: 2025-10-08 17:07:41.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:45 np0005476733 nova_compute[192580]: 2025-10-08 17:07:45.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:45 np0005476733 nova_compute[192580]: 2025-10-08 17:07:45.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 13:07:45 np0005476733 nova_compute[192580]: 2025-10-08 17:07:45.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 13:07:46 np0005476733 nova_compute[192580]: 2025-10-08 17:07:46.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:46 np0005476733 nova_compute[192580]: 2025-10-08 17:07:46.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:47 np0005476733 podman[280717]: 2025-10-08 17:07:47.218979738 +0000 UTC m=+0.050676700 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 13:07:47 np0005476733 podman[280718]: 2025-10-08 17:07:47.219160254 +0000 UTC m=+0.047800709 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:07:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:07:51Z|00259|pinctrl|WARN|Dropped 283 log messages in last 56 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 13:07:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:07:51Z|00260|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:07:51 np0005476733 nova_compute[192580]: 2025-10-08 17:07:51.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:51 np0005476733 nova_compute[192580]: 2025-10-08 17:07:51.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:54 np0005476733 nova_compute[192580]: 2025-10-08 17:07:54.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:07:56 np0005476733 nova_compute[192580]: 2025-10-08 17:07:56.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:56 np0005476733 nova_compute[192580]: 2025-10-08 17:07:56.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:07:57 np0005476733 podman[280763]: 2025-10-08 17:07:57.21700663 +0000 UTC m=+0.047627143 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:08:00 np0005476733 podman[280784]: 2025-10-08 17:08:00.241667455 +0000 UTC m=+0.071444873 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:08:01 np0005476733 nova_compute[192580]: 2025-10-08 17:08:01.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:01 np0005476733 nova_compute[192580]: 2025-10-08 17:08:01.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:03 np0005476733 podman[280813]: 2025-10-08 17:08:03.230365611 +0000 UTC m=+0.062826427 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm)
Oct  8 13:08:06 np0005476733 nova_compute[192580]: 2025-10-08 17:08:06.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:06 np0005476733 nova_compute[192580]: 2025-10-08 17:08:06.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:08 np0005476733 nova_compute[192580]: 2025-10-08 17:08:08.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:08 np0005476733 nova_compute[192580]: 2025-10-08 17:08:08.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:08:10 np0005476733 nova_compute[192580]: 2025-10-08 17:08:10.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:11 np0005476733 nova_compute[192580]: 2025-10-08 17:08:11.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:11 np0005476733 nova_compute[192580]: 2025-10-08 17:08:11.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:12 np0005476733 podman[280840]: 2025-10-08 17:08:12.242297086 +0000 UTC m=+0.057641011 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:08:12 np0005476733 podman[280839]: 2025-10-08 17:08:12.244365353 +0000 UTC m=+0.064793391 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:08:12 np0005476733 podman[280841]: 2025-10-08 17:08:12.249760555 +0000 UTC m=+0.060728741 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Oct  8 13:08:16 np0005476733 nova_compute[192580]: 2025-10-08 17:08:16.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:16 np0005476733 nova_compute[192580]: 2025-10-08 17:08:16.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:17 np0005476733 nova_compute[192580]: 2025-10-08 17:08:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:18 np0005476733 podman[280904]: 2025-10-08 17:08:18.24412203 +0000 UTC m=+0.069200023 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:08:18 np0005476733 podman[280903]: 2025-10-08 17:08:18.245388629 +0000 UTC m=+0.074813290 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct  8 13:08:19 np0005476733 nova_compute[192580]: 2025-10-08 17:08:19.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:21 np0005476733 nova_compute[192580]: 2025-10-08 17:08:21.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:21 np0005476733 nova_compute[192580]: 2025-10-08 17:08:21.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:23 np0005476733 nova_compute[192580]: 2025-10-08 17:08:23.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:23 np0005476733 nova_compute[192580]: 2025-10-08 17:08:23.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:08:23 np0005476733 nova_compute[192580]: 2025-10-08 17:08:23.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:08:23 np0005476733 nova_compute[192580]: 2025-10-08 17:08:23.609 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:08:24 np0005476733 nova_compute[192580]: 2025-10-08 17:08:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:26.445 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:26.446 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:26.446 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:26 np0005476733 nova_compute[192580]: 2025-10-08 17:08:26.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:26 np0005476733 nova_compute[192580]: 2025-10-08 17:08:26.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:28 np0005476733 podman[280950]: 2025-10-08 17:08:28.245376881 +0000 UTC m=+0.075682829 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 13:08:31 np0005476733 podman[280969]: 2025-10-08 17:08:31.240984919 +0000 UTC m=+0.075161152 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 13:08:31 np0005476733 nova_compute[192580]: 2025-10-08 17:08:31.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:31 np0005476733 nova_compute[192580]: 2025-10-08 17:08:31.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:31 np0005476733 nova_compute[192580]: 2025-10-08 17:08:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.768 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.769 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13655MB free_disk=111.30164337158203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.769 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.769 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.833 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.833 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.858 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.872 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.873 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:08:33 np0005476733 nova_compute[192580]: 2025-10-08 17:08:33.873 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:34 np0005476733 podman[280995]: 2025-10-08 17:08:34.228155637 +0000 UTC m=+0.059242813 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:08:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:08:36 np0005476733 nova_compute[192580]: 2025-10-08 17:08:36.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:36 np0005476733 nova_compute[192580]: 2025-10-08 17:08:36.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:39 np0005476733 nova_compute[192580]: 2025-10-08 17:08:39.873 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:08:41 np0005476733 nova_compute[192580]: 2025-10-08 17:08:41.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:41 np0005476733 nova_compute[192580]: 2025-10-08 17:08:41.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:41 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:41Z|00261|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Oct  8 13:08:43 np0005476733 podman[281016]: 2025-10-08 17:08:43.226371602 +0000 UTC m=+0.047745096 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:08:43 np0005476733 podman[281017]: 2025-10-08 17:08:43.243742007 +0000 UTC m=+0.067120475 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 13:08:43 np0005476733 podman[281015]: 2025-10-08 17:08:43.266944338 +0000 UTC m=+0.086272257 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Oct  8 13:08:46 np0005476733 nova_compute[192580]: 2025-10-08 17:08:46.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:46 np0005476733 nova_compute[192580]: 2025-10-08 17:08:46.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:49 np0005476733 podman[281079]: 2025-10-08 17:08:49.229263801 +0000 UTC m=+0.063469539 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct  8 13:08:49 np0005476733 podman[281080]: 2025-10-08 17:08:49.236240354 +0000 UTC m=+0.067327362 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:08:51 np0005476733 nova_compute[192580]: 2025-10-08 17:08:51.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:51 np0005476733 nova_compute[192580]: 2025-10-08 17:08:51.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.043 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.043 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.062 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.152 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.152 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.162 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.163 2 INFO nova.compute.claims [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.295 2 DEBUG nova.compute.provider_tree [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.315 2 DEBUG nova.scheduler.client.report [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.335 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.336 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.389 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.390 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.412 2 INFO nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.431 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.532 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.533 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.534 2 INFO nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Creating image(s)#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.534 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.535 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.535 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.549 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.605 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.606 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.607 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.623 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.676 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.677 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk 10737418240 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.724 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e,backing_fmt=raw /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk 10737418240" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.725 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.725 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.792 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.793 2 DEBUG nova.objects.instance [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lazy-loading 'migration_context' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.815 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.816 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Ensure instance console log exists: /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.816 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.817 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:52 np0005476733 nova_compute[192580]: 2025-10-08 17:08:52.817 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:53 np0005476733 nova_compute[192580]: 2025-10-08 17:08:53.141 2 DEBUG nova.policy [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 13:08:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:54Z|00262|pinctrl|WARN|Dropped 219 log messages in last 64 seconds (most recently, 13 seconds ago) due to excessive rate
Oct  8 13:08:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:54Z|00263|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:08:54 np0005476733 nova_compute[192580]: 2025-10-08 17:08:54.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:54.454 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=99, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=98) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:08:54 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:54.457 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:08:55 np0005476733 nova_compute[192580]: 2025-10-08 17:08:55.168 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Successfully created port: a232f057-6f16-4161-bffc-b743b97e5d1f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.306 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Successfully updated port: a232f057-6f16-4161-bffc-b743b97e5d1f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.326 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.326 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.326 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.432 2 DEBUG nova.compute.manager [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-changed-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.433 2 DEBUG nova.compute.manager [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing instance network info cache due to event network-changed-a232f057-6f16-4161-bffc-b743b97e5d1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.433 2 DEBUG oslo_concurrency.lockutils [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:08:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:56.459 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '99'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:56 np0005476733 nova_compute[192580]: 2025-10-08 17:08:56.502 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.149 2 DEBUG nova.network.neutron [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.173 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.173 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Instance network_info: |[{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.174 2 DEBUG oslo_concurrency.lockutils [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.174 2 DEBUG nova.network.neutron [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing network info cache for port a232f057-6f16-4161-bffc-b743b97e5d1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.180 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Start _get_guest_xml network_info=[{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_secret_uuid': None, 'encrypted': False, 'size': 0, 'encryption_options': None, 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': '11111111-1111-1111-1111-111111111111'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.186 2 WARNING nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.193 2 DEBUG nova.virt.libvirt.host [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.194 2 DEBUG nova.virt.libvirt.host [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.200 2 DEBUG nova.virt.libvirt.host [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.201 2 DEBUG nova.virt.libvirt.host [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.202 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.202 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T15:17:29Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='22222222-2222-2222-2222-222222222222',id=1,is_public=True,memory_mb=1024,name='custom_neutron_guest',projects=<?>,root_gb=10,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='1f2c1262d5f154b977e91c4cfcf989f9',container_format='bare',created_at=2025-10-08T15:16:45Z,direct_url=<?>,disk_format='qcow2',id=11111111-1111-1111-1111-111111111111,min_disk=0,min_ram=0,name='custom_neutron_guest',owner='daecd871adaa4a7ba72129f7b1a03cd9',properties=ImageMetaProps,protected=<?>,size=1180893184,status='active',tags=<?>,updated_at=2025-10-08T15:17:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.202 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.202 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.203 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.203 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.203 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.203 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.204 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.204 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.204 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.205 2 DEBUG nova.virt.hardware [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.209 2 DEBUG nova.virt.libvirt.vif [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T17:08:52Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.210 2 DEBUG nova.network.os_vif_util [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.210 2 DEBUG nova.network.os_vif_util [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.211 2 DEBUG nova.objects.instance [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lazy-loading 'pci_devices' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.233 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] End _get_guest_xml xml=<domain type="kvm">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <uuid>d2559ab1-2621-46b4-8e3c-568cbff22376</uuid>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <name>instance-00000074</name>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <memory>1048576</memory>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <vcpu>1</vcpu>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <metadata>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:name>tempest-test_dscp_bwlimit_external_network-1818989422</nova:name>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:creationTime>2025-10-08 17:08:58</nova:creationTime>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:flavor name="custom_neutron_guest">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:memory>1024</nova:memory>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:disk>10</nova:disk>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:swap>0</nova:swap>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:vcpus>1</nova:vcpus>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      </nova:flavor>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:owner>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:user uuid="96fd231581074677b87116ed9773b06d">tempest-QosTestExternalNetwork-1440675542-project-member</nova:user>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:project uuid="c64130dbc8f84b8db2db002a5e499a8b">tempest-QosTestExternalNetwork-1440675542</nova:project>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      </nova:owner>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <nova:ports>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        <nova:port uuid="a232f057-6f16-4161-bffc-b743b97e5d1f">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:          <nova:ip type="fixed" address="192.168.122.237" ipVersion="4"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:        </nova:port>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      </nova:ports>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </nova:instance>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </metadata>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <sysinfo type="smbios">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <system>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="manufacturer">RDO</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="product">OpenStack Compute</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="serial">d2559ab1-2621-46b4-8e3c-568cbff22376</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="uuid">d2559ab1-2621-46b4-8e3c-568cbff22376</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <entry name="family">Virtual Machine</entry>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </system>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </sysinfo>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <os>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <boot dev="hd"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <smbios mode="sysinfo"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </os>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <features>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <acpi/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <apic/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <vmcoreinfo/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </features>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <clock offset="utc">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <timer name="hpet" present="no"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </clock>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <cpu mode="host-model" match="exact">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </cpu>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  <devices>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <disk type="file" device="disk">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <target dev="vda" bus="virtio"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </disk>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <disk type="file" device="cdrom">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <source file="/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.config"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <target dev="sda" bus="sata"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </disk>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <interface type="ethernet">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <mac address="fa:16:3e:10:96:59"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <mtu size="1400"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <target dev="tapa232f057-6f"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </interface>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <serial type="pty">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <log file="/var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/console.log" append="off"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </serial>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <video>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <model type="virtio"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </video>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <input type="tablet" bus="usb"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <rng model="virtio">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <backend model="random">/dev/urandom</backend>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </rng>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <controller type="usb" index="0"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    <memballoon model="virtio">
Oct  8 13:08:58 np0005476733 nova_compute[192580]:      <stats period="10"/>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:    </memballoon>
Oct  8 13:08:58 np0005476733 nova_compute[192580]:  </devices>
Oct  8 13:08:58 np0005476733 nova_compute[192580]: </domain>
Oct  8 13:08:58 np0005476733 nova_compute[192580]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.233 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Preparing to wait for external event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.234 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.234 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.235 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.237 2 DEBUG nova.virt.libvirt.vif [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T17:08:52Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.237 2 DEBUG nova.network.os_vif_util [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.239 2 DEBUG nova.network.os_vif_util [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.239 2 DEBUG os_vif [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.241 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.242 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa232f057-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa232f057-6f, col_values=(('external_ids', {'iface-id': 'a232f057-6f16-4161-bffc-b743b97e5d1f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:96:59', 'vm-uuid': 'd2559ab1-2621-46b4-8e3c-568cbff22376'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.261 2 INFO os_vif [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f')#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.325 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.325 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.326 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No VIF found with MAC fa:16:3e:10:96:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 13:08:58 np0005476733 nova_compute[192580]: 2025-10-08 17:08:58.326 2 INFO nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Using config drive#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.040 2 INFO nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Creating config drive at /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.config#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.046 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplq62t5km execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.178 2 DEBUG oslo_concurrency.processutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplq62t5km" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:08:59 np0005476733 podman[281140]: 2025-10-08 17:08:59.254297153 +0000 UTC m=+0.077600769 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:08:59 np0005476733 kernel: tapa232f057-6f: entered promiscuous mode
Oct  8 13:08:59 np0005476733 NetworkManager[51699]: <info>  [1759943339.2604] manager: (tapa232f057-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:59Z|00264|binding|INFO|Claiming lport a232f057-6f16-4161-bffc-b743b97e5d1f for this chassis.
Oct  8 13:08:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:59Z|00265|binding|INFO|a232f057-6f16-4161-bffc-b743b97e5d1f: Claiming fa:16:3e:10:96:59 192.168.122.237
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.280 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:96:59 192.168.122.237'], port_security=['fa:16:3e:10:96:59 192.168.122.237'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.237/24', 'neutron:device_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3cc7e400-7bea-4a8c-b1db-0f3cb058bcbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=a232f057-6f16-4161-bffc-b743b97e5d1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.281 103739 INFO neutron.agent.ovn.metadata.agent [-] Port a232f057-6f16-4161-bffc-b743b97e5d1f in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 bound to our chassis#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.282 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81c575b5-ac88-40d3-8b00-79c5c936eec4#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.293 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6a217e-cfd8-4872-b9db-10cf2fdd3d78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.294 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81c575b5-a1 in ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 13:08:59 np0005476733 systemd-udevd[281174]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.297 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81c575b5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.297 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[8c050092-16dc-42e9-8c0c-c9ebe1c8f6e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 systemd-machined[152624]: New machine qemu-70-instance-00000074.
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.297 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5714be03-ee3b-4f67-b693-9de9e1710308]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 systemd[1]: Started Virtual Machine qemu-70-instance-00000074.
Oct  8 13:08:59 np0005476733 NetworkManager[51699]: <info>  [1759943339.3092] device (tapa232f057-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 13:08:59 np0005476733 NetworkManager[51699]: <info>  [1759943339.3101] device (tapa232f057-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.313 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[0a22c6b7-e917-42b0-9f67-1a6ec396eb0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.343 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0e4fde-99c4-4950-b0a0-a12780acd34a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.374 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf8f08f-8dfa-4cc5-af2e-f6aaf83a1a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.384 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[b53befef-9831-48a3-ac00-99a625ef8c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:59Z|00266|binding|INFO|Setting lport a232f057-6f16-4161-bffc-b743b97e5d1f ovn-installed in OVS
Oct  8 13:08:59 np0005476733 NetworkManager[51699]: <info>  [1759943339.3850] manager: (tap81c575b5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/331)
Oct  8 13:08:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:59Z|00267|binding|INFO|Setting lport a232f057-6f16-4161-bffc-b743b97e5d1f up in Southbound
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.417 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[43d8575e-c641-44f0-9abf-6506cc37ad5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.420 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[65252c21-57ef-4054-8e17-ca9b3dbc4443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 NetworkManager[51699]: <info>  [1759943339.4399] device (tap81c575b5-a0): carrier: link connected
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.446 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ef5ddb-d9c2-434e-a54a-761662f33364]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.561 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa4638e-6f41-4417-8d92-518b772761f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1024310, 'reachable_time': 39793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281206, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.574 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[71656ebc-f282-451e-8d66-8ae91ea1eb8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:bf12'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1024310, 'tstamp': 1024310}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281207, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.591 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5d92d721-4bf9-4d8a-8abb-0f01a6fd4ea8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81c575b5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:bf:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1024310, 'reachable_time': 39793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281208, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.616 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[021e226e-f870-4a61-aaa0-da7268f01805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.668 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[2088d51c-ca6b-4c7c-8170-f305c3f4241d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.669 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.669 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.669 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81c575b5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 kernel: tap81c575b5-a0: entered promiscuous mode
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.674 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81c575b5-a0, col_values=(('external_ids', {'iface-id': '3737b929-673d-4d30-a674-dbb8c6c2e54d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:08:59Z|00268|binding|INFO|Releasing lport 3737b929-673d-4d30-a674-dbb8c6c2e54d from this chassis (sb_readonly=0)
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.677 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.678 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[da1d65a6-1151-4af4-8375-447f030f7042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.679 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/81c575b5-ac88-40d3-8b00-79c5c936eec4.pid.haproxy
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 81c575b5-ac88-40d3-8b00-79c5c936eec4
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 13:08:59 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:08:59.681 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'env', 'PROCESS_TAG=haproxy-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81c575b5-ac88-40d3-8b00-79c5c936eec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.738 2 DEBUG nova.compute.manager [req-e6cce804-54f3-4239-9ba9-375a5ab2b9fe req-28cd2d1c-d66e-4a3b-8714-5c0703fe864c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.739 2 DEBUG oslo_concurrency.lockutils [req-e6cce804-54f3-4239-9ba9-375a5ab2b9fe req-28cd2d1c-d66e-4a3b-8714-5c0703fe864c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.739 2 DEBUG oslo_concurrency.lockutils [req-e6cce804-54f3-4239-9ba9-375a5ab2b9fe req-28cd2d1c-d66e-4a3b-8714-5c0703fe864c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.739 2 DEBUG oslo_concurrency.lockutils [req-e6cce804-54f3-4239-9ba9-375a5ab2b9fe req-28cd2d1c-d66e-4a3b-8714-5c0703fe864c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:08:59 np0005476733 nova_compute[192580]: 2025-10-08 17:08:59.739 2 DEBUG nova.compute.manager [req-e6cce804-54f3-4239-9ba9-375a5ab2b9fe req-28cd2d1c-d66e-4a3b-8714-5c0703fe864c 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Processing event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 13:09:00 np0005476733 podman[281240]: 2025-10-08 17:09:00.009988659 +0000 UTC m=+0.048717957 container create 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 13:09:00 np0005476733 systemd[1]: Started libpod-conmon-9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe.scope.
Oct  8 13:09:00 np0005476733 systemd[1]: Started libcrun container.
Oct  8 13:09:00 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/019c0282c33939bea2ee46f124818d2b211a453aab5aeaa348b8aa7524c91c29/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 13:09:00 np0005476733 podman[281240]: 2025-10-08 17:09:00.065974467 +0000 UTC m=+0.104703775 container init 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 13:09:00 np0005476733 podman[281240]: 2025-10-08 17:09:00.070806332 +0000 UTC m=+0.109535620 container start 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:09:00 np0005476733 podman[281240]: 2025-10-08 17:08:59.98090261 +0000 UTC m=+0.019631918 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 13:09:00 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [NOTICE]   (281259) : New worker (281261) forked
Oct  8 13:09:00 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [NOTICE]   (281259) : Loading success.
Oct  8 13:09:00 np0005476733 nova_compute[192580]: 2025-10-08 17:09:00.234 2 DEBUG nova.network.neutron [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updated VIF entry in instance network info cache for port a232f057-6f16-4161-bffc-b743b97e5d1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:09:00 np0005476733 nova_compute[192580]: 2025-10-08 17:09:00.234 2 DEBUG nova.network.neutron [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:09:00 np0005476733 nova_compute[192580]: 2025-10-08 17:09:00.251 2 DEBUG oslo_concurrency.lockutils [req-2764a4f6-1e40-44da-92f1-ed66125863ad req-eb140b3e-7f07-4c40-812d-4edc17c91100 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.014 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759943341.0135615, d2559ab1-2621-46b4-8e3c-568cbff22376 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.014 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] VM Started (Lifecycle Event)#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.016 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.018 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.021 2 INFO nova.virt.libvirt.driver [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Instance spawned successfully.#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.021 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.040 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.045 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.046 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.046 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.047 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.047 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.047 2 DEBUG nova.virt.libvirt.driver [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.052 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.079 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.079 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759943341.0145211, d2559ab1-2621-46b4-8e3c-568cbff22376 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.079 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] VM Paused (Lifecycle Event)#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.115 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.118 2 DEBUG nova.virt.driver [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] Emitting event <LifecycleEvent: 1759943341.0182347, d2559ab1-2621-46b4-8e3c-568cbff22376 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.118 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] VM Resumed (Lifecycle Event)#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.132 2 INFO nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Took 8.60 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.132 2 DEBUG nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.148 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.150 2 DEBUG nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.186 2 INFO nova.compute.manager [None req-7c1a3833-9689-412b-940c-e1338e345555 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.208 2 INFO nova.compute.manager [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Took 9.09 seconds to build instance.#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.224 2 DEBUG oslo_concurrency.lockutils [None req-e45f7413-76b6-4dfe-a1c9-da81ef50d863 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.845 2 DEBUG nova.compute.manager [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.846 2 DEBUG oslo_concurrency.lockutils [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.846 2 DEBUG oslo_concurrency.lockutils [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.846 2 DEBUG oslo_concurrency.lockutils [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.847 2 DEBUG nova.compute.manager [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:09:01 np0005476733 nova_compute[192580]: 2025-10-08 17:09:01.847 2 WARNING nova.compute.manager [req-fbc4cc5a-fd8c-4716-a321-c91f24b096ff req-d515e9d5-ec3c-4b49-baf1-6f4eb2e5f70e 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received unexpected event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f for instance with vm_state active and task_state None.#033[00m
Oct  8 13:09:02 np0005476733 podman[281277]: 2025-10-08 17:09:02.349483051 +0000 UTC m=+0.161504949 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 13:09:02 np0005476733 nova_compute[192580]: 2025-10-08 17:09:02.491 2 INFO nova.compute.manager [None req-c0f93e54-b7e9-4586-884e-f34c0a07818b 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:02 np0005476733 nova_compute[192580]: 2025-10-08 17:09:02.498 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:09:03 np0005476733 nova_compute[192580]: 2025-10-08 17:09:03.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:05 np0005476733 podman[281303]: 2025-10-08 17:09:05.236690806 +0000 UTC m=+0.064906124 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:09:06 np0005476733 nova_compute[192580]: 2025-10-08 17:09:06.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:07 np0005476733 nova_compute[192580]: 2025-10-08 17:09:07.645 2 INFO nova.compute.manager [None req-f50a8ddb-c4fa-4052-8498-be8d62a30660 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:08 np0005476733 nova_compute[192580]: 2025-10-08 17:09:08.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:10 np0005476733 nova_compute[192580]: 2025-10-08 17:09:10.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:10 np0005476733 nova_compute[192580]: 2025-10-08 17:09:10.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:10 np0005476733 nova_compute[192580]: 2025-10-08 17:09:10.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:09:11 np0005476733 nova_compute[192580]: 2025-10-08 17:09:11.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:12 np0005476733 nova_compute[192580]: 2025-10-08 17:09:12.876 2 INFO nova.compute.manager [None req-b216e3c4-22f5-4c8b-8dc8-55730824d1a1 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:12 np0005476733 nova_compute[192580]: 2025-10-08 17:09:12.883 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:09:13 np0005476733 nova_compute[192580]: 2025-10-08 17:09:13.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:14 np0005476733 podman[281326]: 2025-10-08 17:09:14.229399207 +0000 UTC m=+0.051146305 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:09:14 np0005476733 podman[281325]: 2025-10-08 17:09:14.2398547 +0000 UTC m=+0.061036780 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:09:14 np0005476733 podman[281327]: 2025-10-08 17:09:14.242145563 +0000 UTC m=+0.056784944 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Oct  8 13:09:16 np0005476733 nova_compute[192580]: 2025-10-08 17:09:16.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:18 np0005476733 nova_compute[192580]: 2025-10-08 17:09:18.098 2 INFO nova.compute.manager [None req-a87876de-d190-4d89-ac90-610a8aa7f176 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:18 np0005476733 nova_compute[192580]: 2025-10-08 17:09:18.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:18 np0005476733 nova_compute[192580]: 2025-10-08 17:09:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:20 np0005476733 podman[281395]: 2025-10-08 17:09:20.244758143 +0000 UTC m=+0.061703992 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:09:20 np0005476733 podman[281396]: 2025-10-08 17:09:20.262920524 +0000 UTC m=+0.078257161 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:09:20 np0005476733 nova_compute[192580]: 2025-10-08 17:09:20.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:21 np0005476733 nova_compute[192580]: 2025-10-08 17:09:21.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:23 np0005476733 nova_compute[192580]: 2025-10-08 17:09:23.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:23 np0005476733 nova_compute[192580]: 2025-10-08 17:09:23.308 2 INFO nova.compute.manager [None req-eb2aa429-8e69-4c2c-bb7c-62c18caae9e2 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:24 np0005476733 ovn_controller[263831]: 2025-10-08T17:09:24Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:96:59 192.168.122.237
Oct  8 13:09:24 np0005476733 ovn_controller[263831]: 2025-10-08T17:09:24Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:96:59 192.168.122.237
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.995 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.995 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.995 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:09:25 np0005476733 nova_compute[192580]: 2025-10-08 17:09:25.995 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:26.447 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:26.447 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:09:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:26.448 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:09:26 np0005476733 nova_compute[192580]: 2025-10-08 17:09:26.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:27 np0005476733 nova_compute[192580]: 2025-10-08 17:09:27.865 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.191 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.191 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.191 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.723 2 INFO nova.compute.manager [None req-f4f1c0dd-dfea-4c27-aee3-f7e13d06d339 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.728 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:09:28 np0005476733 nova_compute[192580]: 2025-10-08 17:09:28.731 2 INFO nova.virt.libvirt.driver [None req-f4f1c0dd-dfea-4c27-aee3-f7e13d06d339 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Truncated console log returned, 2981 bytes ignored#033[00m
Oct  8 13:09:29 np0005476733 ovn_controller[263831]: 2025-10-08T17:09:29Z|00269|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct  8 13:09:30 np0005476733 podman[281440]: 2025-10-08 17:09:30.223802687 +0000 UTC m=+0.050928718 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 13:09:31 np0005476733 nova_compute[192580]: 2025-10-08 17:09:31.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:31 np0005476733 nova_compute[192580]: 2025-10-08 17:09:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:33 np0005476733 podman[281470]: 2025-10-08 17:09:33.259229675 +0000 UTC m=+0.088642752 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:09:33 np0005476733 nova_compute[192580]: 2025-10-08 17:09:33.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:33 np0005476733 nova_compute[192580]: 2025-10-08 17:09:33.976 2 INFO nova.compute.manager [None req-50d90418-9b44-4022-ad66-fb7000e36a83 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Get console output#033[00m
Oct  8 13:09:33 np0005476733 nova_compute[192580]: 2025-10-08 17:09:33.980 52 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 13:09:33 np0005476733 nova_compute[192580]: 2025-10-08 17:09:33.984 2 INFO nova.virt.libvirt.driver [None req-50d90418-9b44-4022-ad66-fb7000e36a83 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Truncated console log returned, 3347 bytes ignored#033[00m
Oct  8 13:09:34 np0005476733 nova_compute[192580]: 2025-10-08 17:09:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:34 np0005476733 nova_compute[192580]: 2025-10-08 17:09:34.776 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:09:34 np0005476733 nova_compute[192580]: 2025-10-08 17:09:34.777 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:09:34 np0005476733 nova_compute[192580]: 2025-10-08 17:09:34.777 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:09:34 np0005476733 nova_compute[192580]: 2025-10-08 17:09:34.778 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.083 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.140 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.141 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.199 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.380 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.382 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12928MB free_disk=111.21819305419922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.382 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.383 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.737 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d2559ab1-2621-46b4-8e3c-568cbff22376 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.738 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.738 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.778 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:09:35 np0005476733 nova_compute[192580]: 2025-10-08 17:09:35.853 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:09:36 np0005476733 podman[281512]: 2025-10-08 17:09:36.228002406 +0000 UTC m=+0.057291351 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:09:36 np0005476733 nova_compute[192580]: 2025-10-08 17:09:36.239 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:09:36 np0005476733 nova_compute[192580]: 2025-10-08 17:09:36.240 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:09:36 np0005476733 nova_compute[192580]: 2025-10-08 17:09:36.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:38 np0005476733 nova_compute[192580]: 2025-10-08 17:09:38.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:41 np0005476733 nova_compute[192580]: 2025-10-08 17:09:41.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:43 np0005476733 nova_compute[192580]: 2025-10-08 17:09:43.240 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:43 np0005476733 nova_compute[192580]: 2025-10-08 17:09:43.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:45 np0005476733 podman[281534]: 2025-10-08 17:09:45.222993919 +0000 UTC m=+0.050442043 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:09:45 np0005476733 podman[281533]: 2025-10-08 17:09:45.232462202 +0000 UTC m=+0.061982661 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Oct  8 13:09:45 np0005476733 podman[281535]: 2025-10-08 17:09:45.238010359 +0000 UTC m=+0.060755933 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_id=edpm, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Oct  8 13:09:46 np0005476733 nova_compute[192580]: 2025-10-08 17:09:46.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:48 np0005476733 nova_compute[192580]: 2025-10-08 17:09:48.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:50 np0005476733 ovn_controller[263831]: 2025-10-08T17:09:50Z|00270|pinctrl|WARN|Dropped 203 log messages in last 56 seconds (most recently, 4 seconds ago) due to excessive rate
Oct  8 13:09:50 np0005476733 ovn_controller[263831]: 2025-10-08T17:09:50Z|00271|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:09:51 np0005476733 podman[281606]: 2025-10-08 17:09:51.233020955 +0000 UTC m=+0.048562913 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:09:51 np0005476733 podman[281605]: 2025-10-08 17:09:51.233004384 +0000 UTC m=+0.051157645 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:09:51 np0005476733 nova_compute[192580]: 2025-10-08 17:09:51.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:53 np0005476733 nova_compute[192580]: 2025-10-08 17:09:53.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:54 np0005476733 nova_compute[192580]: 2025-10-08 17:09:54.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:09:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:55.372 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=100, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=99) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:09:55 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:55.372 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:09:55 np0005476733 nova_compute[192580]: 2025-10-08 17:09:55.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:56 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:09:56.375 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '100'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:09:56 np0005476733 nova_compute[192580]: 2025-10-08 17:09:56.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:57 np0005476733 nova_compute[192580]: 2025-10-08 17:09:57.641 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "interface-d2559ab1-2621-46b4-8e3c-568cbff22376-6a2cde05-dbe2-4f08-94d1-646f8139b49e" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:09:57 np0005476733 nova_compute[192580]: 2025-10-08 17:09:57.642 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "interface-d2559ab1-2621-46b4-8e3c-568cbff22376-6a2cde05-dbe2-4f08-94d1-646f8139b49e" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:09:57 np0005476733 nova_compute[192580]: 2025-10-08 17:09:57.642 2 DEBUG nova.objects.instance [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lazy-loading 'flavor' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:09:58 np0005476733 nova_compute[192580]: 2025-10-08 17:09:58.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:09:59 np0005476733 nova_compute[192580]: 2025-10-08 17:09:59.034 2 DEBUG nova.objects.instance [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lazy-loading 'pci_requests' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:09:59 np0005476733 nova_compute[192580]: 2025-10-08 17:09:59.050 2 DEBUG nova.network.neutron [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 13:10:01 np0005476733 nova_compute[192580]: 2025-10-08 17:10:01.160 2 DEBUG nova.policy [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 13:10:01 np0005476733 podman[281647]: 2025-10-08 17:10:01.21119798 +0000 UTC m=+0.046391542 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:10:01 np0005476733 nova_compute[192580]: 2025-10-08 17:10:01.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.823 2 DEBUG nova.network.neutron [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Successfully updated port: 6a2cde05-dbe2-4f08-94d1-646f8139b49e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.846 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.846 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.847 2 DEBUG nova.network.neutron [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.962 2 DEBUG nova.compute.manager [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-changed-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.962 2 DEBUG nova.compute.manager [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing instance network info cache due to event network-changed-6a2cde05-dbe2-4f08-94d1-646f8139b49e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:10:02 np0005476733 nova_compute[192580]: 2025-10-08 17:10:02.962 2 DEBUG oslo_concurrency.lockutils [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:10:03 np0005476733 nova_compute[192580]: 2025-10-08 17:10:03.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:04 np0005476733 podman[281668]: 2025-10-08 17:10:04.288968652 +0000 UTC m=+0.120335495 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:10:06 np0005476733 nova_compute[192580]: 2025-10-08 17:10:06.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.174 2 DEBUG nova.network.neutron [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": null, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.218 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.219 2 DEBUG oslo_concurrency.lockutils [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.219 2 DEBUG nova.network.neutron [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing network info cache for port 6a2cde05-dbe2-4f08-94d1-646f8139b49e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.222 2 DEBUG nova.virt.libvirt.vif [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=<?>,launch_index=0,launched_at=2025-10-08T17:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T17:09:01Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.222 2 DEBUG nova.network.os_vif_util [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.223 2 DEBUG nova.network.os_vif_util [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.223 2 DEBUG os_vif [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a2cde05-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a2cde05-db, col_values=(('external_ids', {'iface-id': '6a2cde05-dbe2-4f08-94d1-646f8139b49e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:48:8d', 'vm-uuid': 'd2559ab1-2621-46b4-8e3c-568cbff22376'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.241 2 INFO os_vif [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db')#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.242 2 DEBUG nova.virt.libvirt.vif [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=<?>,launch_index=0,launched_at=2025-10-08T17:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=<?>,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T17:09:01Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.242 2 DEBUG nova.network.os_vif_util [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.243 2 DEBUG nova.network.os_vif_util [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.245 2 DEBUG nova.virt.libvirt.guest [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] attach device xml: <interface type="ethernet">
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <mac address="fa:16:3e:0b:48:8d"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <model type="virtio"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <mtu size="1342"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <target dev="tap6a2cde05-db"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]: </interface>
Oct  8 13:10:07 np0005476733 nova_compute[192580]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 13:10:07 np0005476733 kernel: tap6a2cde05-db: entered promiscuous mode
Oct  8 13:10:07 np0005476733 NetworkManager[51699]: <info>  [1759943407.2604] manager: (tap6a2cde05-db): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00272|binding|INFO|Claiming lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e for this chassis.
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00273|binding|INFO|6a2cde05-dbe2-4f08-94d1-646f8139b49e: Claiming fa:16:3e:0b:48:8d 10.100.0.8
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 podman[281695]: 2025-10-08 17:10:07.273052881 +0000 UTC m=+0.084177929 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.281 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:48:8d 10.100.0.8'], port_security=['fa:16:3e:0b:48:8d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2213252c-94f9-4b54-86f1-ccaa6f15fcb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa5e840d-fbd9-4cb2-9d13-461aaf3b8865, chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6a2cde05-dbe2-4f08-94d1-646f8139b49e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.282 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6a2cde05-dbe2-4f08-94d1-646f8139b49e in datapath 23fdc065-0e6b-415e-9e71-7aa724ab9c52 bound to our chassis#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.284 103739 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23fdc065-0e6b-415e-9e71-7aa724ab9c52#033[00m
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00274|binding|INFO|Setting lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e up in Southbound
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00275|binding|INFO|Setting lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e ovn-installed in OVS
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 systemd-udevd[281719]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.303 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6b16a9-3812-49ea-8929-b09b0d8da23f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.304 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap23fdc065-01 in ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.306 221259 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap23fdc065-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.306 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f8343d-1a86-45ea-bca6-0b7abc07b3cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.307 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eef3a96f-6405-49f7-9eea-b677452c329b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 NetworkManager[51699]: <info>  [1759943407.3244] device (tap6a2cde05-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 13:10:07 np0005476733 NetworkManager[51699]: <info>  [1759943407.3258] device (tap6a2cde05-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.322 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff8b44a-e7df-4265-b132-b5f013ef4801]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.350 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[15e5931e-c7c5-4f20-99f4-6b03d5211fab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.394 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b2adb1-c227-4815-800b-ed91eb8f60f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 NetworkManager[51699]: <info>  [1759943407.4017] manager: (tap23fdc065-00): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.401 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[1d57d9d7-3110-4eba-9b63-4d8b50d068c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.411 2 DEBUG nova.virt.libvirt.driver [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.411 2 DEBUG nova.virt.libvirt.driver [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.411 2 DEBUG nova.virt.libvirt.driver [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No VIF found with MAC fa:16:3e:10:96:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.412 2 DEBUG nova.virt.libvirt.driver [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] No VIF found with MAC fa:16:3e:0b:48:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:48:8d 10.100.0.8
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:48:8d 10.100.0.8
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.433 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[c84d8f54-cfcd-4dcd-8fa6-783e6841cce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.436 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[44715758-37fe-45e5-9d86-b3658ecf3d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 NetworkManager[51699]: <info>  [1759943407.4620] device (tap23fdc065-00): carrier: link connected
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.466 221372 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e14e70-efb5-45bb-befd-4003a5932645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.480 2 DEBUG nova.virt.libvirt.guest [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:name>tempest-test_dscp_bwlimit_external_network-1818989422</nova:name>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:creationTime>2025-10-08 17:10:07</nova:creationTime>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:flavor name="custom_neutron_guest">
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:memory>1024</nova:memory>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:disk>10</nova:disk>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:swap>0</nova:swap>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:vcpus>1</nova:vcpus>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  </nova:flavor>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:owner>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:user uuid="96fd231581074677b87116ed9773b06d">tempest-QosTestExternalNetwork-1440675542-project-member</nova:user>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:project uuid="c64130dbc8f84b8db2db002a5e499a8b">tempest-QosTestExternalNetwork-1440675542</nova:project>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  </nova:owner>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:root type="image" uuid="11111111-1111-1111-1111-111111111111"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  <nova:ports>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:port uuid="a232f057-6f16-4161-bffc-b743b97e5d1f">
Oct  8 13:10:07 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="192.168.122.237" ipVersion="4"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    <nova:port uuid="6a2cde05-dbe2-4f08-94d1-646f8139b49e">
Oct  8 13:10:07 np0005476733 nova_compute[192580]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:    </nova:port>
Oct  8 13:10:07 np0005476733 nova_compute[192580]:  </nova:ports>
Oct  8 13:10:07 np0005476733 nova_compute[192580]: </nova:instance>
Oct  8 13:10:07 np0005476733 nova_compute[192580]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.485 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[002bf0e4-b125-4701-879c-ee2c4ffdb507]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23fdc065-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:eb:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031112, 'reachable_time': 33625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281746, 'error': None, 'target': 'ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.500 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[92cbf909-2f90-4c54-bb2e-9353004fd079]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:eba3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1031112, 'tstamp': 1031112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281747, 'error': None, 'target': 'ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.516 2 DEBUG oslo_concurrency.lockutils [None req-edec9f5d-6a60-4946-8496-a08a63602273 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "interface-d2559ab1-2621-46b4-8e3c-568cbff22376-6a2cde05-dbe2-4f08-94d1-646f8139b49e" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.521 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[13f10b5c-901d-425e-8909-4267c8b13503]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23fdc065-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:eb:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031112, 'reachable_time': 33625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281748, 'error': None, 'target': 'ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.549 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8fcc5e-73d3-40eb-9bf9-ae81bb6c20f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.610 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c65b07-4994-484b-8f56-c90059418ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.612 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23fdc065-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.612 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.612 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23fdc065-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 kernel: tap23fdc065-00: entered promiscuous mode
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.617 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23fdc065-00, col_values=(('external_ids', {'iface-id': 'fc5249ad-3766-4d14-89c2-877cebab3c42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:10:07 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:07Z|00276|binding|INFO|Releasing lport fc5249ad-3766-4d14-89c2-877cebab3c42 from this chassis (sb_readonly=0)
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.631 103739 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23fdc065-0e6b-415e-9e71-7aa724ab9c52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23fdc065-0e6b-415e-9e71-7aa724ab9c52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.632 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[0b952d04-fd3b-4fbf-885e-69b9887f0417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.633 103739 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: global
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    log         /dev/log local0 debug
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    log-tag     haproxy-metadata-proxy-23fdc065-0e6b-415e-9e71-7aa724ab9c52
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    user        root
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    group       root
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    maxconn     1024
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    pidfile     /var/lib/neutron/external/pids/23fdc065-0e6b-415e-9e71-7aa724ab9c52.pid.haproxy
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    daemon
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: defaults
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    log global
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    mode http
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    option httplog
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    option dontlognull
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    option http-server-close
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    option forwardfor
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    retries                 3
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    timeout http-request    30s
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    timeout connect         30s
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    timeout client          32s
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    timeout server          32s
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    timeout http-keep-alive 30s
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: listen listener
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    bind 169.254.169.254:80
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]:    http-request add-header X-OVN-Network-ID 23fdc065-0e6b-415e-9e71-7aa724ab9c52
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 13:10:07 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:07.634 103739 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'env', 'PROCESS_TAG=haproxy-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/23fdc065-0e6b-415e-9e71-7aa724ab9c52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.829 2 DEBUG nova.compute.manager [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.829 2 DEBUG oslo_concurrency.lockutils [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.830 2 DEBUG oslo_concurrency.lockutils [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.830 2 DEBUG oslo_concurrency.lockutils [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.830 2 DEBUG nova.compute.manager [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:10:07 np0005476733 nova_compute[192580]: 2025-10-08 17:10:07.831 2 WARNING nova.compute.manager [req-344a48aa-30e5-41c8-a8a4-8862edc30716 req-2f7a1e3e-0308-4794-aeae-914a7754647b 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received unexpected event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e for instance with vm_state active and task_state None.#033[00m
Oct  8 13:10:07 np0005476733 podman[281780]: 2025-10-08 17:10:07.980054722 +0000 UTC m=+0.043866342 container create 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 13:10:08 np0005476733 systemd[1]: Started libpod-conmon-5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd.scope.
Oct  8 13:10:08 np0005476733 podman[281780]: 2025-10-08 17:10:07.956046306 +0000 UTC m=+0.019857936 image pull d271ca3874fbf350f8fe76e1765c145131b8ba1ebb45804192d140db9c1d861e 38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297
Oct  8 13:10:08 np0005476733 systemd[1]: Started libcrun container.
Oct  8 13:10:08 np0005476733 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a491b2eac6024c986a9ac5de1b3df657043367e4e9574f6430a1cbbf727a343/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 13:10:08 np0005476733 podman[281780]: 2025-10-08 17:10:08.096580464 +0000 UTC m=+0.160392174 container init 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 13:10:08 np0005476733 podman[281780]: 2025-10-08 17:10:08.102193314 +0000 UTC m=+0.166004964 container start 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 13:10:08 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [NOTICE]   (281799) : New worker (281801) forked
Oct  8 13:10:08 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [NOTICE]   (281799) : Loading success.
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.620 2 DEBUG nova.network.neutron [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updated VIF entry in instance network info cache for port 6a2cde05-dbe2-4f08-94d1-646f8139b49e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.621 2 DEBUG nova.network.neutron [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": null, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.645 2 DEBUG oslo_concurrency.lockutils [req-256b9865-dae5-4d9f-b40d-8b4b822c1a90 req-d2454948-55be-4531-b87b-e94e549ad42f 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.960 2 DEBUG nova.compute.manager [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.960 2 DEBUG oslo_concurrency.lockutils [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.960 2 DEBUG oslo_concurrency.lockutils [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.961 2 DEBUG oslo_concurrency.lockutils [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.961 2 DEBUG nova.compute.manager [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:10:09 np0005476733 nova_compute[192580]: 2025-10-08 17:10:09.961 2 WARNING nova.compute.manager [req-ef3950c5-707d-4cca-9338-58b795a8c325 req-14cfdab4-12ca-4009-8b02-6cf181e4cf97 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received unexpected event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e for instance with vm_state active and task_state None.#033[00m
Oct  8 13:10:10 np0005476733 nova_compute[192580]: 2025-10-08 17:10:10.601 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:11 np0005476733 nova_compute[192580]: 2025-10-08 17:10:11.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.455 2 DEBUG nova.compute.manager [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-changed-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.455 2 DEBUG nova.compute.manager [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing instance network info cache due to event network-changed-6a2cde05-dbe2-4f08-94d1-646f8139b49e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.456 2 DEBUG oslo_concurrency.lockutils [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.456 2 DEBUG oslo_concurrency.lockutils [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.456 2 DEBUG nova.network.neutron [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Refreshing network info cache for port 6a2cde05-dbe2-4f08-94d1-646f8139b49e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:12 np0005476733 nova_compute[192580]: 2025-10-08 17:10:12.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:10:16 np0005476733 podman[281812]: 2025-10-08 17:10:16.23127038 +0000 UTC m=+0.054451051 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:10:16 np0005476733 podman[281811]: 2025-10-08 17:10:16.235019149 +0000 UTC m=+0.058972874 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:10:16 np0005476733 podman[281813]: 2025-10-08 17:10:16.265193004 +0000 UTC m=+0.085184893 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, vcs-type=git)
Oct  8 13:10:16 np0005476733 nova_compute[192580]: 2025-10-08 17:10:16.357 2 DEBUG nova.network.neutron [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updated VIF entry in instance network info cache for port 6a2cde05-dbe2-4f08-94d1-646f8139b49e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 13:10:16 np0005476733 nova_compute[192580]: 2025-10-08 17:10:16.358 2 DEBUG nova.network.neutron [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": null, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:10:16 np0005476733 nova_compute[192580]: 2025-10-08 17:10:16.389 2 DEBUG oslo_concurrency.lockutils [req-19faa51a-a71c-473b-bd8d-df29a80fe387 req-58c9ac27-01eb-47be-87ed-9c9703e22c7d 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:10:16 np0005476733 nova_compute[192580]: 2025-10-08 17:10:16.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:17 np0005476733 nova_compute[192580]: 2025-10-08 17:10:17.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:18 np0005476733 nova_compute[192580]: 2025-10-08 17:10:18.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:21 np0005476733 nova_compute[192580]: 2025-10-08 17:10:21.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:22 np0005476733 podman[281873]: 2025-10-08 17:10:22.217639819 +0000 UTC m=+0.045279157 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid)
Oct  8 13:10:22 np0005476733 podman[281874]: 2025-10-08 17:10:22.231198352 +0000 UTC m=+0.053812630 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:10:22 np0005476733 nova_compute[192580]: 2025-10-08 17:10:22.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:22 np0005476733 nova_compute[192580]: 2025-10-08 17:10:22.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:26.448 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:10:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:10:26.451 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:26 np0005476733 nova_compute[192580]: 2025-10-08 17:10:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:26 np0005476733 nova_compute[192580]: 2025-10-08 17:10:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:10:26 np0005476733 nova_compute[192580]: 2025-10-08 17:10:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:10:26 np0005476733 nova_compute[192580]: 2025-10-08 17:10:26.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:27 np0005476733 nova_compute[192580]: 2025-10-08 17:10:27.230 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 13:10:27 np0005476733 nova_compute[192580]: 2025-10-08 17:10:27.231 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquired lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 13:10:27 np0005476733 nova_compute[192580]: 2025-10-08 17:10:27.231 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 13:10:27 np0005476733 nova_compute[192580]: 2025-10-08 17:10:27.231 2 DEBUG nova.objects.instance [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:10:27 np0005476733 nova_compute[192580]: 2025-10-08 17:10:27.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:30 np0005476733 nova_compute[192580]: 2025-10-08 17:10:30.038 2 DEBUG nova.network.neutron [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [{"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:10:30 np0005476733 nova_compute[192580]: 2025-10-08 17:10:30.079 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Releasing lock "refresh_cache-d2559ab1-2621-46b4-8e3c-568cbff22376" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 13:10:30 np0005476733 nova_compute[192580]: 2025-10-08 17:10:30.079 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 13:10:30 np0005476733 nova_compute[192580]: 2025-10-08 17:10:30.080 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:31 np0005476733 nova_compute[192580]: 2025-10-08 17:10:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:31 np0005476733 nova_compute[192580]: 2025-10-08 17:10:31.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:32 np0005476733 podman[281915]: 2025-10-08 17:10:32.226239996 +0000 UTC m=+0.054586223 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:10:32 np0005476733 nova_compute[192580]: 2025-10-08 17:10:32.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:35 np0005476733 podman[281934]: 2025-10-08 17:10:35.266526441 +0000 UTC m=+0.100745869 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.084 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000074', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'user_id': '96fd231581074677b87116ed9773b06d', 'hostId': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.119 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.requests volume: 11695 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.120 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.requests volume: 103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bd55b9d-ca9d-457e-ac6e-57dab3d2ad2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 11695, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.085326', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4da17c2-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '72307ea743ca52115d60e4d3d49fd726fe4e70803c619abd43483836e008a38e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 103, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.085326', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4da3090-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '460c5951f806beb7ebfcafd6ca88f7039030985f307993fae52e5600eea987b9'}]}, 'timestamp': '2025-10-08 17:10:36.120726', '_unique_id': '406d7605d188413e94084286a351128a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.123 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.131 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d2559ab1-2621-46b4-8e3c-568cbff22376 / tapa232f057-6f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.132 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d2559ab1-2621-46b4-8e3c-568cbff22376 / tap6a2cde05-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.133 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.133 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04b0346f-2bb8-4970-8e33-a8b77928c5aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.124799', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4dc33d6-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'ec1267e74514a77b5c973307ceff6eb5c0c60d2c738e5f54fb4bbdc68762cfbb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.124799', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4dc4c36-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'cd3542f78897898fcaccabf66c791d49d40bfa07aaac86cd823d3d2972303531'}]}, 'timestamp': '2025-10-08 17:10:36.134596', '_unique_id': '18803a3d5667405399d65249580a9d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.136 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.155 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.usage volume: 153223168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.156 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11c16e14-55e2-4bbe-b631-352bb293e8b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 153223168, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.138923', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4dfaf2a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': '0146b6724a8a839281cbfe590a32b05ffee53c7d7c1a9a8332f872e090d2ceec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 
'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.138923', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4dfcb40-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': 'd38abaee30a67759874d3e25750692acc54897bbdce9a10ccdb0871f94ec3f93'}]}, 'timestamp': '2025-10-08 17:10:36.157460', '_unique_id': '6d781c4dc2e4429685264b7d1d4c667d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.159 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.162 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.162 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>]
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.163 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets volume: 549 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.163 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets volume: 98 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03454954-1773-460d-9bb6-a42b19ec53f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 549, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.163230', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e0c46e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '05810ef946b283d36976f769892ff099465fb61b51c35aa3bc191259be6f7faf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 98, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.163230', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e0dbb6-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '2a879f622cf426dc19102e240cd47d5e3b18f2064a17119bb1dd4c803872f88e'}]}, 'timestamp': '2025-10-08 17:10:36.164421', '_unique_id': 'e65bd05aefd0424e96df22af121e0ecb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.165 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.168 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.168 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '980ef5de-b726-4d73-ac8b-e2392fb290bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.167978', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e17e9a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'f4712763baab176120166bb12f382e62e77675372f365eb7a62f8a12937e071f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.167978', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e193f8-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'aa2515941149a4a032f20ea459e5862d49902a580d58c55dc521e2aa8c90e4c7'}]}, 'timestamp': '2025-10-08 17:10:36.169169', '_unique_id': '7ea2148f9bee4e91a30595f48111085e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.170 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.172 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.172 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d5c4edd-ca02-4c07-9c3d-9b4723a712ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.172417', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e22a52-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'e8971f7095e3b38261e8c459554dd3e8807facfdc40dd3c9e40ef846c7b86374'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.172417', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e240f0-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'b7fd8919cbf07c1736c558a83d9812541669c490392dcc9c906b041cc1fd838e'}]}, 'timestamp': '2025-10-08 17:10:36.173582', '_unique_id': '684dfe62fff44ed2b317fab5d5348d1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.174 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.176 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.allocation volume: 154144768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.177 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9552350-e6f4-42fc-841e-8ad0a334baee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 154144768, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.176808', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4e2d808-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': '9e3433ae90ecbd1eeba631a48624aa7c7c18281b3782bc81285fda8b3329439e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.176808', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4e2f02c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': '40612824ad55752af743ae88095d33c39af893959605976b448ab434dc5fca02'}]}, 'timestamp': '2025-10-08 17:10:36.178031', '_unique_id': 'de874d21e6c04d518a77eb4e1e2b0f4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.179 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.181 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.capacity volume: 10737418240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.181 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ee6b494-e1e9-41a0-b4b0-eeddd0030067', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 10737418240, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.181260', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4e384ba-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': '9dc1bf8a6f87357f4da8d526f0903d41f59bea3cb9f83208d5c31facf1f02f1f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 
'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.181260', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4e39a04-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.86197776, 'message_signature': 'eef0b7e554266f0cf8568ff372f571c57c15722354f2771f51fdb553356fe7ac'}]}, 'timestamp': '2025-10-08 17:10:36.182378', '_unique_id': '5f48d75c13ee4359a486b6ea85148cd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.183 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.185 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets volume: 3454 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.186 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.packets volume: 97 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5ca9812-552f-4af7-b2a2-1832585264e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3454, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.185589', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e42c3a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'ed49252f125cbff8a713a2ae94b68e69570c092b8a05fc13b9ad481e1f015d60'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 97, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.185589', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e44184-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '154e8d57508f0d92660aadccb57a59cefe42f3a42ff965d0aa2d45d986cc4c71'}]}, 'timestamp': '2025-10-08 17:10:36.186686', '_unique_id': '13e2e6d77e87456abda7ddbd1e5f49bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.188 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.189 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.requests volume: 798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.190 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71dcbbc5-513f-4e53-b7cf-8e1fcb4f4124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 798, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.189615', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4e4c730-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '1e0515dde47384b78122ba9bf7f9ee77f28449bfadb783771860d5f293c8b32b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.189615', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4e4da68-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': 'f0297d6c554d5f9f3a97539df5c26b4850717ae54d1437ce1ffa388358d313e2'}]}, 'timestamp': '2025-10-08 17:10:36.190610', '_unique_id': '4c992a2b77494f52b897a9f8b9182392'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.191 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.192 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.213 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/memory.usage volume: 260.12109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6448ee36-b66f-459d-802a-5d1bb32ee142', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 260.12109375, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'timestamp': '2025-10-08T17:10:36.193025', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10}, 'message_id': 'b4e87cae-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.93650441, 'message_signature': '9050d026d70aec13bf960a48c7cf86332ac63b559c72b29e91ab4176b43ea050'}]}, 'timestamp': '2025-10-08 17:10:36.214504', '_unique_id': 'd84f5160674644a7acf5132e98ee8abc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.216 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.218 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.218 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ac00fbf-bb77-402a-95b9-60ab7c43986c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.218059', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e9214a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '89ba357be4f2b0db2f52aa8eba6bd176b4c2e46db3a9e887387a6bf65e401fd0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.218059', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e935ea-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '3844df83cc90298dc11e3e0c7cd88a532eb3f29aaa89b8ff2b42258e5c1234d0'}]}, 'timestamp': '2025-10-08 17:10:36.219190', '_unique_id': 'be797ea77f194821961bd40528250f5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.220 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.221 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.222 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc413f9b-ca41-496c-9a2c-4c0390ac8d4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.221795', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4e9b04c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'd297cc55594763fe8aab3571c7681aa7e513db2e052a441d9079c0c52401e01d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.221795', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4e9c60e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '48a54cda08ebb35a3f62c5f5c8bb0c60ef9e7772c4f99ca7b42a658b598712a2'}]}, 'timestamp': '2025-10-08 17:10:36.222839', '_unique_id': 'd3d404f8385f4aefa8f555526ff83cac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.223 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.225 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.bytes volume: 136721408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.225 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc8efa42-b177-440d-a7b9-623b27733927', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 136721408, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.225310', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4ea39cc-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '70422babe368586e7afb962e26daab74a155b5a080cf263554fe78f99e51d524'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.225310', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4ea4f2a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '75b0e9b7bc943b60f28d9c24174d65f9eb5a8fc770aafe08d506047a8001443e'}]}, 'timestamp': '2025-10-08 17:10:36.226348', '_unique_id': '8ba4ec5d49c045d39d81d14545a192cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.227 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.229 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.229 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>]
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.229 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.bytes volume: 62552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.230 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.outgoing.bytes volume: 25639 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b6c7221-c59c-43b6-aa19-0c57310a773c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 62552, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.229933', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4eaf0ce-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '2f37b54ae84231f07b276103e5b6cb93aa648cef0a3ce1f9d651775bfb5b27b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 25639, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.229933', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4eb06f4-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'f9c08502a2d7eab09396622e69eb18a2f79d5c742f204491ed9feb4821eb7a26'}]}, 'timestamp': '2025-10-08 17:10:36.231134', '_unique_id': '142103f6d03f4fed998561b93bb0fec3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.232 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.233 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.bytes volume: 331109888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.234 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.bytes volume: 348408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079f3b58-6674-4fc3-847e-042b311a49df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 331109888, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.233739', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4eb823c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '9b0e21c7594fb6819611137dc5feb5e946cb5ffea47fc8de8047dc5bdaeeec8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 348408, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.233739', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4eb9538-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '883745fb5bf021d6dd88df33d293e2900010ba35ac7b972a02a562480eea0f76'}]}, 'timestamp': '2025-10-08 17:10:36.234679', '_unique_id': '04f8e28504b8433ead4fb0c753e22774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.235 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.237 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.237 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>]
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.237 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.238 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61eba446-17ee-47cd-b9fa-89c113b7b677', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.237753', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4ec1f94-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '71c487ca19273827786f1761cca76c1850cfec5c731e1b2c5c4768c6907c767a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.237753', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4ec33d0-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '3edfbfc08381a0b6e5c0b73fa2b3d4759b0078934021f8645fc8a610f98bfd30'}]}, 'timestamp': '2025-10-08 17:10:36.238755', '_unique_id': '21c87b1d64c8463cb9b7237d36610ce3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.239 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.241 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.241 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/cpu volume: 40060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ad327f1-ad65-43b4-a63e-92ca6ade2bdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40060000000, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'timestamp': '2025-10-08T17:10:36.241240', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'cpu_number': 1}, 'message_id': 'b4eca73e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.93650441, 'message_signature': '00ec9332b0900e9ab8fcad7feebde9269d489f3136f9fb39472620f8199efac1'}]}, 'timestamp': '2025-10-08 17:10:36.241705', '_unique_id': '4a1e942585164ca5b6f56c21817d7559'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.242 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.244 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.244 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-test_dscp_bwlimit_external_network-1818989422>]
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.244 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.latency volume: 6250688128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.245 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.read.latency volume: 78388436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c812b013-8635-4ee2-b189-e8b81f2e097b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6250688128, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.244691', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4ed2e3e-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '9d4cd3ecde767ddcb60986c63367e77d43d8cec58ba4428b19a288feadc1c02f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 78388436, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.244691', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4ed443c-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': 'b42acba8c92f9b87bec29b1b7554e632c0ed17df6344ee5cd1076bb8bbe91788'}]}, 'timestamp': '2025-10-08 17:10:36.245724', '_unique_id': '02ff32b43418482c8a552c0e2b4b3138'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.246 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.248 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.bytes volume: 15326692 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.248 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/network.incoming.bytes volume: 16965 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '657c552f-814a-46b5-abaa-247fa732508a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15326692, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tapa232f057-6f', 'timestamp': '2025-10-08T17:10:36.248204', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tapa232f057-6f', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:10:96:59', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa232f057-6f'}, 'message_id': 'b4edb84a-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': '48e0862568884a4352a9bcff1881764e228cb53c71e60308fd429f7e23abd99e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 
'B', 'counter_volume': 16965, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'instance-00000074-d2559ab1-2621-46b4-8e3c-568cbff22376-tap6a2cde05-db', 'timestamp': '2025-10-08T17:10:36.248204', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'tap6a2cde05-db', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'mac': 'fa:16:3e:0b:48:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a2cde05-db'}, 'message_id': 'b4edc9fc-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.847852559, 'message_signature': 'd225abf218d290aea6dbd3f8cde0f580988c3f9dbd512dfd040952c6040e60ae'}]}, 'timestamp': '2025-10-08 17:10:36.249180', '_unique_id': 'fbf76d6457094f5a8fdd08e43658e652'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.250 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.251 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.latency volume: 6765120792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.252 12 DEBUG ceilometer.compute.pollsters [-] d2559ab1-2621-46b4-8e3c-568cbff22376/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b779c6a-6f6d-438f-91b3-10d3df729c5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6765120792, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-vda', 'timestamp': '2025-10-08T17:10:36.251751', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'vda'}, 'message_id': 'b4ee4350-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': 'c33b26e65be0131907de75f87c12dd8b8e6deade36db1b579aca36852d58dff3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '96fd231581074677b87116ed9773b06d', 'user_name': None, 'project_id': 
'c64130dbc8f84b8db2db002a5e499a8b', 'project_name': None, 'resource_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376-sda', 'timestamp': '2025-10-08T17:10:36.251751', 'resource_metadata': {'display_name': 'tempest-test_dscp_bwlimit_external_network-1818989422', 'name': 'instance-00000074', 'instance_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'instance_type': 'custom_neutron_guest', 'host': '312354ca12ceb4cf6aa1b6b0765ee83f2a802c12cc9d2c1c16912ecd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '22222222-2222-2222-2222-222222222222', 'name': 'custom_neutron_guest', 'vcpus': 1, 'ram': 1024, 'disk': 10, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '11111111-1111-1111-1111-111111111111'}, 'image_ref': '11111111-1111-1111-1111-111111111111', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1024, 'disk_gb': 10, 'ephemeral_gb': 0, 'root_gb': 10, 'disk_name': 'sda'}, 'message_id': 'b4ee55ca-a469-11f0-9274-fa163ef67048', 'monotonic_time': 10339.808319416, 'message_signature': '7cf065ccacf82900e7e4bea9cf65ffaa22eec1f9f9d07dfefe4655769e2e316b'}]}, 'timestamp': '2025-10-08 17:10:36.252711', '_unique_id': '4e5ba34cb2a24e15b21b4ff36d17486d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 13:10:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:10:36.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.615 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.616 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.697 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.761 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.762 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.817 2 DEBUG oslo_concurrency.processutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.984 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.986 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=12784MB free_disk=111.15467834472656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.986 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:10:36 np0005476733 nova_compute[192580]: 2025-10-08 17:10:36.987 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.082 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Instance d2559ab1-2621-46b4-8e3c-568cbff22376 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 10, 'MEMORY_MB': 1024, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.084 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.084 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=1536MB phys_disk=119GB used_disk=10GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.140 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.163 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.166 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.166 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:10:37 np0005476733 nova_compute[192580]: 2025-10-08 17:10:37.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:38 np0005476733 podman[281966]: 2025-10-08 17:10:38.242222223 +0000 UTC m=+0.065707610 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  8 13:10:41 np0005476733 ovn_controller[263831]: 2025-10-08T17:10:41Z|00277|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  8 13:10:41 np0005476733 nova_compute[192580]: 2025-10-08 17:10:41.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:42 np0005476733 nova_compute[192580]: 2025-10-08 17:10:42.167 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:10:42 np0005476733 nova_compute[192580]: 2025-10-08 17:10:42.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:46 np0005476733 nova_compute[192580]: 2025-10-08 17:10:46.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:47 np0005476733 podman[281987]: 2025-10-08 17:10:47.240460748 +0000 UTC m=+0.059908185 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350)
Oct  8 13:10:47 np0005476733 nova_compute[192580]: 2025-10-08 17:10:47.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:47 np0005476733 podman[281985]: 2025-10-08 17:10:47.25835316 +0000 UTC m=+0.075709890 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:10:47 np0005476733 podman[281986]: 2025-10-08 17:10:47.273144913 +0000 UTC m=+0.091853746 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:10:51 np0005476733 nova_compute[192580]: 2025-10-08 17:10:51.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:52 np0005476733 nova_compute[192580]: 2025-10-08 17:10:52.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:53 np0005476733 podman[282049]: 2025-10-08 17:10:53.237856276 +0000 UTC m=+0.062121126 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 13:10:53 np0005476733 podman[282050]: 2025-10-08 17:10:53.240093697 +0000 UTC m=+0.053091297 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:10:56 np0005476733 nova_compute[192580]: 2025-10-08 17:10:56.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:10:56 np0005476733 podman[206798]: time="2025-10-08T17:10:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 13:10:56 np0005476733 podman[206798]: @ - - [08/Oct/2025:17:10:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 28298 "" "Go-http-client/1.1"
Oct  8 13:10:57 np0005476733 nova_compute[192580]: 2025-10-08 17:10:57.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:01 np0005476733 nova_compute[192580]: 2025-10-08 17:11:01.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:02 np0005476733 nova_compute[192580]: 2025-10-08 17:11:02.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:03 np0005476733 podman[282090]: 2025-10-08 17:11:03.229196811 +0000 UTC m=+0.058280173 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 13:11:06 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:06Z|00278|pinctrl|WARN|Dropped 305 log messages in last 75 seconds (most recently, 19 seconds ago) due to excessive rate
Oct  8 13:11:06 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:06Z|00279|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:11:06 np0005476733 podman[282110]: 2025-10-08 17:11:06.322985203 +0000 UTC m=+0.134606902 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:11:06 np0005476733 nova_compute[192580]: 2025-10-08 17:11:06.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:07 np0005476733 nova_compute[192580]: 2025-10-08 17:11:07.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:09 np0005476733 podman[282137]: 2025-10-08 17:11:09.221729294 +0000 UTC m=+0.053457209 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:11:10 np0005476733 nova_compute[192580]: 2025-10-08 17:11:10.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:11:11 np0005476733 nova_compute[192580]: 2025-10-08 17:11:11.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:12 np0005476733 nova_compute[192580]: 2025-10-08 17:11:12.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:14 np0005476733 nova_compute[192580]: 2025-10-08 17:11:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:11:14 np0005476733 nova_compute[192580]: 2025-10-08 17:11:14.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:11:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:15.025 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=101, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=100) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:11:15 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:15.025 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:11:15 np0005476733 nova_compute[192580]: 2025-10-08 17:11:15.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:16 np0005476733 nova_compute[192580]: 2025-10-08 17:11:16.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:17 np0005476733 nova_compute[192580]: 2025-10-08 17:11:17.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.181 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.181 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.181 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.182 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.182 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.183 2 INFO nova.compute.manager [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Terminating instance#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.184 2 DEBUG nova.compute.manager [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 13:11:18 np0005476733 kernel: tapa232f057-6f (unregistering): left promiscuous mode
Oct  8 13:11:18 np0005476733 NetworkManager[51699]: <info>  [1759943478.2145] device (tapa232f057-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00280|binding|INFO|Releasing lport a232f057-6f16-4161-bffc-b743b97e5d1f from this chassis (sb_readonly=0)
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00281|binding|INFO|Setting lport a232f057-6f16-4161-bffc-b743b97e5d1f down in Southbound
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00282|binding|INFO|Removing iface tapa232f057-6f ovn-installed in OVS
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 kernel: tap6a2cde05-db (unregistering): left promiscuous mode
Oct  8 13:11:18 np0005476733 NetworkManager[51699]: <info>  [1759943478.2518] device (tap6a2cde05-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00283|binding|INFO|Releasing lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e from this chassis (sb_readonly=1)
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00284|binding|INFO|Removing iface tap6a2cde05-db ovn-installed in OVS
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00285|if_status|INFO|Not setting lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e down as sb is readonly
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 podman[282158]: 2025-10-08 17:11:18.267028279 +0000 UTC m=+0.091639329 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  8 13:11:18 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:18Z|00286|binding|INFO|Setting lport 6a2cde05-dbe2-4f08-94d1-646f8139b49e down in Southbound
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.270 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:96:59 192.168.122.237'], port_security=['fa:16:3e:10:96:59 192.168.122.237'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.237/24', 'neutron:device_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3cc7e400-7bea-4a8c-b1db-0f3cb058bcbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5b64086-e7d8-42ad-b439-67cb79e13d7c, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=a232f057-6f16-4161-bffc-b743b97e5d1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.272 103739 INFO neutron.agent.ovn.metadata.agent [-] Port a232f057-6f16-4161-bffc-b743b97e5d1f in datapath 81c575b5-ac88-40d3-8b00-79c5c936eec4 unbound from our chassis#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.273 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81c575b5-ac88-40d3-8b00-79c5c936eec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.275 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5591d-27ec-4f93-a182-e635d98f7557]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.278 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 namespace which is not needed anymore#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 podman[282160]: 2025-10-08 17:11:18.283887028 +0000 UTC m=+0.096189285 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6)
Oct  8 13:11:18 np0005476733 podman[282159]: 2025-10-08 17:11:18.291243043 +0000 UTC m=+0.114629484 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:11:18 np0005476733 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000074.scope: Deactivated successfully.
Oct  8 13:11:18 np0005476733 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000074.scope: Consumed 46.592s CPU time.
Oct  8 13:11:18 np0005476733 systemd-machined[152624]: Machine qemu-70-instance-00000074 terminated.
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.307 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:48:8d 10.100.0.8'], port_security=['fa:16:3e:0b:48:8d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd2559ab1-2621-46b4-8e3c-568cbff22376', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c64130dbc8f84b8db2db002a5e499a8b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2213252c-94f9-4b54-86f1-ccaa6f15fcb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa5e840d-fbd9-4cb2-9d13-461aaf3b8865, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>], logical_port=6a2cde05-dbe2-4f08-94d1-646f8139b49e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f029f4068b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [NOTICE]   (281259) : haproxy version is 2.8.14-c23fe91
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [NOTICE]   (281259) : path to executable is /usr/sbin/haproxy
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [WARNING]  (281259) : Exiting Master process...
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [ALERT]    (281259) : Current worker (281261) exited with code 143 (Terminated)
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4[281255]: [WARNING]  (281259) : All workers exited. Exiting... (0)
Oct  8 13:11:18 np0005476733 systemd[1]: libpod-9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe.scope: Deactivated successfully.
Oct  8 13:11:18 np0005476733 NetworkManager[51699]: <info>  [1759943478.4056] manager: (tapa232f057-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Oct  8 13:11:18 np0005476733 podman[282252]: 2025-10-08 17:11:18.411199116 +0000 UTC m=+0.046130725 container died 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:11:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe-userdata-shm.mount: Deactivated successfully.
Oct  8 13:11:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-019c0282c33939bea2ee46f124818d2b211a453aab5aeaa348b8aa7524c91c29-merged.mount: Deactivated successfully.
Oct  8 13:11:18 np0005476733 podman[282252]: 2025-10-08 17:11:18.455009886 +0000 UTC m=+0.089941515 container cleanup 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:11:18 np0005476733 systemd[1]: libpod-conmon-9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe.scope: Deactivated successfully.
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.466 2 INFO nova.virt.libvirt.driver [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Instance destroyed successfully.#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.467 2 DEBUG nova.objects.instance [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lazy-loading 'resources' on Instance uuid d2559ab1-2621-46b4-8e3c-568cbff22376 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.487 2 DEBUG nova.virt.libvirt.vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=<?>,launch_index=0,launched_at=2025-10-08T17:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T17:09:01Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": 
"192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.488 2 DEBUG nova.network.os_vif_util [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "a232f057-6f16-4161-bffc-b743b97e5d1f", "address": "fa:16:3e:10:96:59", "network": {"id": "81c575b5-ac88-40d3-8b00-79c5c936eec4", "bridge": "br-int", "label": "public", "subnets": [{"cidr": "192.168.122.0/24", "dns": [], "gateway": {"address": "192.168.122.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.122.237", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "daecd871adaa4a7ba72129f7b1a03cd9", "mtu": 1400, "physical_network": "datacentre", "tunneled": false}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa232f057-6f", "ovs_interfaceid": "a232f057-6f16-4161-bffc-b743b97e5d1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.489 2 DEBUG nova.network.os_vif_util [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.489 2 DEBUG os_vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.491 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa232f057-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.498 2 INFO os_vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:96:59,bridge_name='br-int',has_traffic_filtering=True,id=a232f057-6f16-4161-bffc-b743b97e5d1f,network=Network(81c575b5-ac88-40d3-8b00-79c5c936eec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa232f057-6f')#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.498 2 DEBUG nova.virt.libvirt.vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T17:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-test_dscp_bwlimit_external_network-1818989422',display_name='tempest-test_dscp_bwlimit_external_network-1818989422',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-test-dscp-bwlimit-external-network-1818989422',id=116,image_ref='11111111-1111-1111-1111-111111111111',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJR/7xp4fhkcZ9yCLipsodZ2B1hsuYH3iQyfkCZCDl7mvXgidE/NHKeXMZyKMiWHW8BEIKPoo7O5FsRdZDjBlPOtIfenzRDdK1bjfcVb4Kuc8gL9en8DLsiKTPLRbj8LaA==',key_name='tempest-keypair-test-1556105078',keypairs=<?>,launch_index=0,launched_at=2025-10-08T17:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1024,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c64130dbc8f84b8db2db002a5e499a8b',ramdisk_id='',reservation_id='r-u0hmfh6i',resources=None,root_device_name='/dev/vda',root_gb=10,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='11111111-1111-1111-1111-111111111111',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='10',image_min_ram='0',image_os_glance_failed_import='',image_os_glance_importing_to_stores='',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/custom_neutron_guest',image_owner_specified.openstack.sha256='',owner_project_name='tempest-QosTestExternalNetwork-1440675542',owner_user_name='tempest-QosTestExternalNetwork-1440675542-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T17:09:01Z,user_data=None,user_id='96fd231581074677b87116ed9773b06d',uuid=d2559ab1-2621-46b4-8e3c-568cbff22376,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": 
"tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.499 2 DEBUG nova.network.os_vif_util [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converting VIF {"id": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "address": "fa:16:3e:0b:48:8d", "network": {"id": "23fdc065-0e6b-415e-9e71-7aa724ab9c52", "bridge": "br-int", "label": "tempest-tenant-ctl-network-670284253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c64130dbc8f84b8db2db002a5e499a8b", "mtu": 1342, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a2cde05-db", "ovs_interfaceid": "6a2cde05-dbe2-4f08-94d1-646f8139b49e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.499 2 DEBUG nova.network.os_vif_util [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.499 2 DEBUG os_vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.500 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a2cde05-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.504 2 INFO os_vif [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:48:8d,bridge_name='br-int',has_traffic_filtering=True,id=6a2cde05-dbe2-4f08-94d1-646f8139b49e,network=Network(23fdc065-0e6b-415e-9e71-7aa724ab9c52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap6a2cde05-db')#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.504 2 INFO nova.virt.libvirt.driver [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Deleting instance files /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376_del#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.505 2 INFO nova.virt.libvirt.driver [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Deletion of /var/lib/nova/instances/d2559ab1-2621-46b4-8e3c-568cbff22376_del complete#033[00m
Oct  8 13:11:18 np0005476733 podman[282309]: 2025-10-08 17:11:18.51584447 +0000 UTC m=+0.038380727 container remove 9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.521 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[51e7230e-6b74-4a06-bc51-9a2bd5a6ee04]: (4, ('Wed Oct  8 05:11:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe)\n9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe\nWed Oct  8 05:11:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 (9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe)\n9578c8d6c7e46f8f256964e96e92c2e6f31fe20585d8b1a80b9549b28504d4fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.522 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[22f8fa2b-f017-479a-a540-4f4beab59f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.523 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81c575b5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 kernel: tap81c575b5-a0: left promiscuous mode
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.539 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[7607b1b0-06ea-4b7b-8cf4-359c86ad1c06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.573 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[75c3d9b5-a647-4ef8-90ee-18c79945d939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.575 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[acc28908-f742-49b0-8cec-6181632c101e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.576 2 INFO nova.compute.manager [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.577 2 DEBUG oslo.service.loopingcall [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.577 2 DEBUG nova.compute.manager [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.577 2 DEBUG nova.network.neutron [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.596 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[636ca0c2-2325-4ded-b474-ff99add4c44b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1024303, 'reachable_time': 32743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282326, 'error': None, 'target': 'ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.598 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81c575b5-ac88-40d3-8b00-79c5c936eec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.599 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[ac968fc7-a4e3-4397-9292-dbc9e445fc66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.600 103739 INFO neutron.agent.ovn.metadata.agent [-] Port 6a2cde05-dbe2-4f08-94d1-646f8139b49e in datapath 23fdc065-0e6b-415e-9e71-7aa724ab9c52 unbound from our chassis#033[00m
Oct  8 13:11:18 np0005476733 systemd[1]: run-netns-ovnmeta\x2d81c575b5\x2dac88\x2d40d3\x2d8b00\x2d79c5c936eec4.mount: Deactivated successfully.
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.601 103739 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23fdc065-0e6b-415e-9e71-7aa724ab9c52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.602 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[003038dd-e478-497e-bf3a-7ef61c9d3a0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.602 103739 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52 namespace which is not needed anymore#033[00m
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [NOTICE]   (281799) : haproxy version is 2.8.14-c23fe91
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [NOTICE]   (281799) : path to executable is /usr/sbin/haproxy
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [WARNING]  (281799) : Exiting Master process...
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [WARNING]  (281799) : Exiting Master process...
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [ALERT]    (281799) : Current worker (281801) exited with code 143 (Terminated)
Oct  8 13:11:18 np0005476733 neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52[281795]: [WARNING]  (281799) : All workers exited. Exiting... (0)
Oct  8 13:11:18 np0005476733 systemd[1]: libpod-5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd.scope: Deactivated successfully.
Oct  8 13:11:18 np0005476733 podman[282345]: 2025-10-08 17:11:18.762031017 +0000 UTC m=+0.057666904 container died 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 13:11:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd-userdata-shm.mount: Deactivated successfully.
Oct  8 13:11:18 np0005476733 systemd[1]: var-lib-containers-storage-overlay-4a491b2eac6024c986a9ac5de1b3df657043367e4e9574f6430a1cbbf727a343-merged.mount: Deactivated successfully.
Oct  8 13:11:18 np0005476733 podman[282345]: 2025-10-08 17:11:18.797288163 +0000 UTC m=+0.092924010 container cleanup 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:11:18 np0005476733 systemd[1]: libpod-conmon-5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd.scope: Deactivated successfully.
Oct  8 13:11:18 np0005476733 podman[282376]: 2025-10-08 17:11:18.861692061 +0000 UTC m=+0.040425152 container remove 5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.866 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[eaeaa70b-a2b7-4694-a5e5-3a5a8119d8df]: (4, ('Wed Oct  8 05:11:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52 (5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd)\n5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd\nWed Oct  8 05:11:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52 (5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd)\n5a3ea151501a6a9c8f5613850fa8b3fe2f2e4e95a9aab85c91aacebef819a3dd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.868 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ef9f05-6057-4448-b4b3-6c241121a432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.868 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23fdc065-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 kernel: tap23fdc065-00: left promiscuous mode
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 nova_compute[192580]: 2025-10-08 17:11:18.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.890 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd90255-e55a-4af2-813e-fc1cf187546d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.916 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd62593-9405-4e01-984a-cb84de5f98a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.918 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[a5474f4c-4e45-43a1-ad1d-3f14c4a32d64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.942 221259 DEBUG oslo.privsep.daemon [-] privsep: reply[5c91fb55-c47b-401c-af84-55e5460fb4b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1031105, 'reachable_time': 20864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282391, 'error': None, 'target': 'ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.944 103878 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-23fdc065-0e6b-415e-9e71-7aa724ab9c52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 13:11:18 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:18.944 103878 DEBUG oslo.privsep.daemon [-] privsep: reply[caa6068d-92c3-45ed-8c79-6c960b7f67ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.336 2 DEBUG nova.compute.manager [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-unplugged-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.336 2 DEBUG oslo_concurrency.lockutils [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.336 2 DEBUG oslo_concurrency.lockutils [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.336 2 DEBUG oslo_concurrency.lockutils [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.336 2 DEBUG nova.compute.manager [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-unplugged-a232f057-6f16-4161-bffc-b743b97e5d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:11:19 np0005476733 nova_compute[192580]: 2025-10-08 17:11:19.337 2 DEBUG nova.compute.manager [req-9d0e4855-3e54-4721-b1f3-f31127088e5e req-ec959b2d-f0d0-43f2-8dc8-934d31b19ffe 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-unplugged-a232f057-6f16-4161-bffc-b743b97e5d1f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 13:11:19 np0005476733 systemd[1]: run-netns-ovnmeta\x2d23fdc065\x2d0e6b\x2d415e\x2d9e71\x2d7aa724ab9c52.mount: Deactivated successfully.
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.478 2 DEBUG nova.compute.manager [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.478 2 DEBUG oslo_concurrency.lockutils [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.478 2 DEBUG oslo_concurrency.lockutils [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.478 2 DEBUG oslo_concurrency.lockutils [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.478 2 DEBUG nova.compute.manager [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.479 2 WARNING nova.compute.manager [req-9fdae36c-ad09-459e-b710-c42a4bc3192a req-11bcfa3e-707b-46b6-bf0d-375ff6ea75ae 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received unexpected event network-vif-plugged-a232f057-6f16-4161-bffc-b743b97e5d1f for instance with vm_state active and task_state deleting.#033[00m
Oct  8 13:11:21 np0005476733 nova_compute[192580]: 2025-10-08 17:11:21.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:22 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:22.028 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '101'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.593 2 DEBUG nova.compute.manager [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-unplugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.594 2 DEBUG oslo_concurrency.lockutils [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.595 2 DEBUG oslo_concurrency.lockutils [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.595 2 DEBUG oslo_concurrency.lockutils [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.596 2 DEBUG nova.compute.manager [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-unplugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 13:11:23 np0005476733 nova_compute[192580]: 2025-10-08 17:11:23.596 2 DEBUG nova.compute.manager [req-4bc03cf1-4291-45a1-8b72-6201752d8ce0 req-ecbbfa2e-f061-4cd6-b4cb-c3ba1abeac85 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-unplugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 13:11:24 np0005476733 podman[282392]: 2025-10-08 17:11:24.255151481 +0000 UTC m=+0.071034141 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:11:24 np0005476733 podman[282393]: 2025-10-08 17:11:24.277285208 +0000 UTC m=+0.090936267 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.295 2 DEBUG nova.network.neutron [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.317 2 INFO nova.compute.manager [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Took 6.74 seconds to deallocate network for instance.#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.360 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.360 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.408 2 DEBUG nova.compute.manager [req-c58541a2-74a2-44d5-966e-2bb379f1999a req-5b4f414c-836a-4e6a-b86d-92d2b1109e32 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-deleted-a232f057-6f16-4161-bffc-b743b97e5d1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.415 2 DEBUG nova.compute.provider_tree [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.434 2 DEBUG nova.scheduler.client.report [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.455 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.501 2 INFO nova.scheduler.client.report [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Deleted allocations for instance d2559ab1-2621-46b4-8e3c-568cbff22376#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.558 2 DEBUG oslo_concurrency.lockutils [None req-293169dc-ecf5-4977-8595-786fb8c93c51 96fd231581074677b87116ed9773b06d c64130dbc8f84b8db2db002a5e499a8b - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.724 2 DEBUG nova.compute.manager [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.725 2 DEBUG oslo_concurrency.lockutils [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Acquiring lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.725 2 DEBUG oslo_concurrency.lockutils [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.725 2 DEBUG oslo_concurrency.lockutils [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] Lock "d2559ab1-2621-46b4-8e3c-568cbff22376-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.726 2 DEBUG nova.compute.manager [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] No waiting events found dispatching network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 13:11:25 np0005476733 nova_compute[192580]: 2025-10-08 17:11:25.726 2 WARNING nova.compute.manager [req-752f6620-c058-4fba-a47b-42db3ece077c req-f679fbe8-4160-43ab-8a37-7b20ba594775 6fcd5bff45f5455c9ba85fb260c70000 93e11fcdeb2445a49c8ef208f0f0eb3e - - default default] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Received unexpected event network-vif-plugged-6a2cde05-dbe2-4f08-94d1-646f8139b49e for instance with vm_state deleted and task_state None.
Oct  8 13:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:11:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:11:26 np0005476733 nova_compute[192580]: 2025-10-08 17:11:26.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:26 np0005476733 nova_compute[192580]: 2025-10-08 17:11:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  8 13:11:26 np0005476733 nova_compute[192580]: 2025-10-08 17:11:26.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  8 13:11:26 np0005476733 nova_compute[192580]: 2025-10-08 17:11:26.621 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  8 13:11:26 np0005476733 nova_compute[192580]: 2025-10-08 17:11:26.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:28 np0005476733 nova_compute[192580]: 2025-10-08 17:11:28.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:29 np0005476733 nova_compute[192580]: 2025-10-08 17:11:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:31 np0005476733 nova_compute[192580]: 2025-10-08 17:11:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:31 np0005476733 nova_compute[192580]: 2025-10-08 17:11:31.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:33 np0005476733 nova_compute[192580]: 2025-10-08 17:11:33.466 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759943478.4649727, d2559ab1-2621-46b4-8e3c-568cbff22376 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 13:11:33 np0005476733 nova_compute[192580]: 2025-10-08 17:11:33.467 2 INFO nova.compute.manager [-] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] VM Stopped (Lifecycle Event)
Oct  8 13:11:33 np0005476733 nova_compute[192580]: 2025-10-08 17:11:33.488 2 DEBUG nova.compute.manager [None req-39bb5c34-dd11-4252-b0c8-3b10e606793c - - - - - -] [instance: d2559ab1-2621-46b4-8e3c-568cbff22376] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 13:11:33 np0005476733 nova_compute[192580]: 2025-10-08 17:11:33.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:34 np0005476733 podman[282433]: 2025-10-08 17:11:34.231373922 +0000 UTC m=+0.059442649 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:11:34 np0005476733 nova_compute[192580]: 2025-10-08 17:11:34.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:34 np0005476733 nova_compute[192580]: 2025-10-08 17:11:34.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.623 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 13:11:36 np0005476733 podman[282454]: 2025-10-08 17:11:36.736845905 +0000 UTC m=+0.081777213 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.763 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.764 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13625MB free_disk=111.29887008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.764 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.764 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.900 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 13:11:36 np0005476733 nova_compute[192580]: 2025-10-08 17:11:36.901 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 13:11:37 np0005476733 nova_compute[192580]: 2025-10-08 17:11:37.001 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 13:11:37 np0005476733 nova_compute[192580]: 2025-10-08 17:11:37.018 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 13:11:37 np0005476733 nova_compute[192580]: 2025-10-08 17:11:37.045 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 13:11:37 np0005476733 nova_compute[192580]: 2025-10-08 17:11:37.046 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:11:38 np0005476733 nova_compute[192580]: 2025-10-08 17:11:38.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:40 np0005476733 podman[282481]: 2025-10-08 17:11:40.256360762 +0000 UTC m=+0.087966270 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 13:11:41 np0005476733 nova_compute[192580]: 2025-10-08 17:11:41.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:42 np0005476733 nova_compute[192580]: 2025-10-08 17:11:42.047 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:43 np0005476733 nova_compute[192580]: 2025-10-08 17:11:43.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:46 np0005476733 nova_compute[192580]: 2025-10-08 17:11:46.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:48 np0005476733 nova_compute[192580]: 2025-10-08 17:11:48.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:49 np0005476733 podman[282503]: 2025-10-08 17:11:49.244907713 +0000 UTC m=+0.058116518 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Oct  8 13:11:49 np0005476733 podman[282501]: 2025-10-08 17:11:49.246283276 +0000 UTC m=+0.061255787 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  8 13:11:49 np0005476733 podman[282502]: 2025-10-08 17:11:49.271166541 +0000 UTC m=+0.085902465 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:11:49 np0005476733 nova_compute[192580]: 2025-10-08 17:11:49.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:49 np0005476733 nova_compute[192580]: 2025-10-08 17:11:49.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  8 13:11:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:51Z|00287|pinctrl|WARN|Dropped 431 log messages in last 46 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 13:11:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:11:51Z|00288|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:11:51 np0005476733 nova_compute[192580]: 2025-10-08 17:11:51.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:53 np0005476733 nova_compute[192580]: 2025-10-08 17:11:53.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:55 np0005476733 podman[282562]: 2025-10-08 17:11:55.22689632 +0000 UTC m=+0.045707362 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:11:55 np0005476733 podman[282561]: 2025-10-08 17:11:55.228024425 +0000 UTC m=+0.054645026 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  8 13:11:56 np0005476733 nova_compute[192580]: 2025-10-08 17:11:56.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:57 np0005476733 nova_compute[192580]: 2025-10-08 17:11:57.073 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:11:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:57.235 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=102, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=101) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 13:11:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:57.235 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  8 13:11:57 np0005476733 nova_compute[192580]: 2025-10-08 17:11:57.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:57 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:11:57.237 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '102'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 13:11:58 np0005476733 nova_compute[192580]: 2025-10-08 17:11:58.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:11:58 np0005476733 nova_compute[192580]: 2025-10-08 17:11:58.582 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:12:01 np0005476733 nova_compute[192580]: 2025-10-08 17:12:01.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:12:03 np0005476733 nova_compute[192580]: 2025-10-08 17:12:03.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:12:03 np0005476733 nova_compute[192580]: 2025-10-08 17:12:03.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:12:05 np0005476733 podman[282601]: 2025-10-08 17:12:05.24222789 +0000 UTC m=+0.076179024 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:12:06 np0005476733 nova_compute[192580]: 2025-10-08 17:12:06.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:07 np0005476733 podman[282620]: 2025-10-08 17:12:07.266582809 +0000 UTC m=+0.094784829 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 13:12:07 np0005476733 systemd-logind[827]: New session 171 of user zuul.
Oct  8 13:12:07 np0005476733 systemd[1]: Started Session 171 of User zuul.
Oct  8 13:12:07 np0005476733 systemd[1]: session-171.scope: Deactivated successfully.
Oct  8 13:12:07 np0005476733 systemd-logind[827]: Session 171 logged out. Waiting for processes to exit.
Oct  8 13:12:07 np0005476733 systemd-logind[827]: Removed session 171.
Oct  8 13:12:08 np0005476733 nova_compute[192580]: 2025-10-08 17:12:08.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:09 np0005476733 ovn_controller[263831]: 2025-10-08T17:12:09Z|00289|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  8 13:12:10 np0005476733 nova_compute[192580]: 2025-10-08 17:12:10.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:11 np0005476733 podman[282674]: 2025-10-08 17:12:11.246170741 +0000 UTC m=+0.069908355 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 13:12:11 np0005476733 nova_compute[192580]: 2025-10-08 17:12:11.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:13 np0005476733 nova_compute[192580]: 2025-10-08 17:12:13.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:14 np0005476733 nova_compute[192580]: 2025-10-08 17:12:14.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:14 np0005476733 nova_compute[192580]: 2025-10-08 17:12:14.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:12:16 np0005476733 nova_compute[192580]: 2025-10-08 17:12:16.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:18 np0005476733 nova_compute[192580]: 2025-10-08 17:12:18.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:20 np0005476733 podman[282694]: 2025-10-08 17:12:20.235415912 +0000 UTC m=+0.064341768 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 13:12:20 np0005476733 podman[282696]: 2025-10-08 17:12:20.247487767 +0000 UTC m=+0.066057532 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 13:12:20 np0005476733 podman[282695]: 2025-10-08 17:12:20.269237532 +0000 UTC m=+0.090284406 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:12:20 np0005476733 nova_compute[192580]: 2025-10-08 17:12:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:21 np0005476733 nova_compute[192580]: 2025-10-08 17:12:21.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:23 np0005476733 nova_compute[192580]: 2025-10-08 17:12:23.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:24 np0005476733 nova_compute[192580]: 2025-10-08 17:12:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:26 np0005476733 podman[282757]: 2025-10-08 17:12:26.22434484 +0000 UTC m=+0.046137376 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:12:26 np0005476733 podman[282756]: 2025-10-08 17:12:26.230653601 +0000 UTC m=+0.056541477 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 13:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:26.449 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:12:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:26.450 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:12:26 np0005476733 nova_compute[192580]: 2025-10-08 17:12:26.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:26 np0005476733 nova_compute[192580]: 2025-10-08 17:12:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:12:26 np0005476733 nova_compute[192580]: 2025-10-08 17:12:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:12:26 np0005476733 nova_compute[192580]: 2025-10-08 17:12:26.669 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:12:26 np0005476733 nova_compute[192580]: 2025-10-08 17:12:26.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:28 np0005476733 nova_compute[192580]: 2025-10-08 17:12:28.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:29 np0005476733 systemd-logind[827]: New session 172 of user zuul.
Oct  8 13:12:29 np0005476733 systemd[1]: Started Session 172 of User zuul.
Oct  8 13:12:29 np0005476733 nova_compute[192580]: 2025-10-08 17:12:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:29 np0005476733 systemd[1]: session-172.scope: Deactivated successfully.
Oct  8 13:12:29 np0005476733 systemd-logind[827]: Session 172 logged out. Waiting for processes to exit.
Oct  8 13:12:29 np0005476733 systemd-logind[827]: Removed session 172.
Oct  8 13:12:30 np0005476733 nova_compute[192580]: 2025-10-08 17:12:30.741 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:31 np0005476733 nova_compute[192580]: 2025-10-08 17:12:31.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:33 np0005476733 nova_compute[192580]: 2025-10-08 17:12:33.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:33 np0005476733 nova_compute[192580]: 2025-10-08 17:12:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:12:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:12:36 np0005476733 podman[282825]: 2025-10-08 17:12:36.257006436 +0000 UTC m=+0.085483112 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:12:36 np0005476733 nova_compute[192580]: 2025-10-08 17:12:36.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:38 np0005476733 podman[282844]: 2025-10-08 17:12:38.259749114 +0000 UTC m=+0.090154262 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:38.644 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=103, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=102) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:38.645 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.651 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.787 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.788 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13614MB free_disk=111.29887008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.788 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:12:38 np0005476733 nova_compute[192580]: 2025-10-08 17:12:38.789 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.180 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.180 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.219 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.256 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.256 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.283 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.340 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.365 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.421 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.423 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:12:39 np0005476733 nova_compute[192580]: 2025-10-08 17:12:39.423 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:12:41 np0005476733 nova_compute[192580]: 2025-10-08 17:12:41.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:42 np0005476733 podman[282871]: 2025-10-08 17:12:42.217152544 +0000 UTC m=+0.051880249 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  8 13:12:42 np0005476733 nova_compute[192580]: 2025-10-08 17:12:42.423 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:43 np0005476733 nova_compute[192580]: 2025-10-08 17:12:43.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:46 np0005476733 nova_compute[192580]: 2025-10-08 17:12:46.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:48 np0005476733 nova_compute[192580]: 2025-10-08 17:12:48.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:48 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:12:48.648 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '103'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:12:51 np0005476733 podman[282893]: 2025-10-08 17:12:51.240962123 +0000 UTC m=+0.060543716 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:12:51 np0005476733 podman[282892]: 2025-10-08 17:12:51.258006137 +0000 UTC m=+0.074043267 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Oct  8 13:12:51 np0005476733 podman[282894]: 2025-10-08 17:12:51.284771422 +0000 UTC m=+0.097961491 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Oct  8 13:12:51 np0005476733 nova_compute[192580]: 2025-10-08 17:12:51.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:53 np0005476733 nova_compute[192580]: 2025-10-08 17:12:53.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:53 np0005476733 nova_compute[192580]: 2025-10-08 17:12:53.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:53 np0005476733 nova_compute[192580]: 2025-10-08 17:12:53.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 13:12:53 np0005476733 nova_compute[192580]: 2025-10-08 17:12:53.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 13:12:54 np0005476733 systemd-logind[827]: New session 173 of user zuul.
Oct  8 13:12:54 np0005476733 systemd[1]: Started Session 173 of User zuul.
Oct  8 13:12:54 np0005476733 systemd[1]: session-173.scope: Deactivated successfully.
Oct  8 13:12:54 np0005476733 systemd-logind[827]: Session 173 logged out. Waiting for processes to exit.
Oct  8 13:12:54 np0005476733 systemd-logind[827]: Removed session 173.
Oct  8 13:12:56 np0005476733 nova_compute[192580]: 2025-10-08 17:12:56.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:57 np0005476733 podman[282982]: 2025-10-08 17:12:57.232922876 +0000 UTC m=+0.055517045 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 13:12:57 np0005476733 podman[282983]: 2025-10-08 17:12:57.240196399 +0000 UTC m=+0.056649012 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:12:58 np0005476733 nova_compute[192580]: 2025-10-08 17:12:58.048 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:12:58 np0005476733 nova_compute[192580]: 2025-10-08 17:12:58.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:12:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:12:59Z|00290|pinctrl|WARN|Dropped 107 log messages in last 67 seconds (most recently, 10 seconds ago) due to excessive rate
Oct  8 13:12:59 np0005476733 ovn_controller[263831]: 2025-10-08T17:12:59Z|00291|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:13:01 np0005476733 nova_compute[192580]: 2025-10-08 17:13:01.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:03 np0005476733 nova_compute[192580]: 2025-10-08 17:13:03.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:06 np0005476733 nova_compute[192580]: 2025-10-08 17:13:06.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:07 np0005476733 podman[283027]: 2025-10-08 17:13:07.209732348 +0000 UTC m=+0.043685638 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:13:08 np0005476733 nova_compute[192580]: 2025-10-08 17:13:08.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:09 np0005476733 podman[283046]: 2025-10-08 17:13:09.264348864 +0000 UTC m=+0.086982460 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:13:10 np0005476733 nova_compute[192580]: 2025-10-08 17:13:10.607 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:11 np0005476733 nova_compute[192580]: 2025-10-08 17:13:11.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.589 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.590 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.590 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.591 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.591 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.591 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.612 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.612 2 WARNING nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.612 2 WARNING nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493 /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8e8e2abd6632b8926f12ff2e0bba1de20acba493#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 INFO nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bbb3d8d61ef6b68186f44149e3aba39e4a3bf32e#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Oct  8 13:13:12 np0005476733 nova_compute[192580]: 2025-10-08 17:13:12.613 2 DEBUG nova.virt.libvirt.imagecache [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Oct  8 13:13:13 np0005476733 podman[283072]: 2025-10-08 17:13:13.215823523 +0000 UTC m=+0.051125064 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  8 13:13:13 np0005476733 nova_compute[192580]: 2025-10-08 17:13:13.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:16 np0005476733 nova_compute[192580]: 2025-10-08 17:13:16.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:16 np0005476733 nova_compute[192580]: 2025-10-08 17:13:16.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:13:16 np0005476733 nova_compute[192580]: 2025-10-08 17:13:16.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:18 np0005476733 nova_compute[192580]: 2025-10-08 17:13:18.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:21 np0005476733 nova_compute[192580]: 2025-10-08 17:13:21.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:22 np0005476733 podman[283094]: 2025-10-08 17:13:22.215446388 +0000 UTC m=+0.045111402 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 13:13:22 np0005476733 podman[283096]: 2025-10-08 17:13:22.232852495 +0000 UTC m=+0.057292522 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc.)
Oct  8 13:13:22 np0005476733 podman[283095]: 2025-10-08 17:13:22.252870404 +0000 UTC m=+0.078846350 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:13:22 np0005476733 nova_compute[192580]: 2025-10-08 17:13:22.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:23 np0005476733 nova_compute[192580]: 2025-10-08 17:13:23.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:26.450 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:26.451 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:13:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:26.451 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.614 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.615 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:26 np0005476733 nova_compute[192580]: 2025-10-08 17:13:26.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:28 np0005476733 podman[283155]: 2025-10-08 17:13:28.230556113 +0000 UTC m=+0.057094385 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  8 13:13:28 np0005476733 podman[283156]: 2025-10-08 17:13:28.234999215 +0000 UTC m=+0.054528384 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:13:28 np0005476733 nova_compute[192580]: 2025-10-08 17:13:28.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:31 np0005476733 nova_compute[192580]: 2025-10-08 17:13:31.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:32 np0005476733 nova_compute[192580]: 2025-10-08 17:13:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:33 np0005476733 nova_compute[192580]: 2025-10-08 17:13:33.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:34 np0005476733 nova_compute[192580]: 2025-10-08 17:13:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:36 np0005476733 nova_compute[192580]: 2025-10-08 17:13:36.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:37.395 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=104, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=103) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:13:37 np0005476733 nova_compute[192580]: 2025-10-08 17:13:37.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:37 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:37.396 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:13:37 np0005476733 ovn_controller[263831]: 2025-10-08T17:13:37Z|00292|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  8 13:13:38 np0005476733 podman[283199]: 2025-10-08 17:13:38.227470667 +0000 UTC m=+0.051319910 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.651 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.651 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.798 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.799 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13634MB free_disk=111.29887008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.799 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.799 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.906 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.907 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.931 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.949 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.950 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:13:38 np0005476733 nova_compute[192580]: 2025-10-08 17:13:38.950 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:13:40 np0005476733 podman[283218]: 2025-10-08 17:13:40.278036573 +0000 UTC m=+0.106097231 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:13:41 np0005476733 nova_compute[192580]: 2025-10-08 17:13:41.951 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:13:41 np0005476733 nova_compute[192580]: 2025-10-08 17:13:41.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:43 np0005476733 nova_compute[192580]: 2025-10-08 17:13:43.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:44 np0005476733 podman[283245]: 2025-10-08 17:13:44.227159099 +0000 UTC m=+0.054592065 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 13:13:46 np0005476733 nova_compute[192580]: 2025-10-08 17:13:46.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:47 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:13:47.398 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '104'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:13:48 np0005476733 nova_compute[192580]: 2025-10-08 17:13:48.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:51 np0005476733 nova_compute[192580]: 2025-10-08 17:13:51.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:53 np0005476733 podman[283265]: 2025-10-08 17:13:53.226865317 +0000 UTC m=+0.059288886 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 13:13:53 np0005476733 podman[283267]: 2025-10-08 17:13:53.226876587 +0000 UTC m=+0.053662036 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 
'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 13:13:53 np0005476733 podman[283266]: 2025-10-08 17:13:53.227447954 +0000 UTC m=+0.056921658 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:13:53 np0005476733 nova_compute[192580]: 2025-10-08 17:13:53.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:56 np0005476733 nova_compute[192580]: 2025-10-08 17:13:56.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:58 np0005476733 nova_compute[192580]: 2025-10-08 17:13:58.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:13:59 np0005476733 podman[283328]: 2025-10-08 17:13:59.220650408 +0000 UTC m=+0.050052501 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 13:13:59 np0005476733 podman[283329]: 2025-10-08 17:13:59.264992795 +0000 UTC m=+0.088884571 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:14:01 np0005476733 nova_compute[192580]: 2025-10-08 17:14:01.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:02 np0005476733 systemd-logind[827]: New session 174 of user zuul.
Oct  8 13:14:02 np0005476733 systemd[1]: Started Session 174 of User zuul.
Oct  8 13:14:02 np0005476733 systemd[1]: session-174.scope: Deactivated successfully.
Oct  8 13:14:02 np0005476733 systemd-logind[827]: Session 174 logged out. Waiting for processes to exit.
Oct  8 13:14:02 np0005476733 systemd-logind[827]: Removed session 174.
Oct  8 13:14:03 np0005476733 nova_compute[192580]: 2025-10-08 17:14:03.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:03 np0005476733 nova_compute[192580]: 2025-10-08 17:14:03.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:06 np0005476733 nova_compute[192580]: 2025-10-08 17:14:06.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:08 np0005476733 ovn_controller[263831]: 2025-10-08T17:14:08Z|00293|pinctrl|WARN|Dropped 265 log messages in last 70 seconds (most recently, 20 seconds ago) due to excessive rate
Oct  8 13:14:08 np0005476733 ovn_controller[263831]: 2025-10-08T17:14:08Z|00294|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:14:08 np0005476733 nova_compute[192580]: 2025-10-08 17:14:08.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:09 np0005476733 podman[283395]: 2025-10-08 17:14:09.229049329 +0000 UTC m=+0.054274656 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  8 13:14:11 np0005476733 podman[283413]: 2025-10-08 17:14:11.293058577 +0000 UTC m=+0.119732876 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  8 13:14:11 np0005476733 nova_compute[192580]: 2025-10-08 17:14:11.603 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:12 np0005476733 nova_compute[192580]: 2025-10-08 17:14:12.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:13 np0005476733 nova_compute[192580]: 2025-10-08 17:14:13.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:15 np0005476733 podman[283441]: 2025-10-08 17:14:15.260013959 +0000 UTC m=+0.083175839 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 13:14:16 np0005476733 nova_compute[192580]: 2025-10-08 17:14:16.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:16 np0005476733 nova_compute[192580]: 2025-10-08 17:14:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:14:17 np0005476733 nova_compute[192580]: 2025-10-08 17:14:17.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:18 np0005476733 nova_compute[192580]: 2025-10-08 17:14:18.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:22 np0005476733 nova_compute[192580]: 2025-10-08 17:14:22.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:22 np0005476733 nova_compute[192580]: 2025-10-08 17:14:22.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:23 np0005476733 nova_compute[192580]: 2025-10-08 17:14:23.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:24 np0005476733 podman[283462]: 2025-10-08 17:14:24.255682668 +0000 UTC m=+0.079137499 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 13:14:24 np0005476733 podman[283463]: 2025-10-08 17:14:24.261615818 +0000 UTC m=+0.076472344 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:14:24 np0005476733 podman[283464]: 2025-10-08 17:14:24.269957544 +0000 UTC m=+0.090537013 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Oct  8 13:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:26.451 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:26.452 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:14:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:26.452 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:14:26 np0005476733 nova_compute[192580]: 2025-10-08 17:14:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:26 np0005476733 nova_compute[192580]: 2025-10-08 17:14:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:14:26 np0005476733 nova_compute[192580]: 2025-10-08 17:14:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:14:26 np0005476733 nova_compute[192580]: 2025-10-08 17:14:26.614 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:14:27 np0005476733 nova_compute[192580]: 2025-10-08 17:14:27.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:27 np0005476733 nova_compute[192580]: 2025-10-08 17:14:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:28 np0005476733 nova_compute[192580]: 2025-10-08 17:14:28.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:30 np0005476733 podman[283528]: 2025-10-08 17:14:30.226968432 +0000 UTC m=+0.047044954 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:14:30 np0005476733 podman[283527]: 2025-10-08 17:14:30.226983262 +0000 UTC m=+0.045012629 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 13:14:32 np0005476733 nova_compute[192580]: 2025-10-08 17:14:32.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:32 np0005476733 nova_compute[192580]: 2025-10-08 17:14:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:33 np0005476733 nova_compute[192580]: 2025-10-08 17:14:33.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:14:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:14:36 np0005476733 nova_compute[192580]: 2025-10-08 17:14:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:37 np0005476733 nova_compute[192580]: 2025-10-08 17:14:37.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:38 np0005476733 nova_compute[192580]: 2025-10-08 17:14:38.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:40 np0005476733 podman[283567]: 2025-10-08 17:14:40.245130495 +0000 UTC m=+0.062614742 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent)
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.620 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.621 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.805 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.807 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13650MB free_disk=111.29887008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.807 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.807 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.896 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.898 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.926 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.941 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.942 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:14:40 np0005476733 nova_compute[192580]: 2025-10-08 17:14:40.942 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:14:42 np0005476733 nova_compute[192580]: 2025-10-08 17:14:42.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:42 np0005476733 podman[283585]: 2025-10-08 17:14:42.252704038 +0000 UTC m=+0.082093895 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:14:43 np0005476733 nova_compute[192580]: 2025-10-08 17:14:43.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:43 np0005476733 nova_compute[192580]: 2025-10-08 17:14:43.943 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:14:46 np0005476733 podman[283612]: 2025-10-08 17:14:46.250221319 +0000 UTC m=+0.079081218 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 13:14:47 np0005476733 nova_compute[192580]: 2025-10-08 17:14:47.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:48 np0005476733 nova_compute[192580]: 2025-10-08 17:14:48.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:52 np0005476733 nova_compute[192580]: 2025-10-08 17:14:52.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:14:52Z|00295|pinctrl|WARN|Dropped 207 log messages in last 44 seconds (most recently, 6 seconds ago) due to excessive rate
Oct  8 13:14:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:14:52Z|00296|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:14:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:52.497 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=105, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=104) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:14:52 np0005476733 nova_compute[192580]: 2025-10-08 17:14:52.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:52.498 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:14:52 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:14:52.499 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '105'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:14:53 np0005476733 nova_compute[192580]: 2025-10-08 17:14:53.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:55 np0005476733 podman[283634]: 2025-10-08 17:14:55.230918281 +0000 UTC m=+0.057435147 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 13:14:55 np0005476733 podman[283632]: 2025-10-08 17:14:55.238371948 +0000 UTC m=+0.070757791 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:14:55 np0005476733 podman[283633]: 2025-10-08 17:14:55.245465145 +0000 UTC m=+0.075469352 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:14:57 np0005476733 nova_compute[192580]: 2025-10-08 17:14:57.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:14:58 np0005476733 nova_compute[192580]: 2025-10-08 17:14:58.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:01 np0005476733 podman[283691]: 2025-10-08 17:15:01.220465765 +0000 UTC m=+0.049961127 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:15:01 np0005476733 podman[283690]: 2025-10-08 17:15:01.221462887 +0000 UTC m=+0.055308778 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:15:02 np0005476733 nova_compute[192580]: 2025-10-08 17:15:02.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:03 np0005476733 nova_compute[192580]: 2025-10-08 17:15:03.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:07 np0005476733 nova_compute[192580]: 2025-10-08 17:15:07.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:08 np0005476733 nova_compute[192580]: 2025-10-08 17:15:08.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:11 np0005476733 podman[283733]: 2025-10-08 17:15:11.251466568 +0000 UTC m=+0.075929436 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct  8 13:15:12 np0005476733 nova_compute[192580]: 2025-10-08 17:15:12.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:13 np0005476733 podman[283753]: 2025-10-08 17:15:13.25743958 +0000 UTC m=+0.092039372 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 13:15:13 np0005476733 nova_compute[192580]: 2025-10-08 17:15:13.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:13 np0005476733 nova_compute[192580]: 2025-10-08 17:15:13.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:16 np0005476733 nova_compute[192580]: 2025-10-08 17:15:16.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:16 np0005476733 nova_compute[192580]: 2025-10-08 17:15:16.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:15:17 np0005476733 nova_compute[192580]: 2025-10-08 17:15:17.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:17 np0005476733 podman[283777]: 2025-10-08 17:15:17.215026018 +0000 UTC m=+0.050364290 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Oct  8 13:15:18 np0005476733 nova_compute[192580]: 2025-10-08 17:15:18.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:22 np0005476733 nova_compute[192580]: 2025-10-08 17:15:22.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:23 np0005476733 nova_compute[192580]: 2025-10-08 17:15:23.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:23 np0005476733 nova_compute[192580]: 2025-10-08 17:15:23.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:26 np0005476733 podman[283797]: 2025-10-08 17:15:26.230804967 +0000 UTC m=+0.059332227 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:15:26 np0005476733 podman[283804]: 2025-10-08 17:15:26.247858032 +0000 UTC m=+0.059960777 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 13:15:26 np0005476733 podman[283798]: 2025-10-08 17:15:26.259008389 +0000 UTC m=+0.080864745 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:26.453 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:26.453 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:15:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:26.453 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:15:26 np0005476733 nova_compute[192580]: 2025-10-08 17:15:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:26 np0005476733 nova_compute[192580]: 2025-10-08 17:15:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:15:26 np0005476733 nova_compute[192580]: 2025-10-08 17:15:26.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:15:26 np0005476733 nova_compute[192580]: 2025-10-08 17:15:26.615 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:15:27 np0005476733 nova_compute[192580]: 2025-10-08 17:15:27.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:28 np0005476733 nova_compute[192580]: 2025-10-08 17:15:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:28 np0005476733 nova_compute[192580]: 2025-10-08 17:15:28.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:32 np0005476733 nova_compute[192580]: 2025-10-08 17:15:32.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:32 np0005476733 podman[283863]: 2025-10-08 17:15:32.223726762 +0000 UTC m=+0.045883097 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:15:32 np0005476733 podman[283862]: 2025-10-08 17:15:32.235824579 +0000 UTC m=+0.064102379 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:15:32 np0005476733 nova_compute[192580]: 2025-10-08 17:15:32.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:33 np0005476733 nova_compute[192580]: 2025-10-08 17:15:33.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:37 np0005476733 nova_compute[192580]: 2025-10-08 17:15:37.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:37 np0005476733 nova_compute[192580]: 2025-10-08 17:15:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:38 np0005476733 nova_compute[192580]: 2025-10-08 17:15:38.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:40.591 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=106, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=105) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:15:40 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:40.591 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:15:40 np0005476733 nova_compute[192580]: 2025-10-08 17:15:40.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.636 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.637 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.638 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.639 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.860 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.863 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13641MB free_disk=111.29887008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.863 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.864 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.942 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.942 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.973 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.992 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.994 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:15:41 np0005476733 nova_compute[192580]: 2025-10-08 17:15:41.994 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:15:42 np0005476733 nova_compute[192580]: 2025-10-08 17:15:42.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:42 np0005476733 podman[283907]: 2025-10-08 17:15:42.280915102 +0000 UTC m=+0.094069407 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  8 13:15:43 np0005476733 nova_compute[192580]: 2025-10-08 17:15:43.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:44 np0005476733 podman[283926]: 2025-10-08 17:15:44.254455348 +0000 UTC m=+0.084233343 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 13:15:44 np0005476733 nova_compute[192580]: 2025-10-08 17:15:44.995 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:15:46 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:15:46.593 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '106'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:15:47 np0005476733 nova_compute[192580]: 2025-10-08 17:15:47.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:48 np0005476733 podman[283952]: 2025-10-08 17:15:48.22193622 +0000 UTC m=+0.048236223 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 13:15:48 np0005476733 nova_compute[192580]: 2025-10-08 17:15:48.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:52 np0005476733 nova_compute[192580]: 2025-10-08 17:15:52.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:53 np0005476733 nova_compute[192580]: 2025-10-08 17:15:53.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:15:54Z|00297|pinctrl|WARN|Dropped 185 log messages in last 62 seconds (most recently, 8 seconds ago) due to excessive rate
Oct  8 13:15:54 np0005476733 ovn_controller[263831]: 2025-10-08T17:15:54Z|00298|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:15:57 np0005476733 nova_compute[192580]: 2025-10-08 17:15:57.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:15:57 np0005476733 podman[283975]: 2025-10-08 17:15:57.226555233 +0000 UTC m=+0.050865726 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:15:57 np0005476733 podman[283974]: 2025-10-08 17:15:57.236934775 +0000 UTC m=+0.060382360 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:15:57 np0005476733 podman[283976]: 2025-10-08 17:15:57.237060429 +0000 UTC m=+0.057991924 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Oct  8 13:15:58 np0005476733 nova_compute[192580]: 2025-10-08 17:15:58.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:02 np0005476733 nova_compute[192580]: 2025-10-08 17:16:02.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:03 np0005476733 podman[284041]: 2025-10-08 17:16:03.225401708 +0000 UTC m=+0.047176178 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:16:03 np0005476733 podman[284040]: 2025-10-08 17:16:03.255868183 +0000 UTC m=+0.080802914 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:16:03 np0005476733 nova_compute[192580]: 2025-10-08 17:16:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:07 np0005476733 nova_compute[192580]: 2025-10-08 17:16:07.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:08 np0005476733 nova_compute[192580]: 2025-10-08 17:16:08.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:08 np0005476733 nova_compute[192580]: 2025-10-08 17:16:08.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:12 np0005476733 nova_compute[192580]: 2025-10-08 17:16:12.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:13 np0005476733 podman[284089]: 2025-10-08 17:16:13.259900574 +0000 UTC m=+0.084525322 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 13:16:13 np0005476733 nova_compute[192580]: 2025-10-08 17:16:13.684 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:13 np0005476733 nova_compute[192580]: 2025-10-08 17:16:13.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:15 np0005476733 podman[284109]: 2025-10-08 17:16:15.254030217 +0000 UTC m=+0.085847524 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 13:16:17 np0005476733 nova_compute[192580]: 2025-10-08 17:16:17.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:17 np0005476733 nova_compute[192580]: 2025-10-08 17:16:17.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:17 np0005476733 nova_compute[192580]: 2025-10-08 17:16:17.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:16:18 np0005476733 nova_compute[192580]: 2025-10-08 17:16:18.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:19 np0005476733 podman[284142]: 2025-10-08 17:16:19.22835415 +0000 UTC m=+0.057304011 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute)
Oct  8 13:16:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:21.724 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=107, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=106) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:16:21 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:21.726 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:16:21 np0005476733 nova_compute[192580]: 2025-10-08 17:16:21.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:22 np0005476733 nova_compute[192580]: 2025-10-08 17:16:22.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:23 np0005476733 nova_compute[192580]: 2025-10-08 17:16:23.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:24.728 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '107'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:16:25 np0005476733 nova_compute[192580]: 2025-10-08 17:16:25.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:26.454 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:26.455 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:16:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:16:26.455 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:16:26 np0005476733 nova_compute[192580]: 2025-10-08 17:16:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:26 np0005476733 nova_compute[192580]: 2025-10-08 17:16:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:16:26 np0005476733 nova_compute[192580]: 2025-10-08 17:16:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:16:26 np0005476733 nova_compute[192580]: 2025-10-08 17:16:26.676 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:16:27 np0005476733 nova_compute[192580]: 2025-10-08 17:16:27.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:28 np0005476733 podman[284170]: 2025-10-08 17:16:28.229470811 +0000 UTC m=+0.053151199 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:16:28 np0005476733 podman[284169]: 2025-10-08 17:16:28.244898354 +0000 UTC m=+0.069927175 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:16:28 np0005476733 podman[284171]: 2025-10-08 17:16:28.263702735 +0000 UTC m=+0.073686646 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Oct  8 13:16:28 np0005476733 nova_compute[192580]: 2025-10-08 17:16:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:28 np0005476733 nova_compute[192580]: 2025-10-08 17:16:28.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:32 np0005476733 nova_compute[192580]: 2025-10-08 17:16:32.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:33 np0005476733 nova_compute[192580]: 2025-10-08 17:16:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:33 np0005476733 nova_compute[192580]: 2025-10-08 17:16:33.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:34 np0005476733 podman[284232]: 2025-10-08 17:16:34.234206743 +0000 UTC m=+0.061230347 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:16:34 np0005476733 podman[284233]: 2025-10-08 17:16:34.242996784 +0000 UTC m=+0.067528358 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:16:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:16:37 np0005476733 nova_compute[192580]: 2025-10-08 17:16:37.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:38 np0005476733 nova_compute[192580]: 2025-10-08 17:16:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:38 np0005476733 nova_compute[192580]: 2025-10-08 17:16:38.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.684 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.685 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.685 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.685 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.852 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.853 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13664MB free_disk=111.2988166809082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.854 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:16:41 np0005476733 nova_compute[192580]: 2025-10-08 17:16:41.854 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.183 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.184 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.207 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.224 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.226 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.226 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:16:42 np0005476733 nova_compute[192580]: 2025-10-08 17:16:42.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:43 np0005476733 nova_compute[192580]: 2025-10-08 17:16:43.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:44 np0005476733 podman[284280]: 2025-10-08 17:16:44.289162243 +0000 UTC m=+0.107689043 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:16:45 np0005476733 nova_compute[192580]: 2025-10-08 17:16:45.226 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:46 np0005476733 podman[284299]: 2025-10-08 17:16:46.256256291 +0000 UTC m=+0.089939734 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 13:16:47 np0005476733 nova_compute[192580]: 2025-10-08 17:16:47.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:48 np0005476733 nova_compute[192580]: 2025-10-08 17:16:48.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:50 np0005476733 podman[284329]: 2025-10-08 17:16:50.249753814 +0000 UTC m=+0.073356014 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 13:16:52 np0005476733 nova_compute[192580]: 2025-10-08 17:16:52.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:16:52Z|00299|pinctrl|WARN|Dropped 135 log messages in last 58 seconds (most recently, 7 seconds ago) due to excessive rate
Oct  8 13:16:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:16:52Z|00300|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:16:53 np0005476733 nova_compute[192580]: 2025-10-08 17:16:53.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:54 np0005476733 nova_compute[192580]: 2025-10-08 17:16:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:16:54 np0005476733 nova_compute[192580]: 2025-10-08 17:16:54.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 13:16:57 np0005476733 nova_compute[192580]: 2025-10-08 17:16:57.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:58 np0005476733 nova_compute[192580]: 2025-10-08 17:16:58.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:16:59 np0005476733 podman[284353]: 2025-10-08 17:16:59.259380759 +0000 UTC m=+0.070189083 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 13:16:59 np0005476733 podman[284352]: 2025-10-08 17:16:59.259598226 +0000 UTC m=+0.064785630 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:16:59 np0005476733 podman[284351]: 2025-10-08 17:16:59.276556178 +0000 UTC m=+0.094910463 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd)
Oct  8 13:17:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:01.634 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=108, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=107) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:17:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:01.634 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:17:01 np0005476733 nova_compute[192580]: 2025-10-08 17:17:01.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:01.635 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '108'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:17:02 np0005476733 nova_compute[192580]: 2025-10-08 17:17:02.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:03 np0005476733 nova_compute[192580]: 2025-10-08 17:17:03.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:05 np0005476733 podman[284417]: 2025-10-08 17:17:05.25847059 +0000 UTC m=+0.070830643 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:17:05 np0005476733 podman[284416]: 2025-10-08 17:17:05.260649601 +0000 UTC m=+0.078795649 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  8 13:17:07 np0005476733 nova_compute[192580]: 2025-10-08 17:17:07.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:08 np0005476733 nova_compute[192580]: 2025-10-08 17:17:08.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:12 np0005476733 nova_compute[192580]: 2025-10-08 17:17:12.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:13 np0005476733 nova_compute[192580]: 2025-10-08 17:17:13.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:14 np0005476733 nova_compute[192580]: 2025-10-08 17:17:14.601 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:15 np0005476733 podman[284466]: 2025-10-08 17:17:15.227350811 +0000 UTC m=+0.053060017 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:17:17 np0005476733 podman[284486]: 2025-10-08 17:17:17.272012287 +0000 UTC m=+0.106764173 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 13:17:17 np0005476733 nova_compute[192580]: 2025-10-08 17:17:17.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:18 np0005476733 nova_compute[192580]: 2025-10-08 17:17:18.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:18 np0005476733 nova_compute[192580]: 2025-10-08 17:17:18.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:17:18 np0005476733 nova_compute[192580]: 2025-10-08 17:17:18.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:21 np0005476733 podman[284514]: 2025-10-08 17:17:21.256845244 +0000 UTC m=+0.084877023 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  8 13:17:22 np0005476733 nova_compute[192580]: 2025-10-08 17:17:22.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:23 np0005476733 nova_compute[192580]: 2025-10-08 17:17:23.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:26.455 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:26.456 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:17:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:26.456 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:17:26 np0005476733 nova_compute[192580]: 2025-10-08 17:17:26.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:26 np0005476733 nova_compute[192580]: 2025-10-08 17:17:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:17:26 np0005476733 nova_compute[192580]: 2025-10-08 17:17:26.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:17:26 np0005476733 nova_compute[192580]: 2025-10-08 17:17:26.653 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:17:27 np0005476733 nova_compute[192580]: 2025-10-08 17:17:27.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:27 np0005476733 nova_compute[192580]: 2025-10-08 17:17:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:28 np0005476733 nova_compute[192580]: 2025-10-08 17:17:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:28 np0005476733 nova_compute[192580]: 2025-10-08 17:17:28.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:30 np0005476733 podman[284539]: 2025-10-08 17:17:30.223878799 +0000 UTC m=+0.051043781 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Oct  8 13:17:30 np0005476733 podman[284537]: 2025-10-08 17:17:30.232125213 +0000 UTC m=+0.064589245 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Oct  8 13:17:30 np0005476733 podman[284538]: 2025-10-08 17:17:30.245462849 +0000 UTC m=+0.066640580 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:17:32 np0005476733 nova_compute[192580]: 2025-10-08 17:17:32.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:33 np0005476733 nova_compute[192580]: 2025-10-08 17:17:33.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:34 np0005476733 nova_compute[192580]: 2025-10-08 17:17:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:36 np0005476733 podman[284601]: 2025-10-08 17:17:36.222235486 +0000 UTC m=+0.047216010 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:17:36 np0005476733 podman[284600]: 2025-10-08 17:17:36.222870386 +0000 UTC m=+0.052705295 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:17:37 np0005476733 nova_compute[192580]: 2025-10-08 17:17:37.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:38 np0005476733 nova_compute[192580]: 2025-10-08 17:17:38.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:39 np0005476733 nova_compute[192580]: 2025-10-08 17:17:39.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:41 np0005476733 nova_compute[192580]: 2025-10-08 17:17:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:42 np0005476733 nova_compute[192580]: 2025-10-08 17:17:42.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:42.623 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=109, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=108) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:17:42 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:42.623 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:17:42 np0005476733 nova_compute[192580]: 2025-10-08 17:17:42.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.615 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.642 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.643 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.643 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.644 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.843 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.844 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13661MB free_disk=111.29923248291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.844 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.844 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.930 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 13:17:43 np0005476733 nova_compute[192580]: 2025-10-08 17:17:43.930 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.099 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.199 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.200 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.227 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.261 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.288 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.306 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.308 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 13:17:44 np0005476733 nova_compute[192580]: 2025-10-08 17:17:44.308 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:17:46 np0005476733 podman[284645]: 2025-10-08 17:17:46.228160498 +0000 UTC m=+0.058728767 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent)
Oct  8 13:17:46 np0005476733 nova_compute[192580]: 2025-10-08 17:17:46.282 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:17:47 np0005476733 nova_compute[192580]: 2025-10-08 17:17:47.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:48 np0005476733 podman[284664]: 2025-10-08 17:17:48.28642972 +0000 UTC m=+0.109664805 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 13:17:48 np0005476733 nova_compute[192580]: 2025-10-08 17:17:48.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:49 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:17:49.626 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '109'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 13:17:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:17:51Z|00301|pinctrl|WARN|Dropped 171 log messages in last 59 seconds (most recently, 2 seconds ago) due to excessive rate
Oct  8 13:17:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:17:51Z|00302|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:17:52 np0005476733 podman[284692]: 2025-10-08 17:17:52.22474774 +0000 UTC m=+0.054423969 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:17:52 np0005476733 nova_compute[192580]: 2025-10-08 17:17:52.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:53 np0005476733 nova_compute[192580]: 2025-10-08 17:17:53.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:54 np0005476733 nova_compute[192580]: 2025-10-08 17:17:54.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:17:54 np0005476733 nova_compute[192580]: 2025-10-08 17:17:54.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct  8 13:17:54 np0005476733 nova_compute[192580]: 2025-10-08 17:17:54.692 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct  8 13:17:57 np0005476733 nova_compute[192580]: 2025-10-08 17:17:57.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:17:58 np0005476733 nova_compute[192580]: 2025-10-08 17:17:58.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:00 np0005476733 podman[284713]: 2025-10-08 17:18:00.823055802 +0000 UTC m=+0.052015603 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:18:00 np0005476733 podman[284712]: 2025-10-08 17:18:00.829386665 +0000 UTC m=+0.061638390 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:18:00 np0005476733 podman[284714]: 2025-10-08 17:18:00.858924848 +0000 UTC m=+0.084240112 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public)
Oct  8 13:18:02 np0005476733 nova_compute[192580]: 2025-10-08 17:18:02.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:03 np0005476733 nova_compute[192580]: 2025-10-08 17:18:03.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:07 np0005476733 podman[284773]: 2025-10-08 17:18:07.235313397 +0000 UTC m=+0.062746206 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:18:07 np0005476733 podman[284774]: 2025-10-08 17:18:07.24103993 +0000 UTC m=+0.065591987 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:18:07 np0005476733 nova_compute[192580]: 2025-10-08 17:18:07.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:08 np0005476733 nova_compute[192580]: 2025-10-08 17:18:08.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:11 np0005476733 nova_compute[192580]: 2025-10-08 17:18:11.684 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:18:12 np0005476733 nova_compute[192580]: 2025-10-08 17:18:12.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:13 np0005476733 nova_compute[192580]: 2025-10-08 17:18:13.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:14 np0005476733 nova_compute[192580]: 2025-10-08 17:18:14.597 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:18:17 np0005476733 podman[284816]: 2025-10-08 17:18:17.240184296 +0000 UTC m=+0.056687253 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:18:17 np0005476733 nova_compute[192580]: 2025-10-08 17:18:17.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:18 np0005476733 nova_compute[192580]: 2025-10-08 17:18:18.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:19 np0005476733 podman[284836]: 2025-10-08 17:18:19.292936861 +0000 UTC m=+0.115617854 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:18:20 np0005476733 nova_compute[192580]: 2025-10-08 17:18:20.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 13:18:20 np0005476733 nova_compute[192580]: 2025-10-08 17:18:20.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 13:18:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:20.619 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=110, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=109) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 13:18:20 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:20.621 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  8 13:18:20 np0005476733 nova_compute[192580]: 2025-10-08 17:18:20.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:22 np0005476733 nova_compute[192580]: 2025-10-08 17:18:22.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 13:18:23 np0005476733 podman[284864]: 2025-10-08 17:18:23.090595767 +0000 UTC m=+0.049555034 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:18:24 np0005476733 nova_compute[192580]: 2025-10-08 17:18:24.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:26.457 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:26.457 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:26.458 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:18:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:18:26.623 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '110'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:18:27 np0005476733 nova_compute[192580]: 2025-10-08 17:18:27.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:27 np0005476733 nova_compute[192580]: 2025-10-08 17:18:27.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:28 np0005476733 nova_compute[192580]: 2025-10-08 17:18:28.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:28 np0005476733 nova_compute[192580]: 2025-10-08 17:18:28.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:18:28 np0005476733 nova_compute[192580]: 2025-10-08 17:18:28.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:18:28 np0005476733 nova_compute[192580]: 2025-10-08 17:18:28.642 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:18:29 np0005476733 nova_compute[192580]: 2025-10-08 17:18:29.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:29 np0005476733 nova_compute[192580]: 2025-10-08 17:18:29.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:31 np0005476733 podman[284888]: 2025-10-08 17:18:31.250454807 +0000 UTC m=+0.061743854 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm)
Oct  8 13:18:31 np0005476733 podman[284887]: 2025-10-08 17:18:31.278468092 +0000 UTC m=+0.105434499 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:18:31 np0005476733 podman[284886]: 2025-10-08 17:18:31.290611061 +0000 UTC m=+0.117631810 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 13:18:32 np0005476733 nova_compute[192580]: 2025-10-08 17:18:32.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:34 np0005476733 nova_compute[192580]: 2025-10-08 17:18:34.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:18:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:18:36 np0005476733 nova_compute[192580]: 2025-10-08 17:18:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:37 np0005476733 nova_compute[192580]: 2025-10-08 17:18:37.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:38 np0005476733 podman[284950]: 2025-10-08 17:18:38.228926117 +0000 UTC m=+0.054659748 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:18:38 np0005476733 podman[284949]: 2025-10-08 17:18:38.236221829 +0000 UTC m=+0.060760522 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, config_id=iscsid)
Oct  8 13:18:39 np0005476733 nova_compute[192580]: 2025-10-08 17:18:39.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:39 np0005476733 nova_compute[192580]: 2025-10-08 17:18:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:42 np0005476733 nova_compute[192580]: 2025-10-08 17:18:42.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.757 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.758 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13672MB free_disk=111.29923248291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.758 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.758 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.857 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.858 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.893 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.906 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.907 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:18:44 np0005476733 nova_compute[192580]: 2025-10-08 17:18:44.908 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:18:47 np0005476733 podman[284990]: 2025-10-08 17:18:47.451817426 +0000 UTC m=+0.046171025 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 13:18:47 np0005476733 nova_compute[192580]: 2025-10-08 17:18:47.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:47 np0005476733 nova_compute[192580]: 2025-10-08 17:18:47.907 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:18:49 np0005476733 nova_compute[192580]: 2025-10-08 17:18:49.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:50 np0005476733 podman[285011]: 2025-10-08 17:18:50.246912824 +0000 UTC m=+0.078413236 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 13:18:52 np0005476733 nova_compute[192580]: 2025-10-08 17:18:52.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:18:52Z|00303|pinctrl|WARN|Dropped 151 log messages in last 61 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 13:18:52 np0005476733 ovn_controller[263831]: 2025-10-08T17:18:52Z|00304|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:18:53 np0005476733 podman[285038]: 2025-10-08 17:18:53.218256525 +0000 UTC m=+0.047210579 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:18:54 np0005476733 nova_compute[192580]: 2025-10-08 17:18:54.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:57 np0005476733 nova_compute[192580]: 2025-10-08 17:18:57.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:18:59 np0005476733 nova_compute[192580]: 2025-10-08 17:18:59.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:01.668 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=111, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=110) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:19:01 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:01.669 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:19:01 np0005476733 nova_compute[192580]: 2025-10-08 17:19:01.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:02 np0005476733 podman[285063]: 2025-10-08 17:19:02.220319537 +0000 UTC m=+0.044190213 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:19:02 np0005476733 podman[285062]: 2025-10-08 17:19:02.223998945 +0000 UTC m=+0.049673889 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 13:19:02 np0005476733 podman[285064]: 2025-10-08 17:19:02.22666505 +0000 UTC m=+0.046934401 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 13:19:02 np0005476733 nova_compute[192580]: 2025-10-08 17:19:02.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:03.671 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '111'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:19:04 np0005476733 nova_compute[192580]: 2025-10-08 17:19:04.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:07 np0005476733 nova_compute[192580]: 2025-10-08 17:19:07.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:08 np0005476733 podman[285129]: 2025-10-08 17:19:08.985885342 +0000 UTC m=+0.051243057 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:19:09 np0005476733 podman[285128]: 2025-10-08 17:19:09.016963395 +0000 UTC m=+0.081231436 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 13:19:09 np0005476733 nova_compute[192580]: 2025-10-08 17:19:09.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:12 np0005476733 nova_compute[192580]: 2025-10-08 17:19:12.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:14 np0005476733 nova_compute[192580]: 2025-10-08 17:19:14.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:14 np0005476733 nova_compute[192580]: 2025-10-08 17:19:14.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:17 np0005476733 nova_compute[192580]: 2025-10-08 17:19:17.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:17 np0005476733 podman[285175]: 2025-10-08 17:19:17.778559944 +0000 UTC m=+0.070158013 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 13:19:19 np0005476733 nova_compute[192580]: 2025-10-08 17:19:19.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:21 np0005476733 podman[285195]: 2025-10-08 17:19:21.243714644 +0000 UTC m=+0.076065392 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 13:19:21 np0005476733 nova_compute[192580]: 2025-10-08 17:19:21.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:21 np0005476733 nova_compute[192580]: 2025-10-08 17:19:21.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:19:22 np0005476733 nova_compute[192580]: 2025-10-08 17:19:22.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:24 np0005476733 nova_compute[192580]: 2025-10-08 17:19:24.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:24 np0005476733 podman[285223]: 2025-10-08 17:19:24.27003469 +0000 UTC m=+0.097376522 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  8 13:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:26.458 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:26.458 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:19:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:26.458 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:19:27 np0005476733 nova_compute[192580]: 2025-10-08 17:19:27.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:29 np0005476733 nova_compute[192580]: 2025-10-08 17:19:29.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:29 np0005476733 nova_compute[192580]: 2025-10-08 17:19:29.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:30 np0005476733 nova_compute[192580]: 2025-10-08 17:19:30.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:30 np0005476733 nova_compute[192580]: 2025-10-08 17:19:30.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:19:30 np0005476733 nova_compute[192580]: 2025-10-08 17:19:30.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:19:30 np0005476733 nova_compute[192580]: 2025-10-08 17:19:30.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:19:30 np0005476733 nova_compute[192580]: 2025-10-08 17:19:30.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:32 np0005476733 nova_compute[192580]: 2025-10-08 17:19:32.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:33 np0005476733 podman[285243]: 2025-10-08 17:19:33.234895855 +0000 UTC m=+0.064849473 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:19:33 np0005476733 podman[285245]: 2025-10-08 17:19:33.246910249 +0000 UTC m=+0.063345395 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Oct  8 13:19:33 np0005476733 podman[285244]: 2025-10-08 17:19:33.264600424 +0000 UTC m=+0.080547895 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:19:34 np0005476733 nova_compute[192580]: 2025-10-08 17:19:34.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:36 np0005476733 nova_compute[192580]: 2025-10-08 17:19:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:37 np0005476733 nova_compute[192580]: 2025-10-08 17:19:37.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:38.709 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=112, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=111) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:19:38 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:38.709 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:19:38 np0005476733 nova_compute[192580]: 2025-10-08 17:19:38.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:39 np0005476733 podman[285308]: 2025-10-08 17:19:39.227409457 +0000 UTC m=+0.054046879 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 13:19:39 np0005476733 podman[285309]: 2025-10-08 17:19:39.253879653 +0000 UTC m=+0.077055534 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:19:39 np0005476733 nova_compute[192580]: 2025-10-08 17:19:39.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:39 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:19:39.712 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '112'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:19:41 np0005476733 nova_compute[192580]: 2025-10-08 17:19:41.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:42 np0005476733 nova_compute[192580]: 2025-10-08 17:19:42.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:44 np0005476733 nova_compute[192580]: 2025-10-08 17:19:44.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.622 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.623 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.624 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.800 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.801 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13662MB free_disk=111.29923248291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.802 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.802 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.875 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.876 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.905 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.921 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.923 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:19:46 np0005476733 nova_compute[192580]: 2025-10-08 17:19:46.924 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:19:47 np0005476733 nova_compute[192580]: 2025-10-08 17:19:47.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:48 np0005476733 podman[285355]: 2025-10-08 17:19:48.21697984 +0000 UTC m=+0.047293392 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 13:19:48 np0005476733 nova_compute[192580]: 2025-10-08 17:19:48.924 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:19:49 np0005476733 nova_compute[192580]: 2025-10-08 17:19:49.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:19:51Z|00305|pinctrl|WARN|Dropped 167 log messages in last 59 seconds (most recently, 3 seconds ago) due to excessive rate
Oct  8 13:19:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:19:51Z|00306|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:19:52 np0005476733 podman[285374]: 2025-10-08 17:19:52.033882691 +0000 UTC m=+0.108082095 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:19:52 np0005476733 nova_compute[192580]: 2025-10-08 17:19:52.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:54 np0005476733 nova_compute[192580]: 2025-10-08 17:19:54.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:54 np0005476733 podman[285400]: 2025-10-08 17:19:54.563155454 +0000 UTC m=+0.060809734 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 13:19:57 np0005476733 nova_compute[192580]: 2025-10-08 17:19:57.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:19:59 np0005476733 nova_compute[192580]: 2025-10-08 17:19:59.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:02 np0005476733 nova_compute[192580]: 2025-10-08 17:20:02.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:04 np0005476733 podman[285421]: 2025-10-08 17:20:04.216934683 +0000 UTC m=+0.046087514 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:20:04 np0005476733 podman[285420]: 2025-10-08 17:20:04.220740975 +0000 UTC m=+0.053103937 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Oct  8 13:20:04 np0005476733 podman[285422]: 2025-10-08 17:20:04.227289824 +0000 UTC m=+0.054804372 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git)
Oct  8 13:20:04 np0005476733 nova_compute[192580]: 2025-10-08 17:20:04.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:07 np0005476733 nova_compute[192580]: 2025-10-08 17:20:07.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:09 np0005476733 nova_compute[192580]: 2025-10-08 17:20:09.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:10 np0005476733 podman[285487]: 2025-10-08 17:20:10.216457339 +0000 UTC m=+0.043152810 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:20:10 np0005476733 podman[285486]: 2025-10-08 17:20:10.223152872 +0000 UTC m=+0.057567820 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:20:12 np0005476733 nova_compute[192580]: 2025-10-08 17:20:12.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:13 np0005476733 nova_compute[192580]: 2025-10-08 17:20:13.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:14 np0005476733 nova_compute[192580]: 2025-10-08 17:20:14.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:15 np0005476733 nova_compute[192580]: 2025-10-08 17:20:15.596 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:16.041 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=113, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=112) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:20:16 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:16.042 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:20:16 np0005476733 nova_compute[192580]: 2025-10-08 17:20:16.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:17 np0005476733 nova_compute[192580]: 2025-10-08 17:20:17.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:19 np0005476733 podman[285529]: 2025-10-08 17:20:19.001639892 +0000 UTC m=+0.074283255 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 13:20:19 np0005476733 nova_compute[192580]: 2025-10-08 17:20:19.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:22 np0005476733 podman[285548]: 2025-10-08 17:20:22.30060209 +0000 UTC m=+0.128494277 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  8 13:20:22 np0005476733 nova_compute[192580]: 2025-10-08 17:20:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:22 np0005476733 nova_compute[192580]: 2025-10-08 17:20:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:20:22 np0005476733 nova_compute[192580]: 2025-10-08 17:20:22.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:24 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:24.045 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '113'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:20:24 np0005476733 nova_compute[192580]: 2025-10-08 17:20:24.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:25 np0005476733 podman[285576]: 2025-10-08 17:20:25.270277217 +0000 UTC m=+0.097029992 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:26.459 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:26.460 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:20:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:20:26.460 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:20:27 np0005476733 nova_compute[192580]: 2025-10-08 17:20:27.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:29 np0005476733 nova_compute[192580]: 2025-10-08 17:20:29.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:29 np0005476733 nova_compute[192580]: 2025-10-08 17:20:29.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:30 np0005476733 nova_compute[192580]: 2025-10-08 17:20:30.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:31 np0005476733 nova_compute[192580]: 2025-10-08 17:20:31.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:31 np0005476733 nova_compute[192580]: 2025-10-08 17:20:31.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:20:31 np0005476733 nova_compute[192580]: 2025-10-08 17:20:31.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:20:31 np0005476733 nova_compute[192580]: 2025-10-08 17:20:31.615 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:20:32 np0005476733 nova_compute[192580]: 2025-10-08 17:20:32.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:34 np0005476733 nova_compute[192580]: 2025-10-08 17:20:34.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:35 np0005476733 podman[285600]: 2025-10-08 17:20:35.287998255 +0000 UTC m=+0.106968629 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2)
Oct  8 13:20:35 np0005476733 podman[285601]: 2025-10-08 17:20:35.294319957 +0000 UTC m=+0.103533869 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:20:35 np0005476733 podman[285602]: 2025-10-08 17:20:35.303438789 +0000 UTC m=+0.107517597 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:20:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:20:36 np0005476733 nova_compute[192580]: 2025-10-08 17:20:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:37 np0005476733 nova_compute[192580]: 2025-10-08 17:20:37.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:39 np0005476733 nova_compute[192580]: 2025-10-08 17:20:39.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:41 np0005476733 podman[285665]: 2025-10-08 17:20:41.271310823 +0000 UTC m=+0.081731553 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:20:41 np0005476733 podman[285666]: 2025-10-08 17:20:41.290654581 +0000 UTC m=+0.089409138 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:20:42 np0005476733 nova_compute[192580]: 2025-10-08 17:20:42.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:43 np0005476733 nova_compute[192580]: 2025-10-08 17:20:43.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:44 np0005476733 nova_compute[192580]: 2025-10-08 17:20:44.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.659 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.660 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.661 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.661 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.901 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.903 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13660MB free_disk=111.29923248291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.903 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.903 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.978 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:20:46 np0005476733 nova_compute[192580]: 2025-10-08 17:20:46.978 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:20:47 np0005476733 nova_compute[192580]: 2025-10-08 17:20:47.006 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:20:47 np0005476733 nova_compute[192580]: 2025-10-08 17:20:47.026 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:20:47 np0005476733 nova_compute[192580]: 2025-10-08 17:20:47.028 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:20:47 np0005476733 nova_compute[192580]: 2025-10-08 17:20:47.028 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:20:47 np0005476733 nova_compute[192580]: 2025-10-08 17:20:47.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:49 np0005476733 nova_compute[192580]: 2025-10-08 17:20:49.028 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:20:49 np0005476733 podman[285710]: 2025-10-08 17:20:49.230952525 +0000 UTC m=+0.063293183 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:20:49 np0005476733 nova_compute[192580]: 2025-10-08 17:20:49.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:20:51Z|00307|pinctrl|WARN|Dropped 159 log messages in last 59 seconds (most recently, 1 seconds ago) due to excessive rate
Oct  8 13:20:51 np0005476733 ovn_controller[263831]: 2025-10-08T17:20:51Z|00308|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:20:52 np0005476733 nova_compute[192580]: 2025-10-08 17:20:52.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:53 np0005476733 podman[285729]: 2025-10-08 17:20:53.318010268 +0000 UTC m=+0.148516176 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:20:54 np0005476733 nova_compute[192580]: 2025-10-08 17:20:54.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:56 np0005476733 podman[285755]: 2025-10-08 17:20:56.279775772 +0000 UTC m=+0.101141423 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 13:20:57 np0005476733 nova_compute[192580]: 2025-10-08 17:20:57.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:20:59 np0005476733 nova_compute[192580]: 2025-10-08 17:20:59.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:02 np0005476733 nova_compute[192580]: 2025-10-08 17:21:02.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:04 np0005476733 nova_compute[192580]: 2025-10-08 17:21:04.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:06 np0005476733 podman[285774]: 2025-10-08 17:21:06.266642044 +0000 UTC m=+0.084159650 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:21:06 np0005476733 podman[285775]: 2025-10-08 17:21:06.280372833 +0000 UTC m=+0.094104567 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Oct  8 13:21:06 np0005476733 podman[285773]: 2025-10-08 17:21:06.288191322 +0000 UTC m=+0.108443555 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 13:21:07 np0005476733 nova_compute[192580]: 2025-10-08 17:21:07.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:09 np0005476733 nova_compute[192580]: 2025-10-08 17:21:09.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:12 np0005476733 podman[285831]: 2025-10-08 17:21:12.267973798 +0000 UTC m=+0.087043643 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:21:12 np0005476733 podman[285832]: 2025-10-08 17:21:12.29871952 +0000 UTC m=+0.112439224 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:21:12 np0005476733 nova_compute[192580]: 2025-10-08 17:21:12.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:14 np0005476733 nova_compute[192580]: 2025-10-08 17:21:14.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:16 np0005476733 systemd[1]: session-64.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-110.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-67.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-115.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-84.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-94.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 67 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 64 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 110 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 84 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 115 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 94 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 64.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 110.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 67.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 115.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 84.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 94.
Oct  8 13:21:16 np0005476733 systemd[1]: session-98.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-100.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 98 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-149.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 100 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 149 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 98.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 100.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 149.
Oct  8 13:21:16 np0005476733 systemd[1]: session-61.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 61 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-83.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 83 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-139.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-133.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 61.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 139 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-143.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-144.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-97.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-126.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-81.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-109.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-121.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-147.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 133 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-123.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-135.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 97 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-118.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-113.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-75.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-92.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 126 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-120.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-54.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-73.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-95.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 143 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-69.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-141.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-58.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-66.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-72.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-107.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-52.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-112.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-63.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 109 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 144 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 121 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 135 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 123 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 81 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 147 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 120 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 118 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 113 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 75 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 92 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 63 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 112 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 73 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 95 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 52 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 72 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 69 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 54 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 107 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 66 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 141 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 58 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 83.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 139.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 133.
Oct  8 13:21:16 np0005476733 systemd[1]: session-89.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd[1]: session-146.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 143.
Oct  8 13:21:16 np0005476733 systemd[1]: session-56.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 146 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 89 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 56 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 144.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 97.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 126.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 81.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 109.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 121.
Oct  8 13:21:16 np0005476733 systemd[1]: session-87.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 147.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 87 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-137.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 123.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 137 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd[1]: session-70.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 70 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 135.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 118.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 113.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 75.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 92.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 120.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 54.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 73.
Oct  8 13:21:16 np0005476733 systemd[1]: session-124.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 124 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 95.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 69.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 141.
Oct  8 13:21:16 np0005476733 systemd[1]: session-86.scope: Deactivated successfully.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Session 86 logged out. Waiting for processes to exit.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 58.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 66.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 72.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 107.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 52.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 112.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 63.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 89.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 146.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 56.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 87.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 137.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 70.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 124.
Oct  8 13:21:16 np0005476733 systemd-logind[827]: Removed session 86.
Oct  8 13:21:16 np0005476733 nova_compute[192580]: 2025-10-08 17:21:16.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:17 np0005476733 nova_compute[192580]: 2025-10-08 17:21:17.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:19 np0005476733 nova_compute[192580]: 2025-10-08 17:21:19.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:20 np0005476733 podman[285878]: 2025-10-08 17:21:20.280414791 +0000 UTC m=+0.090656817 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:21:22 np0005476733 nova_compute[192580]: 2025-10-08 17:21:22.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:22 np0005476733 nova_compute[192580]: 2025-10-08 17:21:22.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:21:22 np0005476733 nova_compute[192580]: 2025-10-08 17:21:22.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:24 np0005476733 podman[285898]: 2025-10-08 17:21:24.333420722 +0000 UTC m=+0.152084710 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:21:24 np0005476733 nova_compute[192580]: 2025-10-08 17:21:24.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:21:26.461 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:21:26.462 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:21:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:21:26.462 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:21:26 np0005476733 podman[285924]: 2025-10-08 17:21:26.603370439 +0000 UTC m=+0.095132961 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  8 13:21:27 np0005476733 nova_compute[192580]: 2025-10-08 17:21:27.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:29 np0005476733 nova_compute[192580]: 2025-10-08 17:21:29.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:31 np0005476733 nova_compute[192580]: 2025-10-08 17:21:31.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:31 np0005476733 nova_compute[192580]: 2025-10-08 17:21:31.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:32 np0005476733 nova_compute[192580]: 2025-10-08 17:21:32.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:33 np0005476733 nova_compute[192580]: 2025-10-08 17:21:33.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:33 np0005476733 nova_compute[192580]: 2025-10-08 17:21:33.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:21:33 np0005476733 nova_compute[192580]: 2025-10-08 17:21:33.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:21:34 np0005476733 nova_compute[192580]: 2025-10-08 17:21:34.088 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:21:34 np0005476733 nova_compute[192580]: 2025-10-08 17:21:34.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:37 np0005476733 podman[285945]: 2025-10-08 17:21:37.251260255 +0000 UTC m=+0.078383496 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:21:37 np0005476733 podman[285946]: 2025-10-08 17:21:37.251284966 +0000 UTC m=+0.074695058 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=)
Oct  8 13:21:37 np0005476733 podman[285944]: 2025-10-08 17:21:37.251861094 +0000 UTC m=+0.080421370 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 13:21:37 np0005476733 nova_compute[192580]: 2025-10-08 17:21:37.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:37 np0005476733 nova_compute[192580]: 2025-10-08 17:21:37.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:39 np0005476733 nova_compute[192580]: 2025-10-08 17:21:39.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:42 np0005476733 nova_compute[192580]: 2025-10-08 17:21:42.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:43 np0005476733 podman[286010]: 2025-10-08 17:21:43.263914571 +0000 UTC m=+0.080236755 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 13:21:43 np0005476733 podman[286011]: 2025-10-08 17:21:43.281042378 +0000 UTC m=+0.086865787 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:21:44 np0005476733 nova_compute[192580]: 2025-10-08 17:21:44.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:44 np0005476733 nova_compute[192580]: 2025-10-08 17:21:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.616 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.617 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.825 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.827 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13826MB free_disk=111.29923248291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.827 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.827 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.904 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.905 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.932 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.945 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.946 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:21:47 np0005476733 nova_compute[192580]: 2025-10-08 17:21:47.946 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:21:49 np0005476733 nova_compute[192580]: 2025-10-08 17:21:49.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:49 np0005476733 nova_compute[192580]: 2025-10-08 17:21:49.946 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:50 np0005476733 podman[286054]: 2025-10-08 17:21:50.649875671 +0000 UTC m=+0.115192072 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 13:21:52 np0005476733 nova_compute[192580]: 2025-10-08 17:21:52.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:54 np0005476733 nova_compute[192580]: 2025-10-08 17:21:54.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:21:55Z|00309|pinctrl|WARN|Dropped 203 log messages in last 64 seconds (most recently, 20 seconds ago) due to excessive rate
Oct  8 13:21:55 np0005476733 ovn_controller[263831]: 2025-10-08T17:21:55Z|00310|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:21:55 np0005476733 podman[286074]: 2025-10-08 17:21:55.288920193 +0000 UTC m=+0.113892899 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 13:21:57 np0005476733 podman[286103]: 2025-10-08 17:21:57.235134295 +0000 UTC m=+0.059439911 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:21:57 np0005476733 nova_compute[192580]: 2025-10-08 17:21:57.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:59 np0005476733 nova_compute[192580]: 2025-10-08 17:21:59.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:21:59 np0005476733 nova_compute[192580]: 2025-10-08 17:21:59.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:21:59 np0005476733 nova_compute[192580]: 2025-10-08 17:21:59.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 13:22:02 np0005476733 nova_compute[192580]: 2025-10-08 17:22:02.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:04 np0005476733 nova_compute[192580]: 2025-10-08 17:22:04.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:07 np0005476733 podman[286124]: 2025-10-08 17:22:07.41987093 +0000 UTC m=+0.079629745 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:22:07 np0005476733 podman[286125]: 2025-10-08 17:22:07.451571513 +0000 UTC m=+0.106546125 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:22:07 np0005476733 podman[286126]: 2025-10-08 17:22:07.460553371 +0000 UTC m=+0.111683611 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Oct  8 13:22:07 np0005476733 nova_compute[192580]: 2025-10-08 17:22:07.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:09 np0005476733 nova_compute[192580]: 2025-10-08 17:22:09.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:12 np0005476733 nova_compute[192580]: 2025-10-08 17:22:12.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:14 np0005476733 podman[286188]: 2025-10-08 17:22:14.262486628 +0000 UTC m=+0.080577076 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:22:14 np0005476733 podman[286187]: 2025-10-08 17:22:14.300660707 +0000 UTC m=+0.122707731 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:22:14 np0005476733 nova_compute[192580]: 2025-10-08 17:22:14.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:15 np0005476733 nova_compute[192580]: 2025-10-08 17:22:15.597 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:16 np0005476733 systemd-logind[827]: New session 175 of user zuul.
Oct  8 13:22:16 np0005476733 systemd[1]: Started Session 175 of User zuul.
Oct  8 13:22:17 np0005476733 nova_compute[192580]: 2025-10-08 17:22:17.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:18 np0005476733 nova_compute[192580]: 2025-10-08 17:22:18.604 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:19 np0005476733 nova_compute[192580]: 2025-10-08 17:22:19.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:21 np0005476733 podman[286391]: 2025-10-08 17:22:21.26233653 +0000 UTC m=+0.080337449 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:22:21 np0005476733 ovs-vsctl[286426]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  8 13:22:22 np0005476733 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 286258 (sos)
Oct  8 13:22:22 np0005476733 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  8 13:22:22 np0005476733 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  8 13:22:22 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  8 13:22:22 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  8 13:22:22 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 13:22:22 np0005476733 nova_compute[192580]: 2025-10-08 17:22:22.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:23 np0005476733 kernel: block sr0: the capability attribute has been deprecated.
Oct  8 13:22:23 np0005476733 nova_compute[192580]: 2025-10-08 17:22:23.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:23 np0005476733 nova_compute[192580]: 2025-10-08 17:22:23.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:22:24 np0005476733 nova_compute[192580]: 2025-10-08 17:22:24.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:26 np0005476733 podman[286945]: 2025-10-08 17:22:26.298394279 +0000 UTC m=+0.110547824 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:22:26 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 13:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:22:26.462 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:22:26.462 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:22:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:22:26.462 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:22:26 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 13:22:27 np0005476733 podman[287051]: 2025-10-08 17:22:27.585715705 +0000 UTC m=+0.100879695 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 13:22:27 np0005476733 nova_compute[192580]: 2025-10-08 17:22:27.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:29 np0005476733 nova_compute[192580]: 2025-10-08 17:22:29.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:31 np0005476733 nova_compute[192580]: 2025-10-08 17:22:31.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:32 np0005476733 nova_compute[192580]: 2025-10-08 17:22:32.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:33 np0005476733 nova_compute[192580]: 2025-10-08 17:22:33.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:33 np0005476733 nova_compute[192580]: 2025-10-08 17:22:33.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:22:33 np0005476733 nova_compute[192580]: 2025-10-08 17:22:33.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:22:33 np0005476733 nova_compute[192580]: 2025-10-08 17:22:33.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:22:33 np0005476733 nova_compute[192580]: 2025-10-08 17:22:33.613 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:33 np0005476733 ovs-appctl[287987]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 13:22:33 np0005476733 ovs-appctl[287992]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 13:22:33 np0005476733 ovs-appctl[287997]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 13:22:34 np0005476733 nova_compute[192580]: 2025-10-08 17:22:34.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.082 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:22:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:22:37 np0005476733 nova_compute[192580]: 2025-10-08 17:22:37.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:37 np0005476733 podman[289144]: 2025-10-08 17:22:37.843829216 +0000 UTC m=+0.059973828 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 13:22:37 np0005476733 podman[289143]: 2025-10-08 17:22:37.851218441 +0000 UTC m=+0.065690419 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:22:37 np0005476733 podman[289142]: 2025-10-08 17:22:37.854109814 +0000 UTC m=+0.066570468 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 13:22:38 np0005476733 nova_compute[192580]: 2025-10-08 17:22:38.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:39 np0005476733 nova_compute[192580]: 2025-10-08 17:22:39.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:42 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 13:22:42 np0005476733 nova_compute[192580]: 2025-10-08 17:22:42.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:43 np0005476733 nova_compute[192580]: 2025-10-08 17:22:43.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:44 np0005476733 podman[289737]: 2025-10-08 17:22:44.403212363 +0000 UTC m=+0.087455546 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:22:44 np0005476733 podman[289743]: 2025-10-08 17:22:44.472454245 +0000 UTC m=+0.124848030 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:22:44 np0005476733 nova_compute[192580]: 2025-10-08 17:22:44.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:44 np0005476733 nova_compute[192580]: 2025-10-08 17:22:44.606 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:44 np0005476733 systemd[1]: Starting Time & Date Service...
Oct  8 13:22:44 np0005476733 systemd[1]: Started Time & Date Service.
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.629 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.629 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.831 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.832 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13280MB free_disk=110.65943145751953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.832 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:22:47 np0005476733 nova_compute[192580]: 2025-10-08 17:22:47.832 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.186 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.187 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.338 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.439 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.440 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.479 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.507 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.533 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.618 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.764 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:22:48 np0005476733 nova_compute[192580]: 2025-10-08 17:22:48.765 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:22:49 np0005476733 nova_compute[192580]: 2025-10-08 17:22:49.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:52 np0005476733 podman[289792]: 2025-10-08 17:22:52.256768715 +0000 UTC m=+0.077852629 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 13:22:52 np0005476733 nova_compute[192580]: 2025-10-08 17:22:52.766 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:22:52 np0005476733 nova_compute[192580]: 2025-10-08 17:22:52.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:54 np0005476733 nova_compute[192580]: 2025-10-08 17:22:54.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:57 np0005476733 ovn_controller[263831]: 2025-10-08T17:22:57Z|00311|pinctrl|WARN|Dropped 31 log messages in last 62 seconds (most recently, 22 seconds ago) due to excessive rate
Oct  8 13:22:57 np0005476733 ovn_controller[263831]: 2025-10-08T17:22:57Z|00312|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:22:57 np0005476733 podman[289812]: 2025-10-08 17:22:57.301017465 +0000 UTC m=+0.121304847 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:22:57 np0005476733 nova_compute[192580]: 2025-10-08 17:22:57.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:22:58 np0005476733 podman[289838]: 2025-10-08 17:22:58.248476511 +0000 UTC m=+0.080202153 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:22:59 np0005476733 nova_compute[192580]: 2025-10-08 17:22:59.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:01 np0005476733 nova_compute[192580]: 2025-10-08 17:23:01.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:01 np0005476733 nova_compute[192580]: 2025-10-08 17:23:01.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 13:23:01 np0005476733 nova_compute[192580]: 2025-10-08 17:23:01.758 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 13:23:02 np0005476733 nova_compute[192580]: 2025-10-08 17:23:02.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:04 np0005476733 nova_compute[192580]: 2025-10-08 17:23:04.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:07 np0005476733 nova_compute[192580]: 2025-10-08 17:23:07.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:08 np0005476733 podman[289861]: 2025-10-08 17:23:08.230421337 +0000 UTC m=+0.055725872 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:23:08 np0005476733 podman[289860]: 2025-10-08 17:23:08.260823748 +0000 UTC m=+0.080852824 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:23:08 np0005476733 podman[289862]: 2025-10-08 17:23:08.26931668 +0000 UTC m=+0.083383236 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 13:23:09 np0005476733 nova_compute[192580]: 2025-10-08 17:23:09.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:09 np0005476733 systemd[1]: session-175.scope: Deactivated successfully.
Oct  8 13:23:09 np0005476733 systemd[1]: session-175.scope: Consumed 1min 28.910s CPU time, 646.5M memory peak, read 106.7M from disk, written 20.1M to disk.
Oct  8 13:23:09 np0005476733 systemd-logind[827]: Session 175 logged out. Waiting for processes to exit.
Oct  8 13:23:09 np0005476733 systemd-logind[827]: Removed session 175.
Oct  8 13:23:09 np0005476733 systemd-logind[827]: New session 176 of user zuul.
Oct  8 13:23:09 np0005476733 systemd[1]: Started Session 176 of User zuul.
Oct  8 13:23:10 np0005476733 systemd[1]: session-176.scope: Deactivated successfully.
Oct  8 13:23:10 np0005476733 systemd-logind[827]: Session 176 logged out. Waiting for processes to exit.
Oct  8 13:23:10 np0005476733 systemd-logind[827]: Removed session 176.
Oct  8 13:23:10 np0005476733 systemd-logind[827]: New session 177 of user zuul.
Oct  8 13:23:10 np0005476733 systemd[1]: Started Session 177 of User zuul.
Oct  8 13:23:10 np0005476733 systemd[1]: session-177.scope: Deactivated successfully.
Oct  8 13:23:10 np0005476733 systemd-logind[827]: Session 177 logged out. Waiting for processes to exit.
Oct  8 13:23:10 np0005476733 systemd-logind[827]: Removed session 177.
Oct  8 13:23:12 np0005476733 nova_compute[192580]: 2025-10-08 17:23:12.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:14 np0005476733 nova_compute[192580]: 2025-10-08 17:23:14.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:14 np0005476733 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 13:23:14 np0005476733 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 13:23:14 np0005476733 podman[289981]: 2025-10-08 17:23:14.95399905 +0000 UTC m=+0.087099884 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:23:14 np0005476733 podman[289982]: 2025-10-08 17:23:14.961890772 +0000 UTC m=+0.088349203 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:23:17 np0005476733 nova_compute[192580]: 2025-10-08 17:23:17.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:19 np0005476733 nova_compute[192580]: 2025-10-08 17:23:19.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:19 np0005476733 nova_compute[192580]: 2025-10-08 17:23:19.748 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:22 np0005476733 nova_compute[192580]: 2025-10-08 17:23:22.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:23 np0005476733 podman[290034]: 2025-10-08 17:23:23.215984404 +0000 UTC m=+0.049141211 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 13:23:24 np0005476733 nova_compute[192580]: 2025-10-08 17:23:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:24 np0005476733 nova_compute[192580]: 2025-10-08 17:23:24.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:23:24 np0005476733 nova_compute[192580]: 2025-10-08 17:23:24.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:26 np0005476733 nova_compute[192580]: 2025-10-08 17:23:26.024 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:23:26.463 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:23:26.463 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:23:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:23:26.464 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:23:27 np0005476733 podman[290053]: 2025-10-08 17:23:27.49432339 +0000 UTC m=+0.105035968 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:23:27 np0005476733 nova_compute[192580]: 2025-10-08 17:23:27.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:29 np0005476733 podman[290079]: 2025-10-08 17:23:29.305682032 +0000 UTC m=+0.118172877 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 13:23:29 np0005476733 nova_compute[192580]: 2025-10-08 17:23:29.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:31 np0005476733 nova_compute[192580]: 2025-10-08 17:23:31.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:32 np0005476733 nova_compute[192580]: 2025-10-08 17:23:32.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:33 np0005476733 nova_compute[192580]: 2025-10-08 17:23:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:33 np0005476733 nova_compute[192580]: 2025-10-08 17:23:33.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:23:33 np0005476733 nova_compute[192580]: 2025-10-08 17:23:33.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:23:33 np0005476733 nova_compute[192580]: 2025-10-08 17:23:33.613 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:23:34 np0005476733 nova_compute[192580]: 2025-10-08 17:23:34.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:35 np0005476733 nova_compute[192580]: 2025-10-08 17:23:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:37 np0005476733 nova_compute[192580]: 2025-10-08 17:23:37.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:39 np0005476733 podman[290100]: 2025-10-08 17:23:39.063768954 +0000 UTC m=+0.068427027 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:23:39 np0005476733 podman[290099]: 2025-10-08 17:23:39.092410869 +0000 UTC m=+0.093864740 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct  8 13:23:39 np0005476733 podman[290101]: 2025-10-08 17:23:39.105047442 +0000 UTC m=+0.097440874 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7)
Oct  8 13:23:39 np0005476733 nova_compute[192580]: 2025-10-08 17:23:39.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:40 np0005476733 nova_compute[192580]: 2025-10-08 17:23:40.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:42 np0005476733 nova_compute[192580]: 2025-10-08 17:23:42.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:44 np0005476733 nova_compute[192580]: 2025-10-08 17:23:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:44 np0005476733 nova_compute[192580]: 2025-10-08 17:23:44.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:45 np0005476733 podman[290166]: 2025-10-08 17:23:45.240058179 +0000 UTC m=+0.066922550 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:23:45 np0005476733 podman[290165]: 2025-10-08 17:23:45.258332822 +0000 UTC m=+0.083365354 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid)
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.630 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.631 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.868 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.869 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13711MB free_disk=111.29898071289062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.870 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:23:48 np0005476733 nova_compute[192580]: 2025-10-08 17:23:48.870 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.020 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.021 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.068 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.141 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.192 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.192 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:23:49 np0005476733 nova_compute[192580]: 2025-10-08 17:23:49.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:53 np0005476733 nova_compute[192580]: 2025-10-08 17:23:53.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:54 np0005476733 nova_compute[192580]: 2025-10-08 17:23:54.193 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:23:54 np0005476733 podman[290209]: 2025-10-08 17:23:54.282607458 +0000 UTC m=+0.098421226 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:23:54 np0005476733 nova_compute[192580]: 2025-10-08 17:23:54.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:58 np0005476733 nova_compute[192580]: 2025-10-08 17:23:58.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:23:58 np0005476733 ovn_controller[263831]: 2025-10-08T17:23:58Z|00313|pinctrl|WARN|Dropped 31 log messages in last 61 seconds (most recently, 23 seconds ago) due to excessive rate
Oct  8 13:23:58 np0005476733 ovn_controller[263831]: 2025-10-08T17:23:58Z|00314|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:23:58 np0005476733 podman[290228]: 2025-10-08 17:23:58.271210323 +0000 UTC m=+0.092542629 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:23:59 np0005476733 podman[290255]: 2025-10-08 17:23:59.425370714 +0000 UTC m=+0.079101809 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 13:23:59 np0005476733 nova_compute[192580]: 2025-10-08 17:23:59.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:03 np0005476733 nova_compute[192580]: 2025-10-08 17:24:03.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:04 np0005476733 nova_compute[192580]: 2025-10-08 17:24:04.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:08 np0005476733 nova_compute[192580]: 2025-10-08 17:24:08.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:09 np0005476733 podman[290279]: 2025-10-08 17:24:09.269984422 +0000 UTC m=+0.074940786 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Oct  8 13:24:09 np0005476733 podman[290278]: 2025-10-08 17:24:09.28776043 +0000 UTC m=+0.102126674 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:24:09 np0005476733 podman[290277]: 2025-10-08 17:24:09.299248387 +0000 UTC m=+0.116643697 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:24:09 np0005476733 nova_compute[192580]: 2025-10-08 17:24:09.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:13 np0005476733 nova_compute[192580]: 2025-10-08 17:24:13.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:14 np0005476733 nova_compute[192580]: 2025-10-08 17:24:14.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:16 np0005476733 podman[290340]: 2025-10-08 17:24:16.242779229 +0000 UTC m=+0.062813229 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:24:16 np0005476733 podman[290341]: 2025-10-08 17:24:16.258512981 +0000 UTC m=+0.069375658 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:24:18 np0005476733 nova_compute[192580]: 2025-10-08 17:24:18.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:19 np0005476733 nova_compute[192580]: 2025-10-08 17:24:19.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:19 np0005476733 nova_compute[192580]: 2025-10-08 17:24:19.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:19 np0005476733 nova_compute[192580]: 2025-10-08 17:24:19.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:23 np0005476733 nova_compute[192580]: 2025-10-08 17:24:23.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:24 np0005476733 nova_compute[192580]: 2025-10-08 17:24:24.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:24 np0005476733 nova_compute[192580]: 2025-10-08 17:24:24.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:24:24 np0005476733 nova_compute[192580]: 2025-10-08 17:24:24.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:25 np0005476733 podman[290385]: 2025-10-08 17:24:25.266461073 +0000 UTC m=+0.086805725 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 13:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:24:26.464 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:24:26.465 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:24:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:24:26.465 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:24:28 np0005476733 nova_compute[192580]: 2025-10-08 17:24:28.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:29 np0005476733 podman[290408]: 2025-10-08 17:24:29.298433775 +0000 UTC m=+0.117307019 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:24:29 np0005476733 nova_compute[192580]: 2025-10-08 17:24:29.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:30 np0005476733 podman[290435]: 2025-10-08 17:24:30.27395316 +0000 UTC m=+0.089308564 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 13:24:32 np0005476733 nova_compute[192580]: 2025-10-08 17:24:32.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:33 np0005476733 nova_compute[192580]: 2025-10-08 17:24:33.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:34 np0005476733 nova_compute[192580]: 2025-10-08 17:24:34.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:34 np0005476733 nova_compute[192580]: 2025-10-08 17:24:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:24:34 np0005476733 nova_compute[192580]: 2025-10-08 17:24:34.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:24:34 np0005476733 nova_compute[192580]: 2025-10-08 17:24:34.617 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:24:34 np0005476733 nova_compute[192580]: 2025-10-08 17:24:34.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:35 np0005476733 nova_compute[192580]: 2025-10-08 17:24:35.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:24:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:24:38 np0005476733 nova_compute[192580]: 2025-10-08 17:24:38.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:39 np0005476733 nova_compute[192580]: 2025-10-08 17:24:39.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:40 np0005476733 podman[290460]: 2025-10-08 17:24:40.228968213 +0000 UTC m=+0.055949789 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:24:40 np0005476733 podman[290461]: 2025-10-08 17:24:40.263180247 +0000 UTC m=+0.078317894 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 13:24:40 np0005476733 podman[290459]: 2025-10-08 17:24:40.292076691 +0000 UTC m=+0.110765762 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 13:24:42 np0005476733 nova_compute[192580]: 2025-10-08 17:24:42.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:43 np0005476733 nova_compute[192580]: 2025-10-08 17:24:43.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:44 np0005476733 nova_compute[192580]: 2025-10-08 17:24:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:44 np0005476733 nova_compute[192580]: 2025-10-08 17:24:44.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:47 np0005476733 podman[290530]: 2025-10-08 17:24:47.275330631 +0000 UTC m=+0.100751921 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=iscsid, tcib_managed=true, container_name=iscsid)
Oct  8 13:24:47 np0005476733 podman[290531]: 2025-10-08 17:24:47.313689366 +0000 UTC m=+0.122027350 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:24:48 np0005476733 nova_compute[192580]: 2025-10-08 17:24:48.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.866 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.868 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.868 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:24:49 np0005476733 nova_compute[192580]: 2025-10-08 17:24:49.868 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.107 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.108 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13760MB free_disk=111.29891967773438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.108 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.108 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.219 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.220 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.248 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.617 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.620 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:24:50 np0005476733 nova_compute[192580]: 2025-10-08 17:24:50.621 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:24:53 np0005476733 nova_compute[192580]: 2025-10-08 17:24:53.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:54 np0005476733 nova_compute[192580]: 2025-10-08 17:24:54.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:55 np0005476733 nova_compute[192580]: 2025-10-08 17:24:55.622 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:24:56 np0005476733 podman[290578]: 2025-10-08 17:24:56.261762714 +0000 UTC m=+0.078375165 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:24:58 np0005476733 nova_compute[192580]: 2025-10-08 17:24:58.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:24:59 np0005476733 nova_compute[192580]: 2025-10-08 17:24:59.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:00 np0005476733 ovn_controller[263831]: 2025-10-08T17:25:00Z|00315|pinctrl|WARN|Dropped 35 log messages in last 62 seconds (most recently, 25 seconds ago) due to excessive rate
Oct  8 13:25:00 np0005476733 ovn_controller[263831]: 2025-10-08T17:25:00Z|00316|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:25:00 np0005476733 podman[290600]: 2025-10-08 17:25:00.275633259 +0000 UTC m=+0.101733631 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 13:25:00 np0005476733 podman[290627]: 2025-10-08 17:25:00.38832981 +0000 UTC m=+0.072906351 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 13:25:03 np0005476733 nova_compute[192580]: 2025-10-08 17:25:03.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:04 np0005476733 nova_compute[192580]: 2025-10-08 17:25:04.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:08 np0005476733 nova_compute[192580]: 2025-10-08 17:25:08.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:09 np0005476733 nova_compute[192580]: 2025-10-08 17:25:09.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:11 np0005476733 podman[290647]: 2025-10-08 17:25:11.259929475 +0000 UTC m=+0.084360797 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 13:25:11 np0005476733 podman[290648]: 2025-10-08 17:25:11.260985809 +0000 UTC m=+0.071697263 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:25:11 np0005476733 podman[290649]: 2025-10-08 17:25:11.285908475 +0000 UTC m=+0.090484562 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 13:25:13 np0005476733 nova_compute[192580]: 2025-10-08 17:25:13.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:14 np0005476733 nova_compute[192580]: 2025-10-08 17:25:14.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:18 np0005476733 podman[290714]: 2025-10-08 17:25:18.263069311 +0000 UTC m=+0.075908336 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:25:18 np0005476733 podman[290713]: 2025-10-08 17:25:18.284690223 +0000 UTC m=+0.097993553 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:25:18 np0005476733 nova_compute[192580]: 2025-10-08 17:25:18.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:19 np0005476733 nova_compute[192580]: 2025-10-08 17:25:19.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:19 np0005476733 nova_compute[192580]: 2025-10-08 17:25:19.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:23 np0005476733 nova_compute[192580]: 2025-10-08 17:25:23.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:24 np0005476733 nova_compute[192580]: 2025-10-08 17:25:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:24 np0005476733 nova_compute[192580]: 2025-10-08 17:25:24.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:25:24 np0005476733 nova_compute[192580]: 2025-10-08 17:25:24.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:25:26.465 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:25:26.465 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:25:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:25:26.466 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:25:27 np0005476733 podman[290757]: 2025-10-08 17:25:27.26153543 +0000 UTC m=+0.077762755 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 13:25:28 np0005476733 nova_compute[192580]: 2025-10-08 17:25:28.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:29 np0005476733 nova_compute[192580]: 2025-10-08 17:25:29.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:31 np0005476733 podman[290778]: 2025-10-08 17:25:31.253814684 +0000 UTC m=+0.072832268 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 13:25:31 np0005476733 podman[290777]: 2025-10-08 17:25:31.34288468 +0000 UTC m=+0.162025348 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 13:25:33 np0005476733 nova_compute[192580]: 2025-10-08 17:25:33.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:33 np0005476733 nova_compute[192580]: 2025-10-08 17:25:33.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:34 np0005476733 nova_compute[192580]: 2025-10-08 17:25:34.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:36 np0005476733 nova_compute[192580]: 2025-10-08 17:25:36.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:36 np0005476733 nova_compute[192580]: 2025-10-08 17:25:36.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:25:36 np0005476733 nova_compute[192580]: 2025-10-08 17:25:36.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:25:36 np0005476733 nova_compute[192580]: 2025-10-08 17:25:36.647 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:25:37 np0005476733 nova_compute[192580]: 2025-10-08 17:25:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:38 np0005476733 nova_compute[192580]: 2025-10-08 17:25:38.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:39 np0005476733 nova_compute[192580]: 2025-10-08 17:25:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:42 np0005476733 podman[290822]: 2025-10-08 17:25:42.234543926 +0000 UTC m=+0.061608239 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:25:42 np0005476733 podman[290821]: 2025-10-08 17:25:42.25187871 +0000 UTC m=+0.077032112 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:25:42 np0005476733 podman[290823]: 2025-10-08 17:25:42.294654226 +0000 UTC m=+0.110611124 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Oct  8 13:25:43 np0005476733 nova_compute[192580]: 2025-10-08 17:25:43.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:44 np0005476733 nova_compute[192580]: 2025-10-08 17:25:44.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:44 np0005476733 nova_compute[192580]: 2025-10-08 17:25:44.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:45 np0005476733 nova_compute[192580]: 2025-10-08 17:25:45.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:48 np0005476733 nova_compute[192580]: 2025-10-08 17:25:48.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:49 np0005476733 podman[290888]: 2025-10-08 17:25:49.269634362 +0000 UTC m=+0.080312017 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:25:49 np0005476733 podman[290887]: 2025-10-08 17:25:49.26956544 +0000 UTC m=+0.086783504 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 13:25:49 np0005476733 nova_compute[192580]: 2025-10-08 17:25:49.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.719 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.719 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.720 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.720 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.928 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.929 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13781MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.930 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:25:50 np0005476733 nova_compute[192580]: 2025-10-08 17:25:50.930 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.115 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.116 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.156 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.215 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.218 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:25:51 np0005476733 nova_compute[192580]: 2025-10-08 17:25:51.219 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:25:53 np0005476733 nova_compute[192580]: 2025-10-08 17:25:53.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:54 np0005476733 nova_compute[192580]: 2025-10-08 17:25:54.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:55 np0005476733 nova_compute[192580]: 2025-10-08 17:25:55.220 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:25:58 np0005476733 podman[290932]: 2025-10-08 17:25:58.251302183 +0000 UTC m=+0.072296591 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:25:58 np0005476733 nova_compute[192580]: 2025-10-08 17:25:58.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:25:59 np0005476733 nova_compute[192580]: 2025-10-08 17:25:59.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:02 np0005476733 ovn_controller[263831]: 2025-10-08T17:26:02Z|00317|pinctrl|WARN|Dropped 35 log messages in last 63 seconds (most recently, 28 seconds ago) due to excessive rate
Oct  8 13:26:02 np0005476733 ovn_controller[263831]: 2025-10-08T17:26:02Z|00318|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:26:02 np0005476733 podman[290952]: 2025-10-08 17:26:02.29469307 +0000 UTC m=+0.105011496 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 13:26:02 np0005476733 podman[290951]: 2025-10-08 17:26:02.373668034 +0000 UTC m=+0.191143748 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct  8 13:26:03 np0005476733 nova_compute[192580]: 2025-10-08 17:26:03.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:04 np0005476733 nova_compute[192580]: 2025-10-08 17:26:04.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:08 np0005476733 nova_compute[192580]: 2025-10-08 17:26:08.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:09 np0005476733 nova_compute[192580]: 2025-10-08 17:26:09.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:13 np0005476733 podman[290998]: 2025-10-08 17:26:13.225018432 +0000 UTC m=+0.054398299 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd)
Oct  8 13:26:13 np0005476733 podman[290999]: 2025-10-08 17:26:13.231013414 +0000 UTC m=+0.053211742 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:26:13 np0005476733 podman[291000]: 2025-10-08 17:26:13.252145299 +0000 UTC m=+0.070489683 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:26:13 np0005476733 nova_compute[192580]: 2025-10-08 17:26:13.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:14 np0005476733 nova_compute[192580]: 2025-10-08 17:26:14.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:18 np0005476733 nova_compute[192580]: 2025-10-08 17:26:18.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:19 np0005476733 nova_compute[192580]: 2025-10-08 17:26:19.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:20 np0005476733 podman[291064]: 2025-10-08 17:26:20.242918841 +0000 UTC m=+0.064893453 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 13:26:20 np0005476733 podman[291063]: 2025-10-08 17:26:20.249225953 +0000 UTC m=+0.067813518 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:26:20 np0005476733 nova_compute[192580]: 2025-10-08 17:26:20.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:21 np0005476733 nova_compute[192580]: 2025-10-08 17:26:21.613 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:23 np0005476733 nova_compute[192580]: 2025-10-08 17:26:23.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:24 np0005476733 nova_compute[192580]: 2025-10-08 17:26:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:24 np0005476733 nova_compute[192580]: 2025-10-08 17:26:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:26:24 np0005476733 nova_compute[192580]: 2025-10-08 17:26:24.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:26:26.466 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:26:26.467 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:26:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:26:26.467 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:26:28 np0005476733 nova_compute[192580]: 2025-10-08 17:26:28.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:29 np0005476733 podman[291107]: 2025-10-08 17:26:29.247405371 +0000 UTC m=+0.064693438 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 13:26:29 np0005476733 nova_compute[192580]: 2025-10-08 17:26:29.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:33 np0005476733 podman[291129]: 2025-10-08 17:26:33.270640835 +0000 UTC m=+0.089341666 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 13:26:33 np0005476733 podman[291128]: 2025-10-08 17:26:33.292664869 +0000 UTC m=+0.117068432 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:26:33 np0005476733 nova_compute[192580]: 2025-10-08 17:26:33.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:34 np0005476733 nova_compute[192580]: 2025-10-08 17:26:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:34 np0005476733 nova_compute[192580]: 2025-10-08 17:26:34.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:26:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:26:38 np0005476733 nova_compute[192580]: 2025-10-08 17:26:38.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:38 np0005476733 nova_compute[192580]: 2025-10-08 17:26:38.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:38 np0005476733 nova_compute[192580]: 2025-10-08 17:26:38.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:26:38 np0005476733 nova_compute[192580]: 2025-10-08 17:26:38.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:26:38 np0005476733 nova_compute[192580]: 2025-10-08 17:26:38.603 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:26:39 np0005476733 nova_compute[192580]: 2025-10-08 17:26:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:39 np0005476733 nova_compute[192580]: 2025-10-08 17:26:39.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:43 np0005476733 nova_compute[192580]: 2025-10-08 17:26:43.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:44 np0005476733 podman[291176]: 2025-10-08 17:26:44.23926932 +0000 UTC m=+0.058062646 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:26:44 np0005476733 podman[291175]: 2025-10-08 17:26:44.258142843 +0000 UTC m=+0.081096973 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:26:44 np0005476733 podman[291174]: 2025-10-08 17:26:44.258156993 +0000 UTC m=+0.085962827 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:26:44 np0005476733 nova_compute[192580]: 2025-10-08 17:26:44.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:46 np0005476733 nova_compute[192580]: 2025-10-08 17:26:46.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:47 np0005476733 nova_compute[192580]: 2025-10-08 17:26:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:48 np0005476733 nova_compute[192580]: 2025-10-08 17:26:48.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:49 np0005476733 nova_compute[192580]: 2025-10-08 17:26:49.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.624 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.625 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.625 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.800 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.802 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13780MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.803 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.803 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.886 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.913 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.930 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.933 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:26:50 np0005476733 nova_compute[192580]: 2025-10-08 17:26:50.933 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:26:51 np0005476733 podman[291237]: 2025-10-08 17:26:51.24564393 +0000 UTC m=+0.060576686 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:26:51 np0005476733 podman[291238]: 2025-10-08 17:26:51.251885619 +0000 UTC m=+0.065659888 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:26:53 np0005476733 nova_compute[192580]: 2025-10-08 17:26:53.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:54 np0005476733 nova_compute[192580]: 2025-10-08 17:26:54.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:55 np0005476733 nova_compute[192580]: 2025-10-08 17:26:55.934 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:26:58 np0005476733 nova_compute[192580]: 2025-10-08 17:26:58.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:26:59 np0005476733 nova_compute[192580]: 2025-10-08 17:26:59.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:00 np0005476733 podman[291280]: 2025-10-08 17:27:00.290648384 +0000 UTC m=+0.106094511 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:27:03 np0005476733 nova_compute[192580]: 2025-10-08 17:27:03.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:04 np0005476733 ovn_controller[263831]: 2025-10-08T17:27:04Z|00319|pinctrl|WARN|Dropped 31 log messages in last 61 seconds (most recently, 29 seconds ago) due to excessive rate
Oct  8 13:27:04 np0005476733 ovn_controller[263831]: 2025-10-08T17:27:04Z|00320|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:27:04 np0005476733 podman[291301]: 2025-10-08 17:27:04.276571606 +0000 UTC m=+0.092385452 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:27:04 np0005476733 podman[291300]: 2025-10-08 17:27:04.323421723 +0000 UTC m=+0.144237970 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  8 13:27:04 np0005476733 nova_compute[192580]: 2025-10-08 17:27:04.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:08 np0005476733 nova_compute[192580]: 2025-10-08 17:27:08.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:08 np0005476733 nova_compute[192580]: 2025-10-08 17:27:08.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 13:27:08 np0005476733 nova_compute[192580]: 2025-10-08 17:27:08.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:10 np0005476733 nova_compute[192580]: 2025-10-08 17:27:10.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:13 np0005476733 nova_compute[192580]: 2025-10-08 17:27:13.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:15 np0005476733 nova_compute[192580]: 2025-10-08 17:27:15.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:15 np0005476733 podman[291346]: 2025-10-08 17:27:15.241051518 +0000 UTC m=+0.070332558 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:27:15 np0005476733 podman[291347]: 2025-10-08 17:27:15.258041681 +0000 UTC m=+0.081539806 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct  8 13:27:15 np0005476733 podman[291345]: 2025-10-08 17:27:15.272155342 +0000 UTC m=+0.094843901 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd)
Oct  8 13:27:18 np0005476733 nova_compute[192580]: 2025-10-08 17:27:18.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:20 np0005476733 nova_compute[192580]: 2025-10-08 17:27:20.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:21 np0005476733 nova_compute[192580]: 2025-10-08 17:27:21.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:21 np0005476733 podman[291405]: 2025-10-08 17:27:21.995042763 +0000 UTC m=+0.074738029 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 13:27:22 np0005476733 podman[291406]: 2025-10-08 17:27:22.032886952 +0000 UTC m=+0.096263176 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:27:23 np0005476733 nova_compute[192580]: 2025-10-08 17:27:23.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:24 np0005476733 nova_compute[192580]: 2025-10-08 17:27:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:24 np0005476733 nova_compute[192580]: 2025-10-08 17:27:24.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:27:25 np0005476733 nova_compute[192580]: 2025-10-08 17:27:25.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:27:26.467 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:27:26.467 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:27:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:27:26.467 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:27:28 np0005476733 nova_compute[192580]: 2025-10-08 17:27:28.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:30 np0005476733 nova_compute[192580]: 2025-10-08 17:27:30.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:31 np0005476733 podman[291450]: 2025-10-08 17:27:31.232461407 +0000 UTC m=+0.064020016 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 13:27:33 np0005476733 nova_compute[192580]: 2025-10-08 17:27:33.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:34 np0005476733 nova_compute[192580]: 2025-10-08 17:27:34.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:35 np0005476733 nova_compute[192580]: 2025-10-08 17:27:35.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:35 np0005476733 podman[291471]: 2025-10-08 17:27:35.282836357 +0000 UTC m=+0.094948605 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm)
Oct  8 13:27:35 np0005476733 podman[291470]: 2025-10-08 17:27:35.330110208 +0000 UTC m=+0.145997096 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 13:27:38 np0005476733 nova_compute[192580]: 2025-10-08 17:27:38.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:39 np0005476733 nova_compute[192580]: 2025-10-08 17:27:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:39 np0005476733 nova_compute[192580]: 2025-10-08 17:27:39.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:27:39 np0005476733 nova_compute[192580]: 2025-10-08 17:27:39.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:27:39 np0005476733 nova_compute[192580]: 2025-10-08 17:27:39.669 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:27:39 np0005476733 nova_compute[192580]: 2025-10-08 17:27:39.670 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:40 np0005476733 nova_compute[192580]: 2025-10-08 17:27:40.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:43 np0005476733 nova_compute[192580]: 2025-10-08 17:27:43.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:45 np0005476733 nova_compute[192580]: 2025-10-08 17:27:45.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:46 np0005476733 podman[291517]: 2025-10-08 17:27:46.243882082 +0000 UTC m=+0.063249882 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Oct  8 13:27:46 np0005476733 podman[291515]: 2025-10-08 17:27:46.25195944 +0000 UTC m=+0.078514400 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 13:27:46 np0005476733 podman[291516]: 2025-10-08 17:27:46.252467156 +0000 UTC m=+0.067712545 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:27:47 np0005476733 nova_compute[192580]: 2025-10-08 17:27:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:47 np0005476733 nova_compute[192580]: 2025-10-08 17:27:47.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:48 np0005476733 nova_compute[192580]: 2025-10-08 17:27:48.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.638 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.639 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.640 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.874 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.875 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13791MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.875 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:27:50 np0005476733 nova_compute[192580]: 2025-10-08 17:27:50.875 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.168 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.169 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.357 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing inventories for resource provider 94652b61-be28-442d-a9f4-cded63837444 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.713 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating ProviderTree inventory for provider 94652b61-be28-442d-a9f4-cded63837444 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.714 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Updating inventory in ProviderTree for provider 94652b61-be28-442d-a9f4-cded63837444 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.794 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing aggregate associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.821 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Refreshing trait associations for resource provider 94652b61-be28-442d-a9f4-cded63837444, traits: HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_USB,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AESNI,HW_CPU_X86_CLMUL,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.907 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.937 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.939 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:27:51 np0005476733 nova_compute[192580]: 2025-10-08 17:27:51.939 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:27:52 np0005476733 podman[291581]: 2025-10-08 17:27:52.249863743 +0000 UTC m=+0.072437176 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:27:52 np0005476733 podman[291582]: 2025-10-08 17:27:52.27701199 +0000 UTC m=+0.094590394 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:27:53 np0005476733 nova_compute[192580]: 2025-10-08 17:27:53.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:55 np0005476733 nova_compute[192580]: 2025-10-08 17:27:55.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:27:55 np0005476733 nova_compute[192580]: 2025-10-08 17:27:55.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:55 np0005476733 nova_compute[192580]: 2025-10-08 17:27:55.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:27:57 np0005476733 nova_compute[192580]: 2025-10-08 17:27:57.499 2 DEBUG oslo_concurrency.processutils [None req-c6cf4fcb-6bd9-4a6b-be5d-eb1cf823c2c8 36369655df554e8f99bc7e67c1d4bc56 daecd871adaa4a7ba72129f7b1a03cd9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 13:27:57 np0005476733 nova_compute[192580]: 2025-10-08 17:27:57.523 2 DEBUG oslo_concurrency.processutils [None req-c6cf4fcb-6bd9-4a6b-be5d-eb1cf823c2c8 36369655df554e8f99bc7e67c1d4bc56 daecd871adaa4a7ba72129f7b1a03cd9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 13:27:58 np0005476733 nova_compute[192580]: 2025-10-08 17:27:58.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:00 np0005476733 nova_compute[192580]: 2025-10-08 17:28:00.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:02 np0005476733 podman[291622]: 2025-10-08 17:28:02.2459586 +0000 UTC m=+0.064521223 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:03 np0005476733 nova_compute[192580]: 2025-10-08 17:28:03.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:28:03Z|00321|pinctrl|WARN|Dropped 31 log messages in last 60 seconds (most recently, 28 seconds ago) due to excessive rate
Oct  8 13:28:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:28:03Z|00322|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:28:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:03.962 103739 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=114, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ae:49:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '32:9c:24:bf:22:a6'}, ipsec=False) old=SB_Global(nb_cfg=113) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 13:28:03 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:03.962 103739 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 13:28:03 np0005476733 nova_compute[192580]: 2025-10-08 17:28:03.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:05 np0005476733 nova_compute[192580]: 2025-10-08 17:28:05.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:06 np0005476733 podman[291645]: 2025-10-08 17:28:06.252857972 +0000 UTC m=+0.067138206 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 13:28:06 np0005476733 podman[291644]: 2025-10-08 17:28:06.390012664 +0000 UTC m=+0.198445541 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 13:28:08 np0005476733 nova_compute[192580]: 2025-10-08 17:28:08.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:08 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:08.965 103739 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ec52a299-e5bc-4227-a88e-e241833eebb2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '114'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 13:28:09 np0005476733 nova_compute[192580]: 2025-10-08 17:28:09.612 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:09 np0005476733 nova_compute[192580]: 2025-10-08 17:28:09.612 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 13:28:09 np0005476733 nova_compute[192580]: 2025-10-08 17:28:09.632 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 13:28:10 np0005476733 nova_compute[192580]: 2025-10-08 17:28:10.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:13 np0005476733 nova_compute[192580]: 2025-10-08 17:28:13.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:15 np0005476733 nova_compute[192580]: 2025-10-08 17:28:15.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:17 np0005476733 podman[291692]: 2025-10-08 17:28:17.241959198 +0000 UTC m=+0.057652826 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 13:28:17 np0005476733 podman[291693]: 2025-10-08 17:28:17.248859128 +0000 UTC m=+0.061056014 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Oct  8 13:28:17 np0005476733 podman[291691]: 2025-10-08 17:28:17.25987 +0000 UTC m=+0.074260336 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:19 np0005476733 nova_compute[192580]: 2025-10-08 17:28:19.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:20 np0005476733 nova_compute[192580]: 2025-10-08 17:28:20.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:21 np0005476733 nova_compute[192580]: 2025-10-08 17:28:21.600 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:22 np0005476733 nova_compute[192580]: 2025-10-08 17:28:22.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:23 np0005476733 podman[291753]: 2025-10-08 17:28:23.255628751 +0000 UTC m=+0.072888462 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 13:28:23 np0005476733 podman[291754]: 2025-10-08 17:28:23.263311537 +0000 UTC m=+0.074484334 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:28:24 np0005476733 nova_compute[192580]: 2025-10-08 17:28:24.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:24 np0005476733 nova_compute[192580]: 2025-10-08 17:28:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:24 np0005476733 nova_compute[192580]: 2025-10-08 17:28:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:28:25 np0005476733 nova_compute[192580]: 2025-10-08 17:28:25.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:26.468 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:26.469 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:28:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:28:26.469 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:28:29 np0005476733 nova_compute[192580]: 2025-10-08 17:28:29.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:30 np0005476733 nova_compute[192580]: 2025-10-08 17:28:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:33 np0005476733 podman[291794]: 2025-10-08 17:28:33.249033412 +0000 UTC m=+0.074252237 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:34 np0005476733 nova_compute[192580]: 2025-10-08 17:28:34.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:35 np0005476733 nova_compute[192580]: 2025-10-08 17:28:35.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:28:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:28:36 np0005476733 nova_compute[192580]: 2025-10-08 17:28:36.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:37 np0005476733 podman[291814]: 2025-10-08 17:28:37.263401687 +0000 UTC m=+0.081201548 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, 
org.label-schema.build-date=20251001, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:28:37 np0005476733 podman[291813]: 2025-10-08 17:28:37.276554037 +0000 UTC m=+0.105140623 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:39 np0005476733 nova_compute[192580]: 2025-10-08 17:28:39.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:39 np0005476733 nova_compute[192580]: 2025-10-08 17:28:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:40 np0005476733 nova_compute[192580]: 2025-10-08 17:28:40.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:41 np0005476733 nova_compute[192580]: 2025-10-08 17:28:41.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:41 np0005476733 nova_compute[192580]: 2025-10-08 17:28:41.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:28:41 np0005476733 nova_compute[192580]: 2025-10-08 17:28:41.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:28:41 np0005476733 nova_compute[192580]: 2025-10-08 17:28:41.602 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:28:44 np0005476733 nova_compute[192580]: 2025-10-08 17:28:44.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:45 np0005476733 nova_compute[192580]: 2025-10-08 17:28:45.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:48 np0005476733 podman[291860]: 2025-10-08 17:28:48.227452556 +0000 UTC m=+0.049824495 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:28:48 np0005476733 podman[291861]: 2025-10-08 17:28:48.243787748 +0000 UTC m=+0.064401901 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 13:28:48 np0005476733 podman[291859]: 2025-10-08 17:28:48.2569968 +0000 UTC m=+0.086184277 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:48 np0005476733 nova_compute[192580]: 2025-10-08 17:28:48.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:49 np0005476733 nova_compute[192580]: 2025-10-08 17:28:49.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:49 np0005476733 nova_compute[192580]: 2025-10-08 17:28:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:50 np0005476733 nova_compute[192580]: 2025-10-08 17:28:50.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.617 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.618 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.618 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.817 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.818 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13760MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.819 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.819 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.888 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.889 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.911 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.932 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.935 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:28:52 np0005476733 nova_compute[192580]: 2025-10-08 17:28:52.935 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:28:54 np0005476733 nova_compute[192580]: 2025-10-08 17:28:54.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:54 np0005476733 podman[291923]: 2025-10-08 17:28:54.271217452 +0000 UTC m=+0.083359537 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:28:54 np0005476733 podman[291922]: 2025-10-08 17:28:54.297986388 +0000 UTC m=+0.115944599 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:28:55 np0005476733 nova_compute[192580]: 2025-10-08 17:28:55.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:28:57 np0005476733 nova_compute[192580]: 2025-10-08 17:28:57.936 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:28:59 np0005476733 nova_compute[192580]: 2025-10-08 17:28:59.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:00 np0005476733 nova_compute[192580]: 2025-10-08 17:29:00.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:29:03Z|00323|pinctrl|WARN|Dropped 43 log messages in last 60 seconds (most recently, 27 seconds ago) due to excessive rate
Oct  8 13:29:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:29:03Z|00324|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:29:04 np0005476733 nova_compute[192580]: 2025-10-08 17:29:04.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:04 np0005476733 podman[291967]: 2025-10-08 17:29:04.236340669 +0000 UTC m=+0.066573730 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 13:29:05 np0005476733 nova_compute[192580]: 2025-10-08 17:29:05.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:08 np0005476733 podman[291988]: 2025-10-08 17:29:08.293122161 +0000 UTC m=+0.101276021 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  8 13:29:08 np0005476733 podman[291987]: 2025-10-08 17:29:08.307366957 +0000 UTC m=+0.123426429 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 13:29:09 np0005476733 nova_compute[192580]: 2025-10-08 17:29:09.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:10 np0005476733 nova_compute[192580]: 2025-10-08 17:29:10.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:14 np0005476733 nova_compute[192580]: 2025-10-08 17:29:14.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:15 np0005476733 nova_compute[192580]: 2025-10-08 17:29:15.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:19 np0005476733 nova_compute[192580]: 2025-10-08 17:29:19.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:19 np0005476733 podman[292031]: 2025-10-08 17:29:19.272729907 +0000 UTC m=+0.085888828 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:29:19 np0005476733 podman[292030]: 2025-10-08 17:29:19.280043452 +0000 UTC m=+0.100050211 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 13:29:19 np0005476733 podman[292032]: 2025-10-08 17:29:19.291342373 +0000 UTC m=+0.100544047 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41)
Oct  8 13:29:20 np0005476733 nova_compute[192580]: 2025-10-08 17:29:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:22 np0005476733 nova_compute[192580]: 2025-10-08 17:29:22.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:24 np0005476733 nova_compute[192580]: 2025-10-08 17:29:24.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:24 np0005476733 nova_compute[192580]: 2025-10-08 17:29:24.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:24 np0005476733 nova_compute[192580]: 2025-10-08 17:29:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:29:25 np0005476733 podman[292091]: 2025-10-08 17:29:25.230467521 +0000 UTC m=+0.053244854 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:29:25 np0005476733 podman[292090]: 2025-10-08 17:29:25.25637007 +0000 UTC m=+0.082427657 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  8 13:29:25 np0005476733 nova_compute[192580]: 2025-10-08 17:29:25.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:29:26.470 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:29:26.471 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:29:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:29:26.471 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:29:29 np0005476733 nova_compute[192580]: 2025-10-08 17:29:29.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:30 np0005476733 nova_compute[192580]: 2025-10-08 17:29:30.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:34 np0005476733 nova_compute[192580]: 2025-10-08 17:29:34.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:35 np0005476733 podman[292132]: 2025-10-08 17:29:35.222418876 +0000 UTC m=+0.050581229 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 13:29:35 np0005476733 nova_compute[192580]: 2025-10-08 17:29:35.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:37 np0005476733 nova_compute[192580]: 2025-10-08 17:29:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:39 np0005476733 podman[292153]: 2025-10-08 17:29:39.2451813 +0000 UTC m=+0.066254410 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:29:39 np0005476733 nova_compute[192580]: 2025-10-08 17:29:39.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:39 np0005476733 podman[292152]: 2025-10-08 17:29:39.318318219 +0000 UTC m=+0.143712037 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:29:39 np0005476733 nova_compute[192580]: 2025-10-08 17:29:39.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:40 np0005476733 nova_compute[192580]: 2025-10-08 17:29:40.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:43 np0005476733 nova_compute[192580]: 2025-10-08 17:29:43.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:43 np0005476733 nova_compute[192580]: 2025-10-08 17:29:43.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:29:43 np0005476733 nova_compute[192580]: 2025-10-08 17:29:43.591 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:29:43 np0005476733 nova_compute[192580]: 2025-10-08 17:29:43.611 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:29:44 np0005476733 nova_compute[192580]: 2025-10-08 17:29:44.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:45 np0005476733 nova_compute[192580]: 2025-10-08 17:29:45.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:48 np0005476733 nova_compute[192580]: 2025-10-08 17:29:48.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:49 np0005476733 nova_compute[192580]: 2025-10-08 17:29:49.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:50 np0005476733 podman[292198]: 2025-10-08 17:29:50.266527742 +0000 UTC m=+0.077903134 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 13:29:50 np0005476733 podman[292197]: 2025-10-08 17:29:50.270585621 +0000 UTC m=+0.087803460 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:29:50 np0005476733 podman[292199]: 2025-10-08 17:29:50.27276044 +0000 UTC m=+0.078102519 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 13:29:50 np0005476733 nova_compute[192580]: 2025-10-08 17:29:50.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:51 np0005476733 nova_compute[192580]: 2025-10-08 17:29:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.704 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.705 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.705 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.706 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.936 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.937 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13768MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.937 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:29:54 np0005476733 nova_compute[192580]: 2025-10-08 17:29:54.938 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.171 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.172 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.195 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.218 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.221 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.221 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:29:55 np0005476733 nova_compute[192580]: 2025-10-08 17:29:55.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:29:56 np0005476733 podman[292254]: 2025-10-08 17:29:56.243521283 +0000 UTC m=+0.076629172 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 13:29:56 np0005476733 podman[292255]: 2025-10-08 17:29:56.254456953 +0000 UTC m=+0.077654566 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 13:29:58 np0005476733 nova_compute[192580]: 2025-10-08 17:29:58.221 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:29:59 np0005476733 nova_compute[192580]: 2025-10-08 17:29:59.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:00 np0005476733 nova_compute[192580]: 2025-10-08 17:30:00.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:30:03Z|00325|pinctrl|WARN|Dropped 31 log messages in last 60 seconds (most recently, 25 seconds ago) due to excessive rate
Oct  8 13:30:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:30:03Z|00326|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:30:04 np0005476733 nova_compute[192580]: 2025-10-08 17:30:04.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:05 np0005476733 nova_compute[192580]: 2025-10-08 17:30:05.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:06 np0005476733 podman[292298]: 2025-10-08 17:30:06.236128307 +0000 UTC m=+0.064407161 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 13:30:09 np0005476733 nova_compute[192580]: 2025-10-08 17:30:09.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:10 np0005476733 podman[292318]: 2025-10-08 17:30:10.280576575 +0000 UTC m=+0.092068027 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:30:10 np0005476733 podman[292317]: 2025-10-08 17:30:10.318367813 +0000 UTC m=+0.135059331 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  8 13:30:10 np0005476733 nova_compute[192580]: 2025-10-08 17:30:10.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:14 np0005476733 nova_compute[192580]: 2025-10-08 17:30:14.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:15 np0005476733 nova_compute[192580]: 2025-10-08 17:30:15.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:19 np0005476733 nova_compute[192580]: 2025-10-08 17:30:19.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:20 np0005476733 nova_compute[192580]: 2025-10-08 17:30:20.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:21 np0005476733 podman[292364]: 2025-10-08 17:30:21.231708891 +0000 UTC m=+0.059218535 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:30:21 np0005476733 podman[292363]: 2025-10-08 17:30:21.254574923 +0000 UTC m=+0.083031988 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:30:21 np0005476733 podman[292365]: 2025-10-08 17:30:21.262894239 +0000 UTC m=+0.078641477 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 13:30:23 np0005476733 nova_compute[192580]: 2025-10-08 17:30:23.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:24 np0005476733 nova_compute[192580]: 2025-10-08 17:30:24.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:24 np0005476733 nova_compute[192580]: 2025-10-08 17:30:24.580 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:24 np0005476733 nova_compute[192580]: 2025-10-08 17:30:24.599 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:24 np0005476733 nova_compute[192580]: 2025-10-08 17:30:24.599 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:30:25 np0005476733 nova_compute[192580]: 2025-10-08 17:30:25.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:30:26.472 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:30:26.472 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:30:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:30:26.473 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:30:27 np0005476733 podman[292427]: 2025-10-08 17:30:27.226926524 +0000 UTC m=+0.047417947 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 13:30:27 np0005476733 podman[292426]: 2025-10-08 17:30:27.231445539 +0000 UTC m=+0.059032398 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 13:30:29 np0005476733 nova_compute[192580]: 2025-10-08 17:30:29.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:30 np0005476733 nova_compute[192580]: 2025-10-08 17:30:30.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:34 np0005476733 nova_compute[192580]: 2025-10-08 17:30:34.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:35 np0005476733 nova_compute[192580]: 2025-10-08 17:30:35.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.089 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:36 np0005476733 ceilometer_agent_compute[203330]: 2025-10-08 17:30:36.089 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 13:30:37 np0005476733 podman[292470]: 2025-10-08 17:30:37.251487183 +0000 UTC m=+0.076158847 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:30:37 np0005476733 nova_compute[192580]: 2025-10-08 17:30:37.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:39 np0005476733 nova_compute[192580]: 2025-10-08 17:30:39.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:40 np0005476733 nova_compute[192580]: 2025-10-08 17:30:40.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:41 np0005476733 podman[292491]: 2025-10-08 17:30:41.280388883 +0000 UTC m=+0.092734587 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  8 13:30:41 np0005476733 podman[292490]: 2025-10-08 17:30:41.295031 +0000 UTC m=+0.120331149 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Oct  8 13:30:41 np0005476733 nova_compute[192580]: 2025-10-08 17:30:41.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:44 np0005476733 nova_compute[192580]: 2025-10-08 17:30:44.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:44 np0005476733 nova_compute[192580]: 2025-10-08 17:30:44.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:44 np0005476733 nova_compute[192580]: 2025-10-08 17:30:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:30:44 np0005476733 nova_compute[192580]: 2025-10-08 17:30:44.589 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:30:44 np0005476733 nova_compute[192580]: 2025-10-08 17:30:44.624 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:30:45 np0005476733 nova_compute[192580]: 2025-10-08 17:30:45.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:49 np0005476733 nova_compute[192580]: 2025-10-08 17:30:49.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:49 np0005476733 nova_compute[192580]: 2025-10-08 17:30:49.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:50 np0005476733 nova_compute[192580]: 2025-10-08 17:30:50.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:52 np0005476733 podman[292537]: 2025-10-08 17:30:52.272004222 +0000 UTC m=+0.087437377 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 13:30:52 np0005476733 podman[292539]: 2025-10-08 17:30:52.288784319 +0000 UTC m=+0.094071790 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64)
Oct  8 13:30:52 np0005476733 podman[292538]: 2025-10-08 17:30:52.307197599 +0000 UTC m=+0.116835979 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 13:30:52 np0005476733 nova_compute[192580]: 2025-10-08 17:30:52.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:54 np0005476733 nova_compute[192580]: 2025-10-08 17:30:54.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.742 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.742 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.743 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.743 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.956 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.958 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13798MB free_disk=111.29906845092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.958 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:30:55 np0005476733 nova_compute[192580]: 2025-10-08 17:30:55.959 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.488 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.488 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.516 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.553 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.555 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 13:30:56 np0005476733 nova_compute[192580]: 2025-10-08 17:30:56.556 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:30:58 np0005476733 podman[292597]: 2025-10-08 17:30:58.224106309 +0000 UTC m=+0.048406148 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:30:58 np0005476733 podman[292596]: 2025-10-08 17:30:58.26007488 +0000 UTC m=+0.090134054 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 13:30:59 np0005476733 nova_compute[192580]: 2025-10-08 17:30:59.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:00 np0005476733 nova_compute[192580]: 2025-10-08 17:31:00.556 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:00 np0005476733 nova_compute[192580]: 2025-10-08 17:31:00.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:31:03Z|00327|pinctrl|WARN|Dropped 35 log messages in last 60 seconds (most recently, 23 seconds ago) due to excessive rate
Oct  8 13:31:03 np0005476733 ovn_controller[263831]: 2025-10-08T17:31:03Z|00328|pinctrl|WARN|IGMP Querier enabled without a valid IPv4 or IPv6 address
Oct  8 13:31:04 np0005476733 nova_compute[192580]: 2025-10-08 17:31:04.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:05 np0005476733 nova_compute[192580]: 2025-10-08 17:31:05.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:08 np0005476733 podman[292642]: 2025-10-08 17:31:08.215162027 +0000 UTC m=+0.049980720 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 13:31:09 np0005476733 nova_compute[192580]: 2025-10-08 17:31:09.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:10 np0005476733 nova_compute[192580]: 2025-10-08 17:31:10.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:12 np0005476733 podman[292661]: 2025-10-08 17:31:12.267274809 +0000 UTC m=+0.091555629 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 13:31:12 np0005476733 podman[292662]: 2025-10-08 17:31:12.2720009 +0000 UTC m=+0.081328172 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:31:14 np0005476733 nova_compute[192580]: 2025-10-08 17:31:14.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:15 np0005476733 nova_compute[192580]: 2025-10-08 17:31:15.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:19 np0005476733 nova_compute[192580]: 2025-10-08 17:31:19.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:20 np0005476733 nova_compute[192580]: 2025-10-08 17:31:20.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:23 np0005476733 podman[292706]: 2025-10-08 17:31:23.236916528 +0000 UTC m=+0.057195331 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 13:31:23 np0005476733 podman[292712]: 2025-10-08 17:31:23.267432774 +0000 UTC m=+0.075014580 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, version=9.6, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Oct  8 13:31:23 np0005476733 podman[292705]: 2025-10-08 17:31:23.275339937 +0000 UTC m=+0.104055300 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 13:31:24 np0005476733 nova_compute[192580]: 2025-10-08 17:31:24.581 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:24 np0005476733 nova_compute[192580]: 2025-10-08 17:31:24.587 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:24 np0005476733 nova_compute[192580]: 2025-10-08 17:31:24.588 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 13:31:24 np0005476733 nova_compute[192580]: 2025-10-08 17:31:24.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:25 np0005476733 nova_compute[192580]: 2025-10-08 17:31:25.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:31:26.473 103739 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:31:26.474 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:31:26 np0005476733 ovn_metadata_agent[103734]: 2025-10-08 17:31:26.474 103739 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:31:29 np0005476733 podman[292772]: 2025-10-08 17:31:29.272868826 +0000 UTC m=+0.090672892 container health_status 72f5e7f31550b67ecf453a000249b5aaa3247607ffae230bb7dabf5d14fb2435 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-iscsid:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  8 13:31:29 np0005476733 podman[292773]: 2025-10-08 17:31:29.299608461 +0000 UTC m=+0.113017846 container health_status 9cbdb2777cd398b1334cec50568ebec30894890f7af9a8a15956e6ba463c026e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 13:31:29 np0005476733 nova_compute[192580]: 2025-10-08 17:31:29.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:30 np0005476733 nova_compute[192580]: 2025-10-08 17:31:30.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:34 np0005476733 nova_compute[192580]: 2025-10-08 17:31:34.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:35 np0005476733 nova_compute[192580]: 2025-10-08 17:31:35.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:39 np0005476733 podman[292816]: 2025-10-08 17:31:39.255616077 +0000 UTC m=+0.072134648 container health_status aa809903f5d8fc439ad253d46b78f88f0b1b0b23a79044994f0f3d45b06609c4 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  8 13:31:39 np0005476733 nova_compute[192580]: 2025-10-08 17:31:39.590 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:39 np0005476733 nova_compute[192580]: 2025-10-08 17:31:39.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:40 np0005476733 nova_compute[192580]: 2025-10-08 17:31:40.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:42 np0005476733 nova_compute[192580]: 2025-10-08 17:31:42.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:43 np0005476733 podman[292837]: 2025-10-08 17:31:43.245071915 +0000 UTC m=+0.062835452 container health_status 7f939ebfa387d00726741236d345753dc61026b67e2bb64fcaec612e78ca7e33 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ceilometer-compute:b78cfc68a577b1553523c8a70a34e297', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 
9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 13:31:43 np0005476733 podman[292836]: 2025-10-08 17:31:43.315323462 +0000 UTC m=+0.128518002 container health_status 20b9fc947ec5dcdeecd650e6954e57a91dc19c0ea5594e1cc8f57160f1de899b (image=38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-ovn-controller:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 13:31:44 np0005476733 nova_compute[192580]: 2025-10-08 17:31:44.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:44 np0005476733 nova_compute[192580]: 2025-10-08 17:31:44.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 13:31:44 np0005476733 nova_compute[192580]: 2025-10-08 17:31:44.590 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 13:31:44 np0005476733 nova_compute[192580]: 2025-10-08 17:31:44.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:44 np0005476733 nova_compute[192580]: 2025-10-08 17:31:44.945 2 DEBUG nova.compute.manager [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 13:31:45 np0005476733 systemd-logind[827]: New session 178 of user zuul.
Oct  8 13:31:45 np0005476733 systemd[1]: Started Session 178 of User zuul.
Oct  8 13:31:45 np0005476733 nova_compute[192580]: 2025-10-08 17:31:45.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:49 np0005476733 nova_compute[192580]: 2025-10-08 17:31:49.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:49 np0005476733 ovs-vsctl[293059]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  8 13:31:50 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  8 13:31:50 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  8 13:31:50 np0005476733 virtqemud[192152]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 13:31:51 np0005476733 nova_compute[192580]: 2025-10-08 17:31:51.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:51 np0005476733 nova_compute[192580]: 2025-10-08 17:31:51.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:53 np0005476733 nova_compute[192580]: 2025-10-08 17:31:53.589 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:54 np0005476733 systemd[1]: Starting Hostname Service...
Oct  8 13:31:54 np0005476733 podman[293597]: 2025-10-08 17:31:54.233994959 +0000 UTC m=+0.064343629 container health_status d5f9cc24fcee4f1d6d696dda52de783250f595c46adc2d7d5d2f3f1d4a07d2b7 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 13:31:54 np0005476733 podman[293595]: 2025-10-08 17:31:54.240834898 +0000 UTC m=+0.072394617 container health_status 3e00d92b052d7611fc107231d75acf61594a2b5ba7dae9ebbae3f7d4259a5031 (image=38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.103:5001/podified-antelope-centos9/openstack-multipathd:b78cfc68a577b1553523c8a70a34e297', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 13:31:54 np0005476733 podman[293598]: 2025-10-08 17:31:54.245829117 +0000 UTC m=+0.074069629 container health_status ddc32297a74b638db0784680e0a4d35907a33ce85e60bdc002eb7c2d9a011253 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 13:31:54 np0005476733 systemd[1]: Started Hostname Service.
Oct  8 13:31:54 np0005476733 nova_compute[192580]: 2025-10-08 17:31:54.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.588 2 DEBUG oslo_service.periodic_task [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.680 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.681 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.681 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.681 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.890 2 WARNING nova.virt.libvirt.driver [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.891 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=13370MB free_disk=111.13507843017578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.891 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.891 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.968 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.968 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=15731MB used_ram=512MB phys_disk=119GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 13:31:55 np0005476733 nova_compute[192580]: 2025-10-08 17:31:55.995 2 DEBUG nova.compute.provider_tree [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed in ProviderTree for provider: 94652b61-be28-442d-a9f4-cded63837444 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 13:31:56 np0005476733 nova_compute[192580]: 2025-10-08 17:31:56.013 2 DEBUG nova.scheduler.client.report [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Inventory has not changed for provider 94652b61-be28-442d-a9f4-cded63837444 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 15731, 'reserved': 512, 'min_unit': 1, 'max_unit': 15731, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 119, 'reserved': 2, 'min_unit': 1, 'max_unit': 119, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 13:31:56 np0005476733 nova_compute[192580]: 2025-10-08 17:31:56.014 2 DEBUG nova.compute.resource_tracker [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 13:31:56 np0005476733 nova_compute[192580]: 2025-10-08 17:31:56.015 2 DEBUG oslo_concurrency.lockutils [None req-8c910c4f-93b4-468c-8092-59c5b6c95255 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 13:31:56 np0005476733 nova_compute[192580]: 2025-10-08 17:31:56.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
